===============================================================================
About this build: this rebuild has been done as part of reproduce.debian.net
where we aim to reproduce Debian binary packages distributed via
ftp.debian.org, by rebuilding using the exact same packages as the original
build on the buildds, as described in the relevant .buildinfo file from
buildinfos.debian.net. For more information please go to
https://reproduce.debian.net or join #debian-reproducible on irc.debian.org
===============================================================================
Preparing download of sources for /srv/rebuilderd/tmp/rebuilderdnZCy1O/inputs/slepc_3.24.2+dfsg1-1_riscv64.buildinfo
Source: slepc
Version: 3.24.2+dfsg1-1
rebuilderd-worker node: riscv64-33
+------------------------------------------------------------------------------+
| Downloading sources                          Sat, 31 Jan 2026 12:50:13 +0000 |
+------------------------------------------------------------------------------+
Get:1 https://deb.debian.org/debian trixie InRelease [140 kB]
Get:2 https://deb.debian.org/debian-security trixie-security InRelease [43.4 kB]
Get:3 https://deb.debian.org/debian trixie-updates InRelease [47.3 kB]
Get:4 https://deb.debian.org/debian trixie-proposed-updates InRelease [57.6 kB]
Get:5 https://deb.debian.org/debian trixie-backports InRelease [54.0 kB]
Get:6 https://deb.debian.org/debian forky InRelease [137 kB]
Get:7 https://deb.debian.org/debian sid InRelease [187 kB]
Get:8 https://deb.debian.org/debian experimental InRelease [91.3 kB]
Get:9 https://deb.debian.org/debian trixie/main Sources [10.5 MB]
Get:10 https://deb.debian.org/debian trixie/non-free-firmware Sources [6,552 B]
Get:11 https://deb.debian.org/debian-security trixie-security/main Sources [120 kB]
Get:12 https://deb.debian.org/debian-security trixie-security/non-free-firmware Sources [696 B]
Get:13 https://deb.debian.org/debian trixie-updates/main Sources [2,788 B]
Get:14 https://deb.debian.org/debian trixie-proposed-updates/main Sources [47.5 kB]
Get:15 https://deb.debian.org/debian trixie-backports/non-free-firmware Sources [2,468 B]
Get:16 https://deb.debian.org/debian trixie-backports/main Sources [130 kB]
Get:17 https://deb.debian.org/debian forky/non-free-firmware Sources [7,700 B]
Get:18 https://deb.debian.org/debian forky/main Sources [10.5 MB]
Get:19 https://deb.debian.org/debian sid/main Sources [11.2 MB]
Get:20 https://deb.debian.org/debian sid/non-free-firmware Sources [9,688 B]
Get:21 https://deb.debian.org/debian experimental/main Sources [365 kB]
Get:22 https://deb.debian.org/debian experimental/non-free-firmware Sources [3,180 B]
Fetched 33.7 MB in 11s (2,939 kB/s)
Reading package lists...
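The banner above summarises the approach: reinstall exactly the build-dependency versions recorded in the original build's .buildinfo file, then rebuild. As a rough illustration only (not debrebuild's actual code), the sketch below shows how the pinned versions could be read out of a deb822-style .buildinfo's Installed-Build-Depends field; the field layout handling and the helper name are assumptions for this example.

    # Illustrative sketch only -- not part of debrebuild or rebuilderd.
    # Assumes a deb822-style .buildinfo whose Installed-Build-Depends field
    # lists one "package (= version)," entry per continuation line.
    import re
    import sys

    def pinned_build_deps(buildinfo_path):
        """Return {package: version} parsed from Installed-Build-Depends."""
        text = open(buildinfo_path, encoding="utf-8").read()
        match = re.search(
            r"^Installed-Build-Depends:\n((?: .+\n?)+)", text, re.MULTILINE)
        if not match:
            return {}
        deps = {}
        for entry in match.group(1).replace("\n", " ").split(","):
            entry = entry.strip()
            if not entry:
                continue
            # Entries look like: "gcc-15 (= 15.2.0-12)"
            name, _, version = entry.partition(" (= ")
            if version:
                deps[name.strip()] = version.rstrip(")")
        return deps

    if __name__ == "__main__":
        for pkg, ver in sorted(pinned_build_deps(sys.argv[1]).items()):
            print(f"{pkg}={ver}")

Run against the slepc .buildinfo referenced above, such a script would print apt-style pkg=version pins, which is conceptually what the numbered per-dependency downloads later in this log correspond to.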
'https://deb.debian.org/debian/pool/main/s/slepc/slepc_3.24.2%2bdfsg1-1.dsc' slepc_3.24.2+dfsg1-1.dsc 3478 SHA256:973bb3c2a36eace73c8da35a18160f6999e93c0cf68c64c6b2dd13f3964f25bd
'https://deb.debian.org/debian/pool/main/s/slepc/slepc_3.24.2%2bdfsg1.orig.tar.xz' slepc_3.24.2+dfsg1.orig.tar.xz 23567516 SHA256:8a89011a61d16fe68b092137c751c91c7c2944b5dfb9d79e0fe22043c360ce71
'https://deb.debian.org/debian/pool/main/s/slepc/slepc_3.24.2%2bdfsg1-1.debian.tar.xz' slepc_3.24.2+dfsg1-1.debian.tar.xz 21444 SHA256:381141cfdc38cdc79695642b4b3dac17f6f278662a8b48e21ba286c3649e906a
8a89011a61d16fe68b092137c751c91c7c2944b5dfb9d79e0fe22043c360ce71  slepc_3.24.2+dfsg1.orig.tar.xz
381141cfdc38cdc79695642b4b3dac17f6f278662a8b48e21ba286c3649e906a  slepc_3.24.2+dfsg1-1.debian.tar.xz
973bb3c2a36eace73c8da35a18160f6999e93c0cf68c64c6b2dd13f3964f25bd  slepc_3.24.2+dfsg1-1.dsc
+------------------------------------------------------------------------------+
| Calling debrebuild                           Sat, 31 Jan 2026 12:50:29 +0000 |
+------------------------------------------------------------------------------+
Rebuilding slepc=3.24.2+dfsg1-1 in /srv/rebuilderd/tmp/rebuilderdnZCy1O/inputs now.
+ nice /usr/bin/debrebuild --buildresult=/srv/rebuilderd/tmp/rebuilderdnZCy1O/out --builder=sbuild+unshare --cache=/srv/rebuilderd/cache -- /srv/rebuilderd/tmp/rebuilderdnZCy1O/inputs/slepc_3.24.2+dfsg1-1_riscv64.buildinfo
/srv/rebuilderd/tmp/rebuilderdnZCy1O/inputs/slepc_3.24.2+dfsg1-1_riscv64.buildinfo contains a GPG signature which has NOT been validated
Using defined Build-Path: /build/reproducible-path/slepc-3.24.2+dfsg1
I: verifying dsc... successful!
Get:1 http://deb.debian.org/debian unstable InRelease [187 kB]
Get:2 http://snapshot.debian.org/archive/debian/20260126T022502Z sid InRelease [187 kB]
Get:3 http://deb.debian.org/debian unstable/main riscv64 Packages [9740 kB]
Get:4 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 Packages [9801 kB]
Fetched 19.9 MB in 7s (3020 kB/s)
Reading package lists...
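The three sha256sum-style lines above record what the "verifying dsc... successful!" step amounts to: every downloaded source artifact must hash to the SHA256 value declared for it. Below is a minimal standalone sketch of that check, with the expected hashes copied from the log lines above; it is an illustration, not the actual dscverify/debrebuild code, and it assumes the files sit in the current directory.

    # Illustrative checksum check -- not the actual dscverify/debrebuild code.
    # Expected SHA256 values copied from the download log above.
    import hashlib

    EXPECTED = {
        "slepc_3.24.2+dfsg1.orig.tar.xz":
            "8a89011a61d16fe68b092137c751c91c7c2944b5dfb9d79e0fe22043c360ce71",
        "slepc_3.24.2+dfsg1-1.debian.tar.xz":
            "381141cfdc38cdc79695642b4b3dac17f6f278662a8b48e21ba286c3649e906a",
        "slepc_3.24.2+dfsg1-1.dsc":
            "973bb3c2a36eace73c8da35a18160f6999e93c0cf68c64c6b2dd13f3964f25bd",
    }

    def sha256_of(path, chunk_size=1 << 20):
        """Stream the file so large tarballs need not fit in memory."""
        digest = hashlib.sha256()
        with open(path, "rb") as fh:
            while chunk := fh.read(chunk_size):
                digest.update(chunk)
        return digest.hexdigest()

    if __name__ == "__main__":
        for name, expected in EXPECTED.items():
            ok = sha256_of(name) == expected
            print(f"{'OK  ' if ok else 'FAIL'} {name}")

Only after all three artifacts verify does the rebuild proceed to fetching the pinned build dependencies listed below, partly from deb.debian.org unstable and partly from the snapshot.debian.org archive when the exact recorded version is no longer in the current archive.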
W: http://snapshot.debian.org/archive/debian/20260126T022502Z/dists/sid/InRelease: Loading /etc/apt/trusted.gpg from deprecated option Dir::Etc::Trusted Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 openssl-provider-legacy riscv64 3.5.4-1+b1 [311 kB] Fetched 311 kB in 0s (11.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpk9yuw845/openssl-provider-legacy_3.5.4-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 mawk riscv64 1.3.4.20250131-2 [142 kB] Fetched 142 kB in 0s (6179 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9933cbpm/mawk_1.3.4.20250131-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libparpack2t64 riscv64 3.9.1-6+b1 [87.6 kB] Fetched 87.6 kB in 0s (1604 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3ox2tpjg/libparpack2t64_3.9.1-6+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libxdmcp6 riscv64 1:1.1.5-2 [28.2 kB] Fetched 28.2 kB in 0s (534 kB/s) dpkg-name: info: moved 'libxdmcp6_1%3a1.1.5-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmp2hu8sy07/libxdmcp6_1.1.5-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libzstd1 riscv64 1.5.7+dfsg-3 [372 kB] Fetched 372 kB in 0s (13.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_jt8cbwb/libzstd1_1.5.7+dfsg-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libbrotli-dev riscv64 1.1.0-2+b9 [905 kB] Fetched 905 kB in 0s (7885 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpyp70vel3/libbrotli-dev_1.1.0-2+b9_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcxsparse4 riscv64 1:7.12.1+dfsg-1 [90.3 kB] Fetched 90.3 kB in 0s (1651 kB/s) dpkg-name: info: moved 'libcxsparse4_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp7q74n5rf/libcxsparse4_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libxcb1 riscv64 1.17.0-2+b2 [145 kB] Fetched 145 kB in 0s (2588 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp4s8n_nkn/libxcb1_1.17.0-2+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhypre64-dev riscv64 3.0.0-5 [5438 kB] Fetched 5438 kB in 0s (24.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpkdgkf56u/libhypre64-dev_3.0.0-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcom-err2 riscv64 1.47.2-3+b8 [24.7 kB] Fetched 24.7 kB in 0s (1206 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpcjk8ks5x/libcom-err2_1.47.2-3+b8_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcamd3 riscv64 1:7.12.1+dfsg-1 [46.1 kB] Fetched 46.1 kB in 0s (859 kB/s) dpkg-name: info: moved 'libcamd3_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpljzxwfba/libcamd3_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libucx0 riscv64 1.20.0+ds-4 [899 kB] Fetched 899 kB in 0s (12.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnwddkpmq/libucx0_1.20.0+ds-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgmp10 riscv64 2:6.3.0+dfsg-5+b1 [563 kB] Fetched 563 kB in 0s (8428 kB/s) dpkg-name: info: moved 'libgmp10_2%3a6.3.0+dfsg-5+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpadgj1t_2/libgmp10_6.3.0+dfsg-5+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcbor0.10 riscv64 0.10.2-2.1 [27.8 kB] Fetched 27.8 kB in 0s (524 kB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmp42ar_gax/libcbor0.10_0.10.2-2.1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfortran-toml-0 riscv64 0.4.3-2 [86.0 kB] Fetched 86.0 kB in 0s (1581 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpkh9bqkdr/libfortran-toml-0_0.4.3-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 fortran-fpm riscv64 0.12.0-6 [510 kB] Fetched 510 kB in 0s (7990 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmph32fb_jd/fortran-fpm_0.12.0-6_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libarpack2-dev riscv64 3.9.1-6+b1 [201 kB] Fetched 201 kB in 0s (2362 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp96fv56x2/libarpack2-dev_3.9.1-6+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmunge2 riscv64 0.5.16-1+b1 [20.0 kB] Fetched 20.0 kB in 0s (378 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpw0zujzk1/libmunge2_0.5.16-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 dh-autoreconf all 21 [12.2 kB] Fetched 12.2 kB in 0s (614 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpc5qb26sb/dh-autoreconf_21_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 comerr-dev riscv64 2.1-1.47.2-3+b8 [60.9 kB] Fetched 60.9 kB in 0s (1129 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqjg7pwrx/comerr-dev_2.1-1.47.2-3+b8_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libptscotch-64i-7.0 riscv64 7.0.10-7 [153 kB] Fetched 153 kB in 0s (2735 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpoyxvpx2q/libptscotch-64i-7.0_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 librtmp1 riscv64 2.4+20151223.gitfa8646d.1-3+b1 [59.2 kB] Fetched 59.2 kB in 0s (2785 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpz1ugy5y5/librtmp1_2.4+20151223.gitfa8646d.1-3+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gcc-15-base riscv64 15.2.0-12 [54.2 kB] Fetched 54.2 kB in 0s (2616 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp4c4541yc/gcc-15-base_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 fonts-mathjax all 2.7.9+dfsg-1 [2210 kB] Fetched 2210 kB in 0s (20.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp4fujic2u/fonts-mathjax_2.7.9+dfsg-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libatomic1 riscv64 15.2.0-12 [8548 B] Fetched 8548 B in 0s (422 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbfaqjymu/libatomic1_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libxdmcp-dev riscv64 1:1.1.5-2 [54.4 kB] Fetched 54.4 kB in 0s (1010 kB/s) dpkg-name: info: moved 'libxdmcp-dev_1%3a1.1.5-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpjk8khj7b/libxdmcp-dev_1.1.5-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libssl3t64 riscv64 3.5.4-1+b1 [2207 kB] Fetched 2207 kB in 0s (33.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphhemsurx/libssl3t64_3.5.4-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libxau-dev riscv64 1:1.0.11-1+b1 [28.5 kB] Fetched 28.5 kB in 0s (535 kB/s) dpkg-name: info: moved 'libxau-dev_1%3a1.0.11-1+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmphlavlfdj/libxau-dev_1.0.11-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 dh-python all 7.20260125 [119 kB] Fetched 119 kB in 0s (275 kB/s) dpkg-name: warning: 
skipping '/srv/rebuilderd/tmp/tmpf183hae3/dh-python_7.20260125_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libxau6 riscv64 1:1.0.11-1+b1 [20.7 kB] Fetched 20.7 kB in 0s (390 kB/s) dpkg-name: info: moved 'libxau6_1%3a1.0.11-1+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpp9o1ztgv/libxau6_1.0.11-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpipeline1 riscv64 1.5.8-2 [40.8 kB] Fetched 40.8 kB in 0s (2004 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpewhfw8rj/libpipeline1_1.5.8-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libjpeg-dev riscv64 1:2.1.5-4 [72.2 kB] Fetched 72.2 kB in 0s (1333 kB/s) dpkg-name: info: moved 'libjpeg-dev_1%3a2.1.5-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmpce4w875b/libjpeg-dev_2.1.5-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libbtf2 riscv64 1:7.12.1+dfsg-1 [33.9 kB] Fetched 33.9 kB in 0s (636 kB/s) dpkg-name: info: moved 'libbtf2_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpb17bmn0l/libbtf2_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libklu2 riscv64 1:7.12.1+dfsg-1 [92.7 kB] Fetched 92.7 kB in 0s (1698 kB/s) dpkg-name: info: moved 'libklu2_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpbjaw9hz2/libklu2_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libxnvctrl0 riscv64 535.171.04-1+b3 [14.6 kB] Fetched 14.6 kB in 0s (445 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmprb_ycx9x/libxnvctrl0_535.171.04-1+b3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libltdl-dev riscv64 2.5.4-9 [193 kB] Fetched 193 kB in 0s (5883 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpky960y33/libltdl-dev_2.5.4-9_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libattr1 riscv64 1:2.5.2-3+b1 [23.1 kB] Fetched 23.1 kB in 0s (1150 kB/s) dpkg-name: info: moved 'libattr1_1%3a2.5.2-3+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpa9giv64o/libattr1_2.5.2-3+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 debconf all 1.5.91 [121 kB] Fetched 121 kB in 0s (5404 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpgi6oosi6/debconf_1.5.91_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 dash riscv64 0.5.12-12 [101 kB] Fetched 101 kB in 0s (4545 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbfdl44pz/dash_0.5.12-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenmpi40 riscv64 5.0.9-1 [2315 kB] Fetched 2315 kB in 0s (20.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp2jqmuazb/libopenmpi40_5.0.9-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gfortran-15-riscv64-linux-gnu riscv64 15.2.0-12 [15.2 MB] Fetched 15.2 MB in 1s (29.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphs76t5v6/gfortran-15-riscv64-linux-gnu_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libk5crypto3 riscv64 1.22.1-2 [98.4 kB] Fetched 98.4 kB in 0s (4491 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp4t9yt404/libk5crypto3_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpkgconf3 riscv64 1.8.1-4+b1 [36.2 kB] Fetched 36.2 kB in 0s (683 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpuexnmxif/libpkgconf3_1.8.1-4+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libkdb5-10t64 riscv64 1.22.1-2 [43.9 kB] Fetched 43.9 kB in 
0s (820 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_3x7279x/libkdb5-10t64_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpetsc3.24-dev-common all 3.24.3+dfsg1-1 [315 kB] Fetched 315 kB in 0s (5330 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpq1rfxpip/libpetsc3.24-dev-common_3.24.3+dfsg1-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 zlib1g-dev riscv64 1:1.3.dfsg+really1.3.1-1+b2 [995 kB] Fetched 995 kB in 0s (12.3 MB/s) dpkg-name: info: moved 'zlib1g-dev_1%3a1.3.dfsg+really1.3.1-1+b2_riscv64.deb' to '/srv/rebuilderd/tmp/tmp39yowabc/zlib1g-dev_1.3.dfsg+really1.3.1-1+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnghttp2-dev riscv64 1.64.0-1.1+b1 [220 kB] Fetched 220 kB in 0s (3833 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmgdi989m/libnghttp2-dev_1.64.0-1.1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libscalapack-openmpi2.2 riscv64 2.2.2-5 [1363 kB] Fetched 1363 kB in 0s (15.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpg_uubv7a/libscalapack-openmpi2.2_2.2.2-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgdbm6t64 riscv64 1.26-1+b1 [79.2 kB] Fetched 79.2 kB in 0s (3688 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpc4pmysxi/libgdbm6t64_1.26-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libkrb5support0 riscv64 1.22.1-2 [33.7 kB] Fetched 33.7 kB in 0s (1542 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbeog4ugu/libkrb5support0_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcurl4t64 riscv64 8.18.0-2 [418 kB] Fetched 418 kB in 0s (14.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplocfn4pp/libcurl4t64_8.18.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpetsc-real3.24 riscv64 3.24.3+dfsg1-1 [6746 kB] Fetched 6746 kB in 0s (25.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpzf_dcm4j/libpetsc-real3.24_3.24.3+dfsg1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libblas-dev riscv64 3.12.1-7+b1 [291 kB] Fetched 291 kB in 0s (4885 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpw6hbbkaw/libblas-dev_3.12.1-7+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libngtcp2-dev riscv64 1.16.0-1 [452 kB] Fetched 452 kB in 0s (7281 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpsgfdibjx/libngtcp2-dev_1.16.0-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libptscotch-64-7.0 riscv64 7.0.10-7 [153 kB] Fetched 153 kB in 0s (2737 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpeeall7zp/libptscotch-64-7.0_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenblas64-0-pthread riscv64 0.3.30+ds-3+b1 [3283 kB] Fetched 3283 kB in 0s (27.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphvrofabw/libopenblas64-0-pthread_0.3.30+ds-3+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgmpxx4ldbl riscv64 2:6.3.0+dfsg-5+b1 [329 kB] Fetched 329 kB in 0s (12.2 MB/s) dpkg-name: info: moved 'libgmpxx4ldbl_2%3a6.3.0+dfsg-5+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp0jdbzyj5/libgmpxx4ldbl_6.3.0+dfsg-5+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhypre-dev riscv64 3.0.0-5 [5578 kB] Fetched 5578 kB in 0s (33.1 MB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmpcyzse8sa/libhypre-dev_3.0.0-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 opencl-clhpp-headers all 3.0~2025.07.22-1 [51.0 kB] Fetched 51.0 kB in 0s (945 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpf2ztz80k/opencl-clhpp-headers_3.0~2025.07.22-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-double3 riscv64 3.3.10-2+b2 [376 kB] Fetched 376 kB in 0s (6251 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpo0qwgjwl/libfftw3-double3_3.3.10-2+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmumps64-dev riscv64 5.8.1-2 [7344 kB] Fetched 7344 kB in 0s (25.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3kvftj3i/libmumps64-dev_5.8.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-cpp-310 riscv64 1.14.6+repack-2 [126 kB] Fetched 126 kB in 0s (2267 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp30n98ac5/libhdf5-openmpi-cpp-310_1.14.6+repack-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libyaml-dev riscv64 0.2.5-2+b1 [155 kB] Fetched 155 kB in 0s (2775 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp20pbccuy/libyaml-dev_0.2.5-2+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 base-files riscv64 14 [72.9 kB] Fetched 72.9 kB in 0s (3423 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3_hlw_vz/base-files_14_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libunbound8 riscv64 1.24.2-1 [617 kB] Fetched 617 kB in 0s (9211 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa3dc7drc/libunbound8_1.24.2-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsasl2-2 riscv64 2.1.28+dfsg1-10 [60.6 kB] Fetched 60.6 kB in 0s (2783 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpn1znteeh/libsasl2-2_2.1.28+dfsg1-10_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 build-essential riscv64 12.12 [4628 B] Fetched 4628 B in 0s (234 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp2fbuyfnt/build-essential_12.12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpam-modules-bin riscv64 1.7.0-5+b1 [49.6 kB] Fetched 49.6 kB in 0s (2360 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplc_u3d6b/libpam-modules-bin_1.7.0-5+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsmartcols1 riscv64 2.41.3-3 [155 kB] Fetched 155 kB in 0s (6764 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0t4k6cgf/libsmartcols1_2.41.3-3_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 dh-fortran riscv64 0.63 [28.3 kB] Fetched 28.3 kB in 2s (11.7 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0f5h9no1/dh-fortran_0.63_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnl-3-200 riscv64 3.12.0-2 [61.9 kB] Fetched 61.9 kB in 0s (1151 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpf42aq8re/libnl-3-200_3.12.0-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libc-bin riscv64 2.42-10+b1 [611 kB] Fetched 611 kB in 0s (18.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3gvicw3i/libc-bin_2.42-10+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libparpack2-dev riscv64 3.9.1-6+b1 [179 kB] Fetched 179 kB in 0s (3181 kB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmpm6ioh77w/libparpack2-dev_3.9.1-6+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 zlib1g riscv64 1:1.3.dfsg+really1.3.1-1+b2 [85.6 kB] Fetched 85.6 kB in 0s (3939 kB/s) dpkg-name: info: moved 'zlib1g_1%3a1.3.dfsg+really1.3.1-1+b2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpl6c9_i0w/zlib1g_1.3.dfsg+really1.3.1-1+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 openmpi-common all 5.0.9-1 [97.6 kB] Fetched 97.6 kB in 0s (1793 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpw2z9b6dt/openmpi-common_5.0.9-1_all.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libgprofng0 riscv64 2.45.50.20260119-1 [723 kB] Fetched 723 kB in 0s (19.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpli0eq3uo/libgprofng0_2.45.50.20260119-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libscotch-dev riscv64 7.0.10-7 [1320 kB] Fetched 1320 kB in 0s (15.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmptfb9h4cd/libscotch-dev_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 sed riscv64 4.9-2 [329 kB] Fetched 329 kB in 0s (12.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0isrzcxt/sed_4.9-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libbinutils riscv64 2.45.50.20260119-1 [500 kB] Fetched 500 kB in 0s (16.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp64g2lc7p/libbinutils_2.45.50.20260119-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 base-passwd riscv64 3.6.8 [54.8 kB] Fetched 54.8 kB in 0s (2628 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdn30exg4/base-passwd_3.6.8_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libibverbs1 riscv64 61.0-2 [64.2 kB] Fetched 64.2 kB in 0s (1195 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqgwndgh6/libibverbs1_61.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libisl23 riscv64 0.27-1+b1 [664 kB] Fetched 664 kB in 0s (9844 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0bf1tn14/libisl23_0.27-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libblas64-3 riscv64 3.12.1-7+b1 [116 kB] Fetched 116 kB in 0s (2105 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpo7hf6ryy/libblas64-3_3.12.1-7+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmpc3 riscv64 1.3.1-2+b1 [56.8 kB] Fetched 56.8 kB in 0s (2722 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp07mnr6v0/libmpc3_1.3.1-2+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnl-route-3-200 riscv64 3.12.0-2 [200 kB] Fetched 200 kB in 0s (3527 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpazo0rc7o/libnl-route-3-200_3.12.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsuperlu-dist9 riscv64 9.2.1+dfsg1-1 [707 kB] Fetched 707 kB in 0s (10.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpumtvcxmg/libsuperlu-dist9_9.2.1+dfsg1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnl-3-dev riscv64 3.12.0-2 [191 kB] Fetched 191 kB in 0s (3369 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5431xfge/libnl-3-dev_3.12.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libkadm5srv-mit12 riscv64 1.22.1-2 [55.8 kB] Fetched 55.8 kB in 0s (1022 kB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmpsovquxfx/libkadm5srv-mit12_1.22.1-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 login.defs all 1:4.19.0-4 [211 kB] Fetched 211 kB in 0s (3648 kB/s) dpkg-name: info: moved 'login.defs_1%3a4.19.0-4_all.deb' to '/srv/rebuilderd/tmp/tmpnc3cvn0q/login.defs_4.19.0-4_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 x11proto-dev all 2024.1-1 [603 kB] Fetched 603 kB in 0s (9211 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpi4m3kt9x/x11proto-dev_2024.1-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 intltool-debian all 0.35.0+20060710.6 [22.9 kB] Fetched 22.9 kB in 0s (1135 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpyfrm0trf/intltool-debian_0.35.0+20060710.6_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 openssh-client riscv64 1:10.2p1-3 [1016 kB] Fetched 1016 kB in 0s (12.5 MB/s) dpkg-name: info: moved 'openssh-client_1%3a10.2p1-3_riscv64.deb' to '/srv/rebuilderd/tmp/tmpspf4z2dl/openssh-client_10.2p1-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libubsan1 riscv64 15.2.0-12 [1178 kB] Fetched 1178 kB in 0s (14.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpg_fodfle/libubsan1_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcrypt1 riscv64 1:4.5.1-1 [114 kB] Fetched 114 kB in 0s (2056 kB/s) dpkg-name: info: moved 'libcrypt1_1%3a4.5.1-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp_10936fc/libcrypt1_4.5.1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gcc-15-riscv64-linux-gnu riscv64 15.2.0-12 [28.6 MB] Fetched 28.6 MB in 1s (42.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpeefeuzh5/gcc-15-riscv64-linux-gnu_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmpfr6 riscv64 4.2.2-2+b1 [667 kB] Fetched 667 kB in 0s (19.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3z81oanb/libmpfr6_4.2.2-2+b1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libaec-dev riscv64 1.1.4-2+b1 [47.7 kB] Fetched 47.7 kB in 0s (2294 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpv_tlt1vm/libaec-dev_1.1.4-2+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 opencl-c-headers all 3.0~2025.07.22-2 [47.6 kB] Fetched 47.6 kB in 0s (895 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpuqz0yh17/opencl-c-headers_3.0~2025.07.22-2_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libscalapack-openmpi-dev riscv64 2.2.2-5 [11.6 kB] Fetched 11.6 kB in 0s (220 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpw_3tfh42/libscalapack-openmpi-dev_2.2.2-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gcc-riscv64-linux-gnu riscv64 4:15.2.0-5 [1432 B] Fetched 1432 B in 0s (73.1 kB/s) dpkg-name: info: moved 'gcc-riscv64-linux-gnu_4%3a15.2.0-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmp51bx4s9y/gcc-riscv64-linux-gnu_15.2.0-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libptscotch-64-dev riscv64 7.0.10-7 [17.1 kB] Fetched 17.1 kB in 0s (327 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpq5mmgpbm/libptscotch-64-dev_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-310 riscv64 1.14.6+repack-2 [1442 kB] Fetched 1442 kB in 0s (15.9 MB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmpnus87tvf/libhdf5-openmpi-310_1.14.6+repack-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libdebconfclient0 riscv64 0.282+b2 [11.0 kB] Fetched 11.0 kB in 0s (549 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpj2dp3rk1/libdebconfclient0_0.282+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 python3-click all 8.2.0+0.really.8.1.8-1 [95.4 kB] Fetched 95.4 kB in 0s (1724 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3pn4avq2/python3-click_8.2.0+0.really.8.1.8-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libscotch-7.0c riscv64 7.0.10-7 [258 kB] Fetched 258 kB in 0s (4366 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpb4f75jnk/libscotch-7.0c_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 perl riscv64 5.40.1-7 [267 kB] Fetched 267 kB in 0s (10.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpy1wab9cv/perl_5.40.1-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcap2 riscv64 1:2.75-10+b5 [29.1 kB] Fetched 29.1 kB in 0s (1428 kB/s) dpkg-name: info: moved 'libcap2_1%3a2.75-10+b5_riscv64.deb' to '/srv/rebuilderd/tmp/tmpmk4epk2y/libcap2_2.75-10+b5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 sensible-utils all 0.0.26 [27.0 kB] Fetched 27.0 kB in 0s (1341 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdhmwa7iv/sensible-utils_0.0.26_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libscotch-64-7.0 riscv64 7.0.10-7 [259 kB] Fetched 259 kB in 0s (4464 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpt3npj8k9/libscotch-64-7.0_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libarpack2t64 riscv64 3.9.1-6+b1 [94.2 kB] Fetched 94.2 kB in 0s (1729 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmprl34bzlq/libarpack2t64_3.9.1-6+b1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libc-dev-bin riscv64 2.42-10+b1 [60.3 kB] Fetched 60.3 kB in 0s (2867 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpe2su4o1b/libc-dev-bin_2.42-10+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libscotcherr-dev riscv64 7.0.10-7 [11.9 kB] Fetched 11.9 kB in 0s (227 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmprx9rnj9s/libscotcherr-dev_7.0.10-7_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 pkgconf riscv64 1.8.1-4+b1 [26.5 kB] Fetched 26.5 kB in 0s (502 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpd4091wn0/pkgconf_1.8.1-4+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libbsd0 riscv64 0.12.2-2+b1 [132 kB] Fetched 132 kB in 0s (5816 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpg8xws6vn/libbsd0_0.12.2-2+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libltdl7 riscv64 2.5.4-9 [416 kB] Fetched 416 kB in 0s (6793 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpu2h1tnp2/libltdl7_2.5.4-9_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 python3 riscv64 3.13.9-3 [27.6 kB] Fetched 27.6 kB in 0s (522 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_jldb409/python3_3.13.9-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhwloc-plugins riscv64 2.12.2-1+b1 [17.9 kB] Fetched 17.9 kB in 0s (340 kB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmpllbt33r3/libhwloc-plugins_2.12.2-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libunistring5 riscv64 1.3-2+b1 [471 kB] Fetched 471 kB in 0s (15.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpuuz296m3/libunistring5_1.3-2+b1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libc-gconv-modules-extra riscv64 2.42-10+b1 [1128 kB] Fetched 1128 kB in 0s (25.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp64je0kku/libc-gconv-modules-extra_2.42-10+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libjansson4 riscv64 2.14-2+b4 [40.1 kB] Fetched 40.1 kB in 0s (1957 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphpmold9w/libjansson4_2.14-2+b4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgfortran5 riscv64 15.2.0-12 [420 kB] Fetched 420 kB in 0s (6852 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5r4_r7c0/libgfortran5_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 cpp riscv64 4:15.2.0-5 [1572 B] Fetched 1572 B in 0s (79.8 kB/s) dpkg-name: info: moved 'cpp_4%3a15.2.0-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmp_y41nd_m/cpp_15.2.0-5_riscv64.deb' Downloading dependency 1 of 392: openssl-provider-legacy:riscv64=3.5.4-1+b1 Downloading dependency 2 of 392: mawk:riscv64=1.3.4.20250131-2 Downloading dependency 3 of 392: libparpack2t64:riscv64=3.9.1-6+b1 Downloading dependency 4 of 392: libxdmcp6:riscv64=1:1.1.5-2 Downloading dependency 5 of 392: libzstd1:riscv64=1.5.7+dfsg-3 Downloading dependency 6 of 392: libbrotli-dev:riscv64=1.1.0-2+b9 Downloading dependency 7 of 392: libcxsparse4:riscv64=1:7.12.1+dfsg-1 Downloading dependency 8 of 392: libxcb1:riscv64=1.17.0-2+b2 Downloading dependency 9 of 392: libhypre64-dev:riscv64=3.0.0-5 Downloading dependency 10 of 392: libcom-err2:riscv64=1.47.2-3+b8 Downloading dependency 11 of 392: libcamd3:riscv64=1:7.12.1+dfsg-1 Downloading dependency 12 of 392: libucx0:riscv64=1.20.0+ds-4 Downloading dependency 13 of 392: libgmp10:riscv64=2:6.3.0+dfsg-5+b1 Downloading dependency 14 of 392: libcbor0.10:riscv64=0.10.2-2.1 Downloading dependency 15 of 392: libfortran-toml-0:riscv64=0.4.3-2 Downloading dependency 16 of 392: fortran-fpm:riscv64=0.12.0-6 Downloading dependency 17 of 392: libarpack2-dev:riscv64=3.9.1-6+b1 Downloading dependency 18 of 392: libmunge2:riscv64=0.5.16-1+b1 Downloading dependency 19 of 392: dh-autoreconf:riscv64=21 Downloading dependency 20 of 392: comerr-dev:riscv64=2.1-1.47.2-3+b8 Downloading dependency 21 of 392: libptscotch-64i-7.0:riscv64=7.0.10-7 Downloading dependency 22 of 392: librtmp1:riscv64=2.4+20151223.gitfa8646d.1-3+b1 Downloading dependency 23 of 392: gcc-15-base:riscv64=15.2.0-12 Downloading dependency 24 of 392: fonts-mathjax:riscv64=2.7.9+dfsg-1 Downloading dependency 25 of 392: libatomic1:riscv64=15.2.0-12 Downloading dependency 26 of 392: libxdmcp-dev:riscv64=1:1.1.5-2 Downloading dependency 27 of 392: libssl3t64:riscv64=3.5.4-1+b1 Downloading dependency 28 of 392: libxau-dev:riscv64=1:1.0.11-1+b1 Downloading dependency 29 of 392: dh-python:riscv64=7.20260125 Downloading dependency 30 of 392: libxau6:riscv64=1:1.0.11-1+b1 Downloading dependency 31 of 392: libpipeline1:riscv64=1.5.8-2 Downloading dependency 32 of 392: libjpeg-dev:riscv64=1:2.1.5-4 Downloading dependency 33 of 392: libbtf2:riscv64=1:7.12.1+dfsg-1 Downloading dependency 34 of 392: libklu2:riscv64=1:7.12.1+dfsg-1 Downloading dependency 35 of 392: 
libxnvctrl0:riscv64=535.171.04-1+b3 Downloading dependency 36 of 392: libltdl-dev:riscv64=2.5.4-9 Downloading dependency 37 of 392: libattr1:riscv64=1:2.5.2-3+b1 Downloading dependency 38 of 392: debconf:riscv64=1.5.91 Downloading dependency 39 of 392: dash:riscv64=0.5.12-12 Downloading dependency 40 of 392: libopenmpi40:riscv64=5.0.9-1 Downloading dependency 41 of 392: gfortran-15-riscv64-linux-gnu:riscv64=15.2.0-12 Downloading dependency 42 of 392: libk5crypto3:riscv64=1.22.1-2 Downloading dependency 43 of 392: libpkgconf3:riscv64=1.8.1-4+b1 Downloading dependency 44 of 392: libkdb5-10t64:riscv64=1.22.1-2 Downloading dependency 45 of 392: libpetsc3.24-dev-common:riscv64=3.24.3+dfsg1-1 Downloading dependency 46 of 392: zlib1g-dev:riscv64=1:1.3.dfsg+really1.3.1-1+b2 Downloading dependency 47 of 392: libnghttp2-dev:riscv64=1.64.0-1.1+b1 Downloading dependency 48 of 392: libscalapack-openmpi2.2:riscv64=2.2.2-5 Downloading dependency 49 of 392: libgdbm6t64:riscv64=1.26-1+b1 Downloading dependency 50 of 392: libkrb5support0:riscv64=1.22.1-2 Downloading dependency 51 of 392: libcurl4t64:riscv64=8.18.0-2 Downloading dependency 52 of 392: libpetsc-real3.24:riscv64=3.24.3+dfsg1-1 Downloading dependency 53 of 392: libblas-dev:riscv64=3.12.1-7+b1 Downloading dependency 54 of 392: libngtcp2-dev:riscv64=1.16.0-1 Downloading dependency 55 of 392: libptscotch-64-7.0:riscv64=7.0.10-7 Downloading dependency 56 of 392: libopenblas64-0-pthread:riscv64=0.3.30+ds-3+b1 Downloading dependency 57 of 392: libgmpxx4ldbl:riscv64=2:6.3.0+dfsg-5+b1 Downloading dependency 58 of 392: libhypre-dev:riscv64=3.0.0-5 Downloading dependency 59 of 392: opencl-clhpp-headers:riscv64=3.0~2025.07.22-1 Downloading dependency 60 of 392: libfftw3-double3:riscv64=3.3.10-2+b2 Downloading dependency 61 of 392: libmumps64-dev:riscv64=5.8.1-2 Downloading dependency 62 of 392: libhdf5-openmpi-cpp-310:riscv64=1.14.6+repack-2 Downloading dependency 63 of 392: libyaml-dev:riscv64=0.2.5-2+b1 Downloading dependency 64 of 392: base-files:riscv64=14 Downloading dependency 65 of 392: libunbound8:riscv64=1.24.2-1 Downloading dependency 66 of 392: libsasl2-2:riscv64=2.1.28+dfsg1-10 Downloading dependency 67 of 392: build-essential:riscv64=12.12 Downloading dependency 68 of 392: libpam-modules-bin:riscv64=1.7.0-5+b1 Downloading dependency 69 of 392: libsmartcols1:riscv64=2.41.3-3 Downloading dependency 70 of 392: dh-fortran:riscv64=0.63 Downloading dependency 71 of 392: libnl-3-200:riscv64=3.12.0-2 Downloading dependency 72 of 392: libc-bin:riscv64=2.42-10+b1 Downloading dependency 73 of 392: libparpack2-dev:riscv64=3.9.1-6+b1 Downloading dependency 74 of 392: zlib1g:riscv64=1:1.3.dfsg+really1.3.1-1+b2 Downloading dependency 75 of 392: openmpi-common:riscv64=5.0.9-1 Downloading dependency 76 of 392: libgprofng0:riscv64=2.45.50.20260119-1 Downloading dependency 77 of 392: libscotch-dev:riscv64=7.0.10-7 Downloading dependency 78 of 392: sed:riscv64=4.9-2 Downloading dependency 79 of 392: libbinutils:riscv64=2.45.50.20260119-1 Downloading dependency 80 of 392: base-passwd:riscv64=3.6.8 Downloading dependency 81 of 392: libibverbs1:riscv64=61.0-2 Downloading dependency 82 of 392: libisl23:riscv64=0.27-1+b1 Downloading dependency 83 of 392: libblas64-3:riscv64=3.12.1-7+b1 Downloading dependency 84 of 392: libmpc3:riscv64=1.3.1-2+b1 Downloading dependency 85 of 392: libnl-route-3-200:riscv64=3.12.0-2 Downloading dependency 86 of 392: libsuperlu-dist9:riscv64=9.2.1+dfsg1-1 Downloading dependency 87 of 392: libnl-3-dev:riscv64=3.12.0-2 Downloading 
dependency 88 of 392: libkadm5srv-mit12:riscv64=1.22.1-2 Downloading dependency 89 of 392: login.defs:riscv64=1:4.19.0-4 Downloading dependency 90 of 392: x11proto-dev:riscv64=2024.1-1 Downloading dependency 91 of 392: intltool-debian:riscv64=0.35.0+20060710.6 Downloading dependency 92 of 392: openssh-client:riscv64=1:10.2p1-3 Downloading dependency 93 of 392: libubsan1:riscv64=15.2.0-12 Downloading dependency 94 of 392: libcrypt1:riscv64=1:4.5.1-1 Downloading dependency 95 of 392: gcc-15-riscv64-linux-gnu:riscv64=15.2.0-12 Downloading dependency 96 of 392: libmpfr6:riscv64=4.2.2-2+b1 Downloading dependency 97 of 392: libaec-dev:riscv64=1.1.4-2+b1 Downloading dependency 98 of 392: opencl-c-headers:riscv64=3.0~2025.07.22-2 Downloading dependency 99 of 392: libscalapack-openmpi-dev:riscv64=2.2.2-5 Downloading dependency 100 of 392: gcc-riscv64-linux-gnu:riscv64=4:15.2.0-5 Downloading dependency 101 of 392: libptscotch-64-dev:riscv64=7.0.10-7 Downloading dependency 102 of 392: libhdf5-openmpi-310:riscv64=1.14.6+repack-2 Downloading dependency 103 of 392: libdebconfclient0:riscv64=0.282+b2 Downloading dependency 104 of 392: python3-click:riscv64=8.2.0+0.really.8.1.8-1 Downloading dependency 105 of 392: libscotch-7.0c:riscv64=7.0.10-7 Downloading dependency 106 of 392: perl:riscv64=5.40.1-7 Downloading dependency 107 of 392: libcap2:riscv64=1:2.75-10+b5 Downloading dependency 108 of 392: sensible-utils:riscv64=0.0.26 Downloading dependency 109 of 392: libscotch-64-7.0:riscv64=7.0.10-7 Downloading dependency 110 of 392: libarpack2t64:riscv64=3.9.1-6+b1 Downloading dependency 111 of 392: libc-dev-bin:riscv64=2.42-10+b1 Downloading dependency 112 of 392: libscotcherr-dev:riscv64=7.0.10-7 Downloading dependency 113 of 392: pkgconf:riscv64=1.8.1-4+b1 Downloading dependency 114 of 392: libbsd0:riscv64=0.12.2-2+b1 Downloading dependency 115 of 392: libltdl7:riscv64=2.5.4-9 Downloading dependency 116 of 392: python3:riscv64=3.13.9-3 Downloading dependency 117 of 392: libhwloc-plugins:riscv64=2.12.2-1+b1 Downloading dependency 118 of 392: libunistring5:riscv64=1.3-2+b1 Downloading dependency 119 of 392: libc-gconv-modules-extra:riscv64=2.42-10+b1 Downloading dependency 120 of 392: libjansson4:riscv64=2.14-2+b4 Downloading dependency 121 of 392: libgfortran5:riscv64=15.2.0-12 Downloading dependency 122 of 392: cpp:riscv64=4:15.2.0-5 Downloading dependency 123 of 392: libptscotch-dev:riscv64=7.0.10-7Get:1 http://deb.debian.org/debian unstable/main riscv64 libptscotch-dev riscv64 7.0.10-7 [846 kB] Fetched 846 kB in 0s (11.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpgwia0_08/libptscotch-dev_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libssh2-1-dev riscv64 1.11.1-1+b1 [593 kB] Fetched 593 kB in 0s (9113 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpkq5nn44b/libssh2-1-dev_1.11.1-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libtool all 2.5.4-9 [540 kB] Fetched 540 kB in 0s (17.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp6jb5k3on/libtool_2.5.4-9_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 ncurses-base all 6.6+20251231-1 [277 kB] Fetched 277 kB in 0s (10.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpfu3srslv/ncurses-base_6.6+20251231-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 patchelf riscv64 0.18.0-1.4 [103 kB] Fetched 103 kB in 0s (4723 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpuerjic9d/patchelf_0.18.0-1.4_riscv64.deb' 
Get:1 http://deb.debian.org/debian unstable/main riscv64 perl-base riscv64 5.40.1-7 [1678 kB] Fetched 1678 kB in 0s (18.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpwxz19mzn/perl-base_5.40.1-7_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 binutils riscv64 2.45.50.20260119-1 [285 kB] Fetched 285 kB in 0s (10.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpgec5vsy4/binutils_2.45.50.20260119-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libscotcherr-7.0 riscv64 7.0.10-7 [12.5 kB] Fetched 12.5 kB in 0s (239 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpzh80w1l1/libscotcherr-7.0_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgnutls-openssl27t64 riscv64 3.8.11-3 [467 kB] Fetched 467 kB in 0s (7494 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvednx8_m/libgnutls-openssl27t64_3.8.11-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-dev riscv64 3.3.10-2+b2 [1479 kB] Fetched 1479 kB in 0s (16.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpk0oq5ftl/libfftw3-dev_3.3.10-2+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libparu1 riscv64 1:7.12.1+dfsg-1 [73.8 kB] Fetched 73.8 kB in 0s (1442 kB/s) dpkg-name: info: moved 'libparu1_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp7eo0ovoe/libparu1_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpsl5t64 riscv64 0.21.2-1.1+b2 [59.8 kB] Fetched 59.8 kB in 0s (2846 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp4_xye8d_/libpsl5t64_0.21.2-1.1+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libudev1 riscv64 259-1 [159 kB] Fetched 159 kB in 0s (6841 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9n1iaioc/libudev1_259-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libptscotch-64i-dev riscv64 7.0.10-7 [17.1 kB] Fetched 17.1 kB in 0s (326 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpua579d1e/libptscotch-64i-dev_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-single3 riscv64 3.3.10-2+b2 [379 kB] Fetched 379 kB in 0s (6265 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp7hrhnl2f/libfftw3-single3_3.3.10-2+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 bzip2 riscv64 1.0.8-6+b1 [40.8 kB] Fetched 40.8 kB in 0s (1967 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp37rmbsg7/bzip2_1.0.8-6+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libtasn1-6 riscv64 4.21.0-2 [50.7 kB] Fetched 50.7 kB in 0s (937 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphymnx15_/libtasn1-6_4.21.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpmix2t64 riscv64 6.0.0+really5.0.9-3 [661 kB] Fetched 661 kB in 0s (9979 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpt4pt29qd/libpmix2t64_6.0.0+really5.0.9-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcurl4-openssl-dev riscv64 8.18.0-2 [1329 kB] Fetched 1329 kB in 0s (16.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8wawiag_/libcurl4-openssl-dev_8.18.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpetsc64-real3.24 riscv64 3.24.3+dfsg1-1 [6686 kB] Fetched 6686 kB in 0s (25.6 MB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmp1zi7ofw5/libpetsc64-real3.24_3.24.3+dfsg1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnghttp3-9 riscv64 1.12.0-1 [68.4 kB] Fetched 68.4 kB in 0s (3190 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmuw8wpnk/libnghttp3-9_1.12.0-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 coreutils riscv64 9.7-3 [3036 kB] Fetched 3036 kB in 0s (25.1 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmptbxw9umi/coreutils_9.7-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 patch riscv64 2.8-2 [134 kB] Fetched 134 kB in 0s (5875 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpx8ehkap3/patch_2.8-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 g++ riscv64 4:15.2.0-5 [1332 B] Fetched 1332 B in 0s (67.2 kB/s) dpkg-name: info: moved 'g++_4%3a15.2.0-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmp1rta5dsb/g++_15.2.0-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnuma1 riscv64 2.0.19-1+b1 [23.3 kB] Fetched 23.3 kB in 0s (1143 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpx58ogd09/libnuma1_2.0.19-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmumps-64pord-5.8 riscv64 5.8.1-2 [1838 kB] Fetched 1838 kB in 0s (18.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpkfmmhaz5/libmumps-64pord-5.8_5.8.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libp11-kit-dev riscv64 0.25.10-1+b1 [221 kB] Fetched 221 kB in 0s (3850 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpao4xf_c6/libp11-kit-dev_0.25.10-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-mpi-dev riscv64 3.3.10-2+b2 [76.0 kB] Fetched 76.0 kB in 0s (1405 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpyhzt236v/libfftw3-mpi-dev_3.3.10-2+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libscotch-64-dev riscv64 7.0.10-7 [19.7 kB] Fetched 19.7 kB in 0s (987 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp38gm2se9/libscotch-64-dev_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhwloc15 riscv64 2.12.2-1+b1 [157 kB] Fetched 157 kB in 0s (2803 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbuyj8po0/libhwloc15_2.12.2-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libffi8 riscv64 3.5.2-3+b1 [22.3 kB] Fetched 22.3 kB in 0s (1102 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpno7g9e09/libffi8_3.5.2-3+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 readline-common all 8.3-3 [74.8 kB] Fetched 74.8 kB in 0s (1380 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbb3cf2s2/readline-common_8.3-3_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 grep riscv64 3.12-1 [442 kB] Fetched 442 kB in 0s (15.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplpwfxhnr/grep_3.12-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libncursesw6 riscv64 6.6+20251231-1 [141 kB] Fetched 141 kB in 0s (6208 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9cn069q4/libncursesw6_6.6+20251231-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 ibverbs-providers riscv64 61.0-2 [400 kB] Fetched 400 kB in 0s (6438 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpl_efx4m6/ibverbs-providers_61.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 
libkadm5clnt-mit12 riscv64 1.22.1-2 [42.2 kB] Fetched 42.2 kB in 0s (792 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpts9t1w75/libkadm5clnt-mit12_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 python3-minimal riscv64 3.13.9-3 [27.6 kB] Fetched 27.6 kB in 0s (523 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpyg4g04x6/python3-minimal_3.13.9-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-hl-cpp-310 riscv64 1.14.6+repack-2 [19.8 kB] Fetched 19.8 kB in 0s (376 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpu9pcxlmk/libhdf5-openmpi-hl-cpp-310_1.14.6+repack-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmount1 riscv64 2.41.3-3 [226 kB] Fetched 226 kB in 0s (9178 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5sck_jdx/libmount1_2.41.3-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 debianutils riscv64 5.23.2 [91.7 kB] Fetched 91.7 kB in 0s (4223 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqd7buyb0/debianutils_5.23.2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 python3.13 riscv64 3.13.11-1+b1 [770 kB] Fetched 770 kB in 0s (11.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmptdn0x9dl/python3.13_3.13.11-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfortran-jonquil-0 riscv64 0.3.0-4 [19.5 kB] Fetched 19.5 kB in 0s (370 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpg1bvynhq/libfortran-jonquil-0_0.3.0-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpetsc-real3.24-dev riscv64 3.24.3+dfsg1-1 [13.5 MB] Fetched 13.5 MB in 0s (27.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp07kmhp6i/libpetsc-real3.24-dev_3.24.3+dfsg1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libdb5.3t64 riscv64 5.3.28+dfsg2-11 [719 kB] Fetched 719 kB in 0s (20.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbgg09u35/libdb5.3t64_5.3.28+dfsg2-11_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gcc-15 riscv64 15.2.0-12 [519 kB] Fetched 519 kB in 0s (16.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpsqijdebk/gcc-15_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libtsan2 riscv64 15.2.0-12 [2654 kB] Fetched 2654 kB in 0s (35.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3y4mgksz/libtsan2_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libldap2 riscv64 2.6.10+dfsg-1+b1 [196 kB] Fetched 196 kB in 0s (8167 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpx6wkn3st/libldap2_2.6.10+dfsg-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgcc-15-dev riscv64 15.2.0-12 [5623 kB] Fetched 5623 kB in 0s (41.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpo6z5k9zw/libgcc-15-dev_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 liblapack-dev riscv64 3.12.1-7+b1 [12.0 MB] Fetched 12.0 MB in 0s (28.1 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpumw41z2y/liblapack-dev_3.12.1-7+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 ocl-icd-libopencl1 riscv64 2.3.4-1 [42.3 kB] Fetched 42.3 kB in 0s (793 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmptmfwkj3a/ocl-icd-libopencl1_2.3.4-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libspqr4 riscv64 
1:7.12.1+dfsg-1 [139 kB] Fetched 139 kB in 0s (2504 kB/s) dpkg-name: info: moved 'libspqr4_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpiia1xi2h/libspqr4_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcolamd3 riscv64 1:7.12.1+dfsg-1 [41.3 kB] Fetched 41.3 kB in 0s (773 kB/s) dpkg-name: info: moved 'libcolamd3_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmprqq740xm/libcolamd3_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenblas64-0 riscv64 0.3.30+ds-3+b1 [43.8 kB] Fetched 43.8 kB in 0s (817 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmppbt71yy1/libopenblas64-0_0.3.30+ds-3+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenblas64-pthread-dev riscv64 0.3.30+ds-3+b1 [8856 kB] Fetched 8856 kB in 0s (27.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0fvwxd01/libopenblas64-pthread-dev_0.3.30+ds-3+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 findutils riscv64 4.10.0-3 [706 kB] Fetched 706 kB in 0s (20.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbkbja7gt/findutils_4.10.0-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenmpi-dev riscv64 5.0.9-1 [1090 kB] Fetched 1090 kB in 0s (14.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpv29hyx66/libopenmpi-dev_5.0.9-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmumps-headers-dev all 5.8.1-2 [36.4 kB] Fetched 36.4 kB in 0s (688 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpgbuxkklz/libmumps-headers-dev_5.8.1-2_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 netbase all 6.5 [12.4 kB] Fetched 12.4 kB in 0s (237 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa8lgqkt_/netbase_6.5_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libxext6 riscv64 2:1.3.4-1+b4 [51.1 kB] Fetched 51.1 kB in 0s (957 kB/s) dpkg-name: info: moved 'libxext6_2%3a1.3.4-1+b4_riscv64.deb' to '/srv/rebuilderd/tmp/tmps9hibdlw/libxext6_1.3.4-1+b4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libjs-jquery all 3.7.1+dfsg+~3.5.33-1 [319 kB] Fetched 319 kB in 0s (5385 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxga705bh/libjs-jquery_3.7.1+dfsg+~3.5.33-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgssrpc4t64 riscv64 1.22.1-2 [60.8 kB] Fetched 60.8 kB in 0s (1127 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpr5o5h57c/libgssrpc4t64_1.22.1-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libc6 riscv64 2.42-10+b1 [1422 kB] Fetched 1422 kB in 0s (16.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpell5syw1/libc6_2.42-10+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libstdc++6 riscv64 15.2.0-12 [715 kB] Fetched 715 kB in 0s (19.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpll3ysqkl/libstdc++6_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-2.1-7t64 riscv64 2.1.12-stable-10+b2 [183 kB] Fetched 183 kB in 0s (3208 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_6846loy/libevent-2.1-7t64_2.1.12-stable-10+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsuitesparseconfig7 riscv64 1:7.12.1+dfsg-1 [33.2 kB] Fetched 33.2 kB in 0s (628 kB/s) dpkg-name: info: moved 'libsuitesparseconfig7_1%3a7.12.1+dfsg-1_riscv64.deb' to 
'/srv/rebuilderd/tmp/tmpqq5qyth_/libsuitesparseconfig7_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libedit2 riscv64 3.1-20251016-1 [92.5 kB] Fetched 92.5 kB in 0s (4267 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpl0gx6xrt/libedit2_3.1-20251016-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libx11-dev riscv64 2:1.8.12-1+b1 [1504 kB] Fetched 1504 kB in 0s (16.4 MB/s) dpkg-name: info: moved 'libx11-dev_2%3a1.8.12-1+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpby2f1iur/libx11-dev_1.8.12-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsqlite3-0 riscv64 3.46.1-9 [910 kB] Fetched 910 kB in 0s (12.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnm459xsa/libsqlite3-0_3.46.1-9_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgomp1 riscv64 15.2.0-12 [132 kB] Fetched 132 kB in 0s (5861 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpeug_gqmf/libgomp1_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-long3 riscv64 3.3.10-2+b2 [701 kB] Fetched 701 kB in 0s (12.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqsilcdnm/libfftw3-long3_3.3.10-2+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmetis5 riscv64 5.1.0.dfsg-8 [164 kB] Fetched 164 kB in 0s (2929 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3wl7uqwm/libmetis5_5.1.0.dfsg-8_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhwloc-dev riscv64 2.12.2-1+b1 [577 kB] Fetched 577 kB in 0s (8910 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphtfc1y4p/libhwloc-dev_2.12.2-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmagic-mgc riscv64 1:5.46-5+b1 [338 kB] Fetched 338 kB in 0s (12.5 MB/s) dpkg-name: info: moved 'libmagic-mgc_1%3a5.46-5+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpcr_jdypg/libmagic-mgc_5.46-5+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 make riscv64 4.4.1-3 [463 kB] Fetched 463 kB in 0s (15.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3hxah8gg/make_4.4.1-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsemanage-common all 3.9-1 [7888 B] Fetched 7888 B in 0s (151 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpizse224u/libsemanage-common_3.9-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gfortran-15 riscv64 15.2.0-12 [18.5 kB] Fetched 18.5 kB in 0s (353 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphtg6ntd4/gfortran-15_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfile-stripnondeterminism-perl all 1.15.0-1 [19.9 kB] Fetched 19.9 kB in 0s (982 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpsttf__tu/libfile-stripnondeterminism-perl_1.15.0-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpsl-dev riscv64 0.21.2-1.1+b2 [91.7 kB] Fetched 91.7 kB in 0s (1683 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp4xu3cbbp/libpsl-dev_0.21.2-1.1+b2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 binutils-common riscv64 2.45.50.20260119-1 [2544 kB] Fetched 2544 kB in 0s (23.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0jqf4sst/binutils-common_2.45.50.20260119-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libreadline8t64 riscv64 8.3-3+b1 [180 kB] Fetched 180 kB in 0s (3191 
kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpcso067kr/libreadline8t64_8.3-3+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 linux-libc-dev all 6.18.5-1 [2554 kB] Fetched 2554 kB in 0s (23.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5l0sm_dr/linux-libc-dev_6.18.5-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsasl2-modules-db riscv64 2.1.28+dfsg1-10 [20.2 kB] Fetched 20.2 kB in 0s (381 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpgicqa671/libsasl2-modules-db_2.1.28+dfsg1-10_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcc1-0 riscv64 15.2.0-12 [40.4 kB] Fetched 40.4 kB in 0s (1983 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpl59da0qs/libcc1-0_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gzip riscv64 1.13-1 [139 kB] Fetched 139 kB in 0s (6137 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpiuqyjcb4/gzip_1.13-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-dev riscv64 1.14.6+repack-2 [7585 kB] Fetched 7585 kB in 0s (25.1 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3b7u0v_g/libhdf5-openmpi-dev_1.14.6+repack-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpcre2-8-0 riscv64 10.46-1+b1 [297 kB] Fetched 297 kB in 0s (11.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp69oef0b2/libpcre2-8-0_10.46-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libblkid1 riscv64 2.41.3-3 [194 kB] Fetched 194 kB in 0s (8124 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpo8or63f2/libblkid1_2.41.3-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 mpi-default-dev riscv64 1.20 [3504 B] Fetched 3504 B in 0s (67.3 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp1n8n3mvv/mpi-default-dev_1.20_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnghttp2-14 riscv64 1.64.0-1.1+b1 [78.1 kB] Fetched 78.1 kB in 0s (3682 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpcynicg7r/libnghttp2-14_1.64.0-1.1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpam-runtime all 1.7.0-5 [249 kB] Fetched 249 kB in 0s (9982 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5d1yuics/libpam-runtime_1.7.0-5_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libblas3 riscv64 3.12.1-7+b1 [123 kB] Fetched 123 kB in 0s (2230 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpogu6jej9/libblas3_3.12.1-7+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 tzdata all 2025c-3 [263 kB] Fetched 263 kB in 0s (4487 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxefns1_c/tzdata_2025c-3_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsuperlu-dev riscv64 7.0.1+dfsg1-2+b1 [22.3 kB] Fetched 22.3 kB in 0s (424 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9ys3d_x9/libsuperlu-dev_7.0.1+dfsg1-2+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libamd3 riscv64 1:7.12.1+dfsg-1 [50.1 kB] Fetched 50.1 kB in 0s (941 kB/s) dpkg-name: info: moved 'libamd3_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpulvt4mww/libamd3_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-extra-2.1-7t64 riscv64 2.1.12-stable-10+b2 [108 kB] Fetched 108 kB in 0s (1970 kB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmpgeh53ids/libevent-extra-2.1-7t64_2.1.12-stable-10+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 librtmp-dev riscv64 2.4+20151223.gitfa8646d.1-3+b1 [133 kB] Fetched 133 kB in 0s (2403 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpful4k5yq/librtmp-dev_2.4+20151223.gitfa8646d.1-3+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpetsc-complex3.24 riscv64 3.24.3+dfsg1-1 [6801 kB] Fetched 6801 kB in 0s (24.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpb8bo_6_e/libpetsc-complex3.24_3.24.3+dfsg1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-core-2.1-7t64 riscv64 2.1.12-stable-10+b2 [133 kB] Fetched 133 kB in 0s (2386 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmptr76jjw9/libevent-core-2.1-7t64_2.1.12-stable-10+b2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libsframe3 riscv64 2.45.50.20260119-1 [85.5 kB] Fetched 85.5 kB in 0s (4013 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpw7v0l5ym/libsframe3_2.45.50.20260119-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmd0 riscv64 1.1.0-2+b2 [36.7 kB] Fetched 36.7 kB in 0s (1813 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9hizvtk1/libmd0_1.1.0-2+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 nettle-dev riscv64 3.10.2-1 [1557 kB] Fetched 1557 kB in 0s (18.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpp8uf7ag0/nettle-dev_3.10.2-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libccolamd3 riscv64 1:7.12.1+dfsg-1 [47.4 kB] Fetched 47.4 kB in 0s (884 kB/s) dpkg-name: info: moved 'libccolamd3_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp0x8_0rge/libccolamd3_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libngtcp2-16 riscv64 1.16.0-1 [141 kB] Fetched 141 kB in 0s (6209 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpu8000p14/libngtcp2-16_1.16.0-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnettle8t64 riscv64 3.10.2-1 [332 kB] Fetched 332 kB in 0s (12.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpj1tq0pm4/libnettle8t64_3.10.2-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfile-libmagic-perl riscv64 1.23-2+b2 [30.9 kB] Fetched 30.9 kB in 0s (587 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmply_zlvj4/libfile-libmagic-perl_1.23-2+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfido2-1 riscv64 1.16.0-2+b1 [83.1 kB] Fetched 83.1 kB in 0s (1508 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdag83ga0/libfido2-1_1.16.0-2+b1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 passwd riscv64 1:4.19.0-4 [1288 kB] Fetched 1288 kB in 0s (14.6 MB/s) dpkg-name: info: moved 'passwd_1%3a4.19.0-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmpjz07katk/passwd_4.19.0-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcap-ng0 riscv64 0.8.5-4+b2 [17.1 kB] Fetched 17.1 kB in 0s (844 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpttyjgqqm/libcap-ng0_0.8.5-4+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-mpi-dev riscv64 1.14.6+repack-2 [14.2 kB] Fetched 14.2 kB in 0s (271 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpn81y1xkg/libhdf5-mpi-dev_1.14.6+repack-2_riscv64.deb' 
Get:1 http://deb.debian.org/debian unstable/main riscv64 libjpeg62-turbo riscv64 1:2.1.5-4 [155 kB] Fetched 155 kB in 0s (2782 kB/s) dpkg-name: info: moved 'libjpeg62-turbo_1%3a2.1.5-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmpivo0w418/libjpeg62-turbo_2.1.5-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libibmad5 riscv64 61.0-2 [45.0 kB] Fetched 45.0 kB in 0s (847 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpf7s1nsf3/libibmad5_61.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libexpat1 riscv64 2.7.3-2 [106 kB] Fetched 106 kB in 0s (1913 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpaee68y7c/libexpat1_2.7.3-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 dwz riscv64 0.16-2 [115 kB] Fetched 115 kB in 0s (5197 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbonu3r84/dwz_0.16-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 automake all 1:1.18.1-3 [878 kB] Fetched 878 kB in 0s (22.9 MB/s) dpkg-name: info: moved 'automake_1%3a1.18.1-3_all.deb' to '/srv/rebuilderd/tmp/tmpl3o6pphp/automake_1.18.1-3_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 krb5-multidev riscv64 1.22.1-2 [127 kB] Fetched 127 kB in 0s (2304 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa74q17w3/krb5-multidev_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libldl3 riscv64 1:7.12.1+dfsg-1 [34.2 kB] Fetched 34.2 kB in 0s (615 kB/s) dpkg-name: info: moved 'libldl3_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp8h3mcncl/libldl3_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 g++-15 riscv64 15.2.0-12 [25.0 kB] Fetched 25.0 kB in 0s (1250 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmph2y124tl/g++-15_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libjpeg62-turbo-dev riscv64 1:2.1.5-4 [420 kB] Fetched 420 kB in 0s (6737 kB/s) dpkg-name: info: moved 'libjpeg62-turbo-dev_1%3a2.1.5-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmpguj0rewa/libjpeg62-turbo-dev_2.1.5-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 util-linux riscv64 2.41.3-3 [1159 kB] Fetched 1159 kB in 0s (26.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpwt77p4y7/util-linux_2.41.3-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libtasn1-6-dev riscv64 4.21.0-2 [161 kB] Fetched 161 kB in 0s (2876 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpc0p20z7x/libtasn1-6-dev_4.21.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 python3.13-minimal riscv64 3.13.11-1+b1 [2172 kB] Fetched 2172 kB in 0s (20.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdz6gy11i/python3.13-minimal_3.13.11-1+b1_riscv64.deb' Downloading dependency 124 of 392: libssh2-1-dev:riscv64=1.11.1-1+b1 Downloading dependency 125 of 392: libtool:riscv64=2.5.4-9 Downloading dependency 126 of 392: ncurses-base:riscv64=6.6+20251231-1 Downloading dependency 127 of 392: patchelf:riscv64=0.18.0-1.4 Downloading dependency 128 of 392: perl-base:riscv64=5.40.1-7 Downloading dependency 129 of 392: binutils:riscv64=2.45.50.20260119-1 Downloading dependency 130 of 392: libscotcherr-7.0:riscv64=7.0.10-7 Downloading dependency 131 of 392: libgnutls-openssl27t64:riscv64=3.8.11-3 Downloading dependency 132 of 392: libfftw3-dev:riscv64=3.3.10-2+b2 Downloading dependency 133 of 392: libparu1:riscv64=1:7.12.1+dfsg-1 Downloading dependency 134 of 
392: libpsl5t64:riscv64=0.21.2-1.1+b2 Downloading dependency 135 of 392: libudev1:riscv64=259-1 Downloading dependency 136 of 392: libptscotch-64i-dev:riscv64=7.0.10-7 Downloading dependency 137 of 392: libfftw3-single3:riscv64=3.3.10-2+b2 Downloading dependency 138 of 392: bzip2:riscv64=1.0.8-6+b1 Downloading dependency 139 of 392: libtasn1-6:riscv64=4.21.0-2 Downloading dependency 140 of 392: libpmix2t64:riscv64=6.0.0+really5.0.9-3 Downloading dependency 141 of 392: libcurl4-openssl-dev:riscv64=8.18.0-2 Downloading dependency 142 of 392: libpetsc64-real3.24:riscv64=3.24.3+dfsg1-1 Downloading dependency 143 of 392: libnghttp3-9:riscv64=1.12.0-1 Downloading dependency 144 of 392: coreutils:riscv64=9.7-3 Downloading dependency 145 of 392: patch:riscv64=2.8-2 Downloading dependency 146 of 392: g++:riscv64=4:15.2.0-5 Downloading dependency 147 of 392: libnuma1:riscv64=2.0.19-1+b1 Downloading dependency 148 of 392: libmumps-64pord-5.8:riscv64=5.8.1-2 Downloading dependency 149 of 392: libp11-kit-dev:riscv64=0.25.10-1+b1 Downloading dependency 150 of 392: libfftw3-mpi-dev:riscv64=3.3.10-2+b2 Downloading dependency 151 of 392: libscotch-64-dev:riscv64=7.0.10-7 Downloading dependency 152 of 392: libhwloc15:riscv64=2.12.2-1+b1 Downloading dependency 153 of 392: libffi8:riscv64=3.5.2-3+b1 Downloading dependency 154 of 392: readline-common:riscv64=8.3-3 Downloading dependency 155 of 392: grep:riscv64=3.12-1 Downloading dependency 156 of 392: libncursesw6:riscv64=6.6+20251231-1 Downloading dependency 157 of 392: ibverbs-providers:riscv64=61.0-2 Downloading dependency 158 of 392: libkadm5clnt-mit12:riscv64=1.22.1-2 Downloading dependency 159 of 392: python3-minimal:riscv64=3.13.9-3 Downloading dependency 160 of 392: libhdf5-openmpi-hl-cpp-310:riscv64=1.14.6+repack-2 Downloading dependency 161 of 392: libmount1:riscv64=2.41.3-3 Downloading dependency 162 of 392: debianutils:riscv64=5.23.2 Downloading dependency 163 of 392: python3.13:riscv64=3.13.11-1+b1 Downloading dependency 164 of 392: libfortran-jonquil-0:riscv64=0.3.0-4 Downloading dependency 165 of 392: libpetsc-real3.24-dev:riscv64=3.24.3+dfsg1-1 Downloading dependency 166 of 392: libdb5.3t64:riscv64=5.3.28+dfsg2-11 Downloading dependency 167 of 392: gcc-15:riscv64=15.2.0-12 Downloading dependency 168 of 392: libtsan2:riscv64=15.2.0-12 Downloading dependency 169 of 392: libldap2:riscv64=2.6.10+dfsg-1+b1 Downloading dependency 170 of 392: libgcc-15-dev:riscv64=15.2.0-12 Downloading dependency 171 of 392: liblapack-dev:riscv64=3.12.1-7+b1 Downloading dependency 172 of 392: ocl-icd-libopencl1:riscv64=2.3.4-1 Downloading dependency 173 of 392: libspqr4:riscv64=1:7.12.1+dfsg-1 Downloading dependency 174 of 392: libcolamd3:riscv64=1:7.12.1+dfsg-1 Downloading dependency 175 of 392: libopenblas64-0:riscv64=0.3.30+ds-3+b1 Downloading dependency 176 of 392: libopenblas64-pthread-dev:riscv64=0.3.30+ds-3+b1 Downloading dependency 177 of 392: findutils:riscv64=4.10.0-3 Downloading dependency 178 of 392: libopenmpi-dev:riscv64=5.0.9-1 Downloading dependency 179 of 392: libmumps-headers-dev:riscv64=5.8.1-2 Downloading dependency 180 of 392: netbase:riscv64=6.5 Downloading dependency 181 of 392: libxext6:riscv64=2:1.3.4-1+b4 Downloading dependency 182 of 392: libjs-jquery:riscv64=3.7.1+dfsg+~3.5.33-1 Downloading dependency 183 of 392: libgssrpc4t64:riscv64=1.22.1-2 Downloading dependency 184 of 392: libc6:riscv64=2.42-10+b1 Downloading dependency 185 of 392: libstdc++6:riscv64=15.2.0-12 Downloading dependency 186 of 392: 
libevent-2.1-7t64:riscv64=2.1.12-stable-10+b2 Downloading dependency 187 of 392: libsuitesparseconfig7:riscv64=1:7.12.1+dfsg-1 Downloading dependency 188 of 392: libedit2:riscv64=3.1-20251016-1 Downloading dependency 189 of 392: libx11-dev:riscv64=2:1.8.12-1+b1 Downloading dependency 190 of 392: libsqlite3-0:riscv64=3.46.1-9 Downloading dependency 191 of 392: libgomp1:riscv64=15.2.0-12 Downloading dependency 192 of 392: libfftw3-long3:riscv64=3.3.10-2+b2 Downloading dependency 193 of 392: libmetis5:riscv64=5.1.0.dfsg-8 Downloading dependency 194 of 392: libhwloc-dev:riscv64=2.12.2-1+b1 Downloading dependency 195 of 392: libmagic-mgc:riscv64=1:5.46-5+b1 Downloading dependency 196 of 392: make:riscv64=4.4.1-3 Downloading dependency 197 of 392: libsemanage-common:riscv64=3.9-1 Downloading dependency 198 of 392: gfortran-15:riscv64=15.2.0-12 Downloading dependency 199 of 392: libfile-stripnondeterminism-perl:riscv64=1.15.0-1 Downloading dependency 200 of 392: libpsl-dev:riscv64=0.21.2-1.1+b2 Downloading dependency 201 of 392: binutils-common:riscv64=2.45.50.20260119-1 Downloading dependency 202 of 392: libreadline8t64:riscv64=8.3-3+b1 Downloading dependency 203 of 392: linux-libc-dev:riscv64=6.18.5-1 Downloading dependency 204 of 392: libsasl2-modules-db:riscv64=2.1.28+dfsg1-10 Downloading dependency 205 of 392: libcc1-0:riscv64=15.2.0-12 Downloading dependency 206 of 392: gzip:riscv64=1.13-1 Downloading dependency 207 of 392: libhdf5-openmpi-dev:riscv64=1.14.6+repack-2 Downloading dependency 208 of 392: libpcre2-8-0:riscv64=10.46-1+b1 Downloading dependency 209 of 392: libblkid1:riscv64=2.41.3-3 Downloading dependency 210 of 392: mpi-default-dev:riscv64=1.20 Downloading dependency 211 of 392: libnghttp2-14:riscv64=1.64.0-1.1+b1 Downloading dependency 212 of 392: libpam-runtime:riscv64=1.7.0-5 Downloading dependency 213 of 392: libblas3:riscv64=3.12.1-7+b1 Downloading dependency 214 of 392: tzdata:riscv64=2025c-3 Downloading dependency 215 of 392: libsuperlu-dev:riscv64=7.0.1+dfsg1-2+b1 Downloading dependency 216 of 392: libamd3:riscv64=1:7.12.1+dfsg-1 Downloading dependency 217 of 392: libevent-extra-2.1-7t64:riscv64=2.1.12-stable-10+b2 Downloading dependency 218 of 392: librtmp-dev:riscv64=2.4+20151223.gitfa8646d.1-3+b1 Downloading dependency 219 of 392: libpetsc-complex3.24:riscv64=3.24.3+dfsg1-1 Downloading dependency 220 of 392: libevent-core-2.1-7t64:riscv64=2.1.12-stable-10+b2 Downloading dependency 221 of 392: libsframe3:riscv64=2.45.50.20260119-1 Downloading dependency 222 of 392: libmd0:riscv64=1.1.0-2+b2 Downloading dependency 223 of 392: nettle-dev:riscv64=3.10.2-1 Downloading dependency 224 of 392: libccolamd3:riscv64=1:7.12.1+dfsg-1 Downloading dependency 225 of 392: libngtcp2-16:riscv64=1.16.0-1 Downloading dependency 226 of 392: libnettle8t64:riscv64=3.10.2-1 Downloading dependency 227 of 392: libfile-libmagic-perl:riscv64=1.23-2+b2 Downloading dependency 228 of 392: libfido2-1:riscv64=1.16.0-2+b1 Downloading dependency 229 of 392: passwd:riscv64=1:4.19.0-4 Downloading dependency 230 of 392: libcap-ng0:riscv64=0.8.5-4+b2 Downloading dependency 231 of 392: libhdf5-mpi-dev:riscv64=1.14.6+repack-2 Downloading dependency 232 of 392: libjpeg62-turbo:riscv64=1:2.1.5-4 Downloading dependency 233 of 392: libibmad5:riscv64=61.0-2 Downloading dependency 234 of 392: libexpat1:riscv64=2.7.3-2 Downloading dependency 235 of 392: dwz:riscv64=0.16-2 Downloading dependency 236 of 392: automake:riscv64=1:1.18.1-3 Downloading dependency 237 of 392: krb5-multidev:riscv64=1.22.1-2 Downloading 
dependency 238 of 392: libldl3:riscv64=1:7.12.1+dfsg-1 Downloading dependency 239 of 392: g++-15:riscv64=15.2.0-12 Downloading dependency 240 of 392: libjpeg62-turbo-dev:riscv64=1:2.1.5-4 Downloading dependency 241 of 392: util-linux:riscv64=2.41.3-3 Downloading dependency 242 of 392: libtasn1-6-dev:riscv64=4.21.0-2 Downloading dependency 243 of 392: python3.13-minimal:riscv64=3.13.11-1+b1 Downloading dependency 244 of 392: libhogweed6t64:riscv64=3.10.2-1Get:1 http://deb.debian.org/debian unstable/main riscv64 libhogweed6t64 riscv64 3.10.2-1 [336 kB] Fetched 336 kB in 0s (12.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp22kerok2/libhogweed6t64_3.10.2-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmumps-dev riscv64 5.8.1-2 [7361 kB] Fetched 7361 kB in 0s (28.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpoaqo0qxh/libmumps-dev_5.8.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 media-types all 14.0.0 [30.8 kB] Fetched 30.8 kB in 0s (1539 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3aluzv_s/media-types_14.0.0_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 liblapack3 riscv64 3.12.1-7+b1 [1971 kB] Fetched 1971 kB in 0s (21.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpara145k7/liblapack3_3.12.1-7+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 g++-riscv64-linux-gnu riscv64 4:15.2.0-5 [1196 B] Fetched 1196 B in 0s (59.4 kB/s) dpkg-name: info: moved 'g++-riscv64-linux-gnu_4%3a15.2.0-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmpwg38b__t/g++-riscv64-linux-gnu_15.2.0-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 liblapack64-3 riscv64 3.12.1-7+b1 [1932 kB] Fetched 1932 kB in 0s (19.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmph1y_s_3s/liblapack64-3_3.12.1-7+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 mpi-default-bin riscv64 1.20 [2724 B] Fetched 2724 B in 0s (52.4 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpgawha0h7/mpi-default-bin_1.20_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgfortran-15-dev riscv64 15.2.0-12 [1310 kB] Fetched 1310 kB in 0s (15.1 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmporv4exg1/libgfortran-15-dev_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsepol2 riscv64 3.9-2 [306 kB] Fetched 306 kB in 0s (5130 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplsqj6e1m/libsepol2_3.9-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libibverbs-dev riscv64 61.0-2 [1870 kB] Fetched 1870 kB in 0s (18.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqpukyvtw/libibverbs-dev_61.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libssh2-1t64 riscv64 1.11.1-1+b1 [249 kB] Fetched 249 kB in 0s (10.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpha1bpd9l/libssh2-1t64_1.11.1-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libscotch-64i-7.0 riscv64 7.0.10-7 [253 kB] Fetched 253 kB in 0s (4402 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnbqgbhvf/libscotch-64i-7.0_7.0.10-7_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libctf-nobfd0 riscv64 2.45.50.20260119-1 [163 kB] Fetched 163 kB in 0s (7020 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp4nerxb0k/libctf-nobfd0_2.45.50.20260119-1_riscv64.deb' Get:1 
http://deb.debian.org/debian unstable/main riscv64 libelf1t64 riscv64 0.194-1 [190 kB] Fetched 190 kB in 0s (7941 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp1facwewv/libelf1t64_0.194-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 liblzma5 riscv64 5.8.2-2 [329 kB] Fetched 329 kB in 0s (5462 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmploxb7b8n/liblzma5_5.8.2-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 cpp-15 riscv64 15.2.0-12 [1276 B] Fetched 1276 B in 0s (63.6 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpgogdc8jn/cpp-15_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsuitesparse-mongoose3 riscv64 1:7.12.1+dfsg-1 [57.6 kB] Fetched 57.6 kB in 0s (1075 kB/s) dpkg-name: info: moved 'libsuitesparse-mongoose3_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp4493pk4r/libsuitesparse-mongoose3_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libasan8 riscv64 15.2.0-12 [2940 kB] Fetched 2940 kB in 0s (25.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpgvpd9eo7/libasan8_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 groff-base riscv64 1.23.0-10 [1163 kB] Fetched 1163 kB in 0s (26.1 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8mx87w0v/groff-base_1.23.0-10_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnuma-dev riscv64 2.0.19-1+b1 [74.7 kB] Fetched 74.7 kB in 0s (3478 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpehhl6bri/libnuma-dev_2.0.19-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpetsc-complex3.24-dev riscv64 3.24.3+dfsg1-1 [13.5 MB] Fetched 13.5 MB in 0s (27.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmps114zj5n/libpetsc-complex3.24-dev_3.24.3+dfsg1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 xtrans-dev all 1.6.0-1 [93.5 kB] Fetched 93.5 kB in 0s (1712 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqqjdavl5/xtrans-dev_1.6.0-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libitm1 riscv64 15.2.0-12 [25.4 kB] Fetched 25.4 kB in 0s (1237 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9ngiyfka/libitm1_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libbz2-1.0 riscv64 1.0.8-6+b1 [40.1 kB] Fetched 40.1 kB in 0s (1959 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3wl0tz4z/libbz2-1.0_1.0.8-6+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-hl-310 riscv64 1.14.6+repack-2 [67.1 kB] Fetched 67.1 kB in 0s (1252 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp6_aqyqpp/libhdf5-openmpi-hl-310_1.14.6+repack-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgmp-dev riscv64 2:6.3.0+dfsg-5+b1 [1103 kB] Fetched 1103 kB in 0s (25.4 MB/s) dpkg-name: info: moved 'libgmp-dev_2%3a6.3.0+dfsg-5+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpzp7o4nwy/libgmp-dev_6.3.0+dfsg-5+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 adduser all 3.154 [210 kB] Fetched 210 kB in 0s (3679 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxrgy7kml/adduser_3.154_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-bin riscv64 3.3.10-2+b2 [44.7 kB] Fetched 44.7 kB in 0s (839 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdmppbrn9/libfftw3-bin_3.3.10-2+b2_riscv64.deb' Get:1 
http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libctf0 riscv64 2.45.50.20260119-1 [96.3 kB] Fetched 96.3 kB in 0s (4449 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp2ch4680r/libctf0_2.45.50.20260119-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsemanage2 riscv64 3.9-1+b1 [95.0 kB] Fetched 95.0 kB in 0s (1750 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpq1tmxqkc/libsemanage2_3.9-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 m4 riscv64 1.4.20-2 [323 kB] Fetched 323 kB in 0s (12.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5gpywc1p/m4_1.4.20-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libacl1 riscv64 2.3.2-2+b2 [33.0 kB] Fetched 33.0 kB in 0s (1613 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvyzeluqq/libacl1_2.3.2-2+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libp11-kit0 riscv64 0.25.10-1+b1 [450 kB] Fetched 450 kB in 0s (15.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp7_5mm855/libp11-kit0_0.25.10-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libptscotch-7.0c riscv64 7.0.10-7 [168 kB] Fetched 168 kB in 0s (3002 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa3amarus/libptscotch-7.0c_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgnutls28-dev riscv64 3.8.11-3 [2886 kB] Fetched 2886 kB in 0s (23.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdbge3y2b/libgnutls28-dev_3.8.11-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libbrotli1 riscv64 1.1.0-2+b9 [342 kB] Fetched 342 kB in 0s (12.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpkbu5xck8/libbrotli1_1.1.0-2+b9_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgnutls30t64 riscv64 3.8.11-3 [1481 kB] Fetched 1481 kB in 0s (17.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0dar4gc3/libgnutls30t64_3.8.11-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libuchardet0 riscv64 0.0.8-2+b1 [68.9 kB] Fetched 68.9 kB in 0s (3248 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphsc_nkri/libuchardet0_0.0.8-2+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 xz-utils riscv64 5.8.2-2 [705 kB] Fetched 705 kB in 0s (20.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpn2ggoty4/xz-utils_5.8.2-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libdpkg-perl all 1.23.5 [668 kB] Fetched 668 kB in 0s (9939 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpjvbtprdu/libdpkg-perl_1.23.5_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libuuid1 riscv64 2.41.3-3 [40.3 kB] Fetched 40.3 kB in 0s (1978 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpr8v3ihnd/libuuid1_2.41.3-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libdebhelper-perl all 13.29 [92.6 kB] Fetched 92.6 kB in 0s (4270 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpsmmucr7z/libdebhelper-perl_13.29_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpython3.13-stdlib riscv64 3.13.11-1+b1 [1931 kB] Fetched 1931 kB in 0s (19.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplv_ua2zx/libpython3.13-stdlib_3.13.11-1+b1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 binutils-riscv64-linux-gnu riscv64 
2.45.50.20260119-1 [901 kB] Fetched 901 kB in 0s (22.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmph710o7mo/binutils-riscv64-linux-gnu_2.45.50.20260119-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libngtcp2-crypto-ossl-dev riscv64 1.16.0-1 [51.8 kB] Fetched 51.8 kB in 0s (2509 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpto1mbypj/libngtcp2-crypto-ossl-dev_1.16.0-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpython3-stdlib riscv64 3.13.9-3 [10.6 kB] Fetched 10.6 kB in 0s (202 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_g41ekvf/libpython3-stdlib_3.13.9-3_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libzstd-dev riscv64 1.5.7+dfsg-3 [1588 kB] Fetched 1588 kB in 0s (16.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvnafwhyu/libzstd-dev_1.5.7+dfsg-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 cpp-15-riscv64-linux-gnu riscv64 15.2.0-12 [14.7 MB] Fetched 14.7 MB in 0s (40.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpon640eu2/cpp-15-riscv64-linux-gnu_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 man-db riscv64 2.13.1-1 [1458 kB] Fetched 1458 kB in 0s (28.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpacps3o2u/man-db_2.13.1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libptscotcherr-dev riscv64 7.0.10-7 [11.9 kB] Fetched 11.9 kB in 0s (228 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_7gsjt_r/libptscotcherr-dev_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfabric1 riscv64 2.1.0-1.1+b1 [600 kB] Fetched 600 kB in 0s (9205 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpk5nbi2xq/libfabric1_2.1.0-1.1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-fortran-310 riscv64 1.14.6+repack-2 [112 kB] Fetched 112 kB in 0s (2041 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmps3fvwwlm/libhdf5-openmpi-fortran-310_1.14.6+repack-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libjs-jquery-ui all 1.13.2+dfsg-1 [250 kB] Fetched 250 kB in 0s (4349 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8ecdwjtb/libjs-jquery-ui_1.13.2+dfsg-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 debhelper all 13.29 [943 kB] Fetched 943 kB in 0s (23.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdr38q74h/debhelper_13.29_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgdbm-compat4t64 riscv64 1.26-1+b1 [53.1 kB] Fetched 53.1 kB in 0s (2568 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpfktuqf_b/libgdbm-compat4t64_1.26-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsuperlu7 riscv64 7.0.1+dfsg1-2+b1 [157 kB] Fetched 157 kB in 0s (2815 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5xinp66o/libsuperlu7_7.0.1+dfsg1-2+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsystemd0 riscv64 259-1 [480 kB] Fetched 480 kB in 0s (16.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqpqlew7o/libsystemd0_259-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libptscotcherr-7.0 riscv64 7.0.10-7 [12.5 kB] Fetched 12.5 kB in 0s (510 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmptgkgs8zx/libptscotcherr-7.0_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian 
unstable/main riscv64 libfftw3-mpi3 riscv64 3.3.10-2+b2 [57.4 kB] Fetched 57.4 kB in 0s (1077 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpwei_g064/libfftw3-mpi3_3.3.10-2+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 autotools-dev all 20240727.1 [60.2 kB] Fetched 60.2 kB in 0s (2897 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxq_cid7e/autotools-dev_20240727.1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libtinfo6 riscv64 6.6+20251231-1 [352 kB] Fetched 352 kB in 0s (12.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa74fszul/libtinfo6_6.6+20251231-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpetsc64-complex3.24-dev riscv64 3.24.3+dfsg1-1 [13.6 MB] Fetched 13.6 MB in 1s (25.1 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpprsi8hb1/libpetsc64-complex3.24-dev_3.24.3+dfsg1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libxml2-16 riscv64 2.15.1+dfsg-2+b1 [637 kB] Fetched 637 kB in 0s (19.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpjfux_yb8/libxml2-16_2.15.1+dfsg-2+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcholmod5 riscv64 1:7.12.1+dfsg-1 [654 kB] Fetched 654 kB in 0s (9927 kB/s) dpkg-name: info: moved 'libcholmod5_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpj7x04dif/libcholmod5_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 autopoint all 0.23.2-1 [772 kB] Fetched 772 kB in 0s (21.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnk7rnyh3/autopoint_0.23.2-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 dh-strip-nondeterminism all 1.15.0-1 [8812 B] Fetched 8812 B in 0s (367 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3_dx2or8/dh-strip-nondeterminism_1.15.0-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 rpcsvc-proto riscv64 1.4.3-1+b2 [62.3 kB] Fetched 62.3 kB in 0s (2940 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp17dg5e78/rpcsvc-proto_1.4.3-1+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libidn2-dev riscv64 2.3.8-4+b1 [142 kB] Fetched 142 kB in 0s (2566 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp1lc3kyp7/libidn2-dev_2.3.8-4+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsuitesparse-dev riscv64 1:7.12.1+dfsg-1 [4208 kB] Fetched 4208 kB in 0s (26.4 MB/s) dpkg-name: info: moved 'libsuitesparse-dev_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpow0xjzl1/libsuitesparse-dev_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libldap-dev riscv64 2.6.10+dfsg-1+b1 [635 kB] Fetched 635 kB in 0s (9674 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8fg66o9o/libldap-dev_2.6.10+dfsg-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libkrb5-3 riscv64 1.22.1-2 [346 kB] Fetched 346 kB in 0s (12.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpg51ki0g_/libkrb5-3_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 file riscv64 1:5.46-5+b1 [43.5 kB] Fetched 43.5 kB in 0s (2110 kB/s) dpkg-name: info: moved 'file_1%3a5.46-5+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpzzk8zi1w/file_5.46-5+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libngtcp2-crypto-ossl0 riscv64 1.16.0-1 [27.7 kB] Fetched 27.7 kB in 0s (1182 kB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmpfcox3mx5/libngtcp2-crypto-ossl0_1.16.0-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 dpkg-dev all 1.23.5 [1318 kB] Fetched 1318 kB in 0s (15.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplj9wuign/dpkg-dev_1.23.5_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 init-system-helpers all 1.69 [39.3 kB] Fetched 39.3 kB in 0s (1923 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9ctksyue/init-system-helpers_1.69_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsuperlu-dist-dev riscv64 9.2.1+dfsg1-1 [4845 kB] Fetched 4845 kB in 0s (31.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpd5czs6zv/libsuperlu-dist-dev_9.2.1+dfsg1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libxcb1-dev riscv64 1.17.0-2+b2 [263 kB] Fetched 263 kB in 0s (10.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpw28qyi_u/libxcb1-dev_1.17.0-2+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 chrpath riscv64 0.18-1 [14.0 kB] Fetched 14.0 kB in 0s (268 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpl4pf88eo/chrpath_0.18-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 ncurses-bin riscv64 6.6+20251231-1 [443 kB] Fetched 443 kB in 0s (15.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmppi1ai7f2/ncurses-bin_6.6+20251231-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 python3-magic all 2:0.4.27-3 [14.6 kB] Fetched 14.6 kB in 0s (274 kB/s) dpkg-name: info: moved 'python3-magic_2%3a0.4.27-3_all.deb' to '/srv/rebuilderd/tmp/tmpxj3hun5j/python3-magic_0.4.27-3_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libibumad3 riscv64 61.0-2 [30.0 kB] Fetched 30.0 kB in 0s (570 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0x28rvyb/libibumad3_61.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpetsc64-complex3.24 riscv64 3.24.3+dfsg1-1 [6766 kB] Fetched 6766 kB in 0s (23.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp1jfg4mez/libpetsc64-complex3.24_3.24.3+dfsg1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 librbio4 riscv64 1:7.12.1+dfsg-1 [48.6 kB] Fetched 48.6 kB in 0s (911 kB/s) dpkg-name: info: moved 'librbio4_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpfj33ywrv/librbio4_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgssapi-krb5-2 riscv64 1.22.1-2 [141 kB] Fetched 141 kB in 0s (6263 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmprag6hu2t/libgssapi-krb5-2_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libaudit-common all 1:4.1.2-1 [14.3 kB] Fetched 14.3 kB in 0s (726 kB/s) dpkg-name: info: moved 'libaudit-common_1%3a4.1.2-1_all.deb' to '/srv/rebuilderd/tmp/tmp5oj9k18u/libaudit-common_4.1.2-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 bsdextrautils riscv64 2.41.3-3 [102 kB] Fetched 102 kB in 0s (4694 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmperinco1a/bsdextrautils_2.41.3-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 po-debconf all 1.0.22 [216 kB] Fetched 216 kB in 0s (8929 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp6_bgezti/po-debconf_1.0.22_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 xorg-sgml-doctools all 1:1.11-1.1 [22.1 kB] Fetched 22.1 kB in 0s (419 kB/s) dpkg-name: info: moved 
'xorg-sgml-doctools_1%3a1.11-1.1_all.deb' to '/srv/rebuilderd/tmp/tmpix345eit/xorg-sgml-doctools_1.11-1.1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libaudit1 riscv64 1:4.1.2-1+b1 [58.7 kB] Fetched 58.7 kB in 0s (2820 kB/s) dpkg-name: info: moved 'libaudit1_1%3a4.1.2-1+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp48fggyk6/libaudit1_4.1.2-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gettext-base riscv64 0.23.2-1 [244 kB] Fetched 244 kB in 0s (9811 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9gqs3ijl/gettext-base_0.23.2-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgnutls-dane0t64 riscv64 3.8.11-3 [467 kB] Fetched 467 kB in 0s (7442 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqmqinc21/libgnutls-dane0t64_3.8.11-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 sysvinit-utils riscv64 3.15-6 [34.6 kB] Fetched 34.6 kB in 0s (1716 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpw9whi_l2/sysvinit-utils_3.15-6_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libperl5.40 riscv64 5.40.1-7 [3952 kB] Fetched 3952 kB in 0s (28.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpaf2kqe60/libperl5.40_5.40.1-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgcc-s1 riscv64 15.2.0-12 [61.5 kB] Fetched 61.5 kB in 0s (1117 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpah4dsajq/libgcc-s1_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libidn2-0 riscv64 2.3.8-4+b1 [110 kB] Fetched 110 kB in 0s (4963 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpj4my41v1/libidn2-0_2.3.8-4+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpetsc3.24-dev-examples all 3.24.3+dfsg1-1 [3571 kB] Fetched 3571 kB in 1s (4239 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpn_x4qfse/libpetsc3.24-dev-examples_3.24.3+dfsg1-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnl-route-3-dev riscv64 3.12.0-2 [626 kB] Fetched 626 kB in 0s (9605 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpj534mvi3/libnl-route-3-dev_3.12.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenblas64-dev riscv64 0.3.30+ds-3+b1 [43.8 kB] Fetched 43.8 kB in 0s (825 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpt2467n7k/libopenblas64-dev_0.3.30+ds-3+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 cpp-riscv64-linux-gnu riscv64 4:15.2.0-5 [5320 B] Fetched 5320 B in 0s (100 kB/s) dpkg-name: info: moved 'cpp-riscv64-linux-gnu_4%3a15.2.0-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmpbsi9olsu/cpp-riscv64-linux-gnu_15.2.0-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libjs-mathjax all 2.7.9+dfsg-1 [5667 kB] Fetched 5667 kB in 0s (29.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmprz6c_91d/libjs-mathjax_2.7.9+dfsg-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-pthreads-2.1-7t64 riscv64 2.1.12-stable-10+b2 [54.1 kB] Fetched 54.1 kB in 0s (1018 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpy77qgqyk/libevent-pthreads-2.1-7t64_2.1.12-stable-10+b2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libx11-data all 2:1.8.12-1 [343 kB] Fetched 343 kB in 0s (5782 kB/s) dpkg-name: info: moved 'libx11-data_2%3a1.8.12-1_all.deb' to '/srv/rebuilderd/tmp/tmp11u6fhzg/libx11-data_1.8.12-1_all.deb' Get:1 
http://deb.debian.org/debian unstable/main riscv64 perl-modules-5.40 all 5.40.1-7 [3012 kB] Fetched 3012 kB in 0s (25.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmonqtfv9/perl-modules-5.40_5.40.1-7_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gcc riscv64 4:15.2.0-5 [5156 B] Fetched 5156 B in 0s (262 kB/s) dpkg-name: info: moved 'gcc_4%3a15.2.0-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmpmh3a__0h/gcc_15.2.0-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libscotch-64i-dev riscv64 7.0.10-7 [19.7 kB] Fetched 19.7 kB in 0s (371 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpz3tah_y3/libscotch-64i-dev_7.0.10-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmagic1t64 riscv64 1:5.46-5+b1 [117 kB] Fetched 117 kB in 0s (5241 kB/s) dpkg-name: info: moved 'libmagic1t64_1%3a5.46-5+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp_u04vb5m/libmagic1t64_5.46-5+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gfortran riscv64 4:15.2.0-5 [1424 B] Fetched 1424 B in 0s (27.4 kB/s) dpkg-name: info: moved 'gfortran_4%3a15.2.0-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmpynq_wzmf/gfortran_15.2.0-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcombblas2.0.0t64 riscv64 2.0.0-7 [265 kB] Fetched 265 kB in 0s (4583 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpkma5y6bl/libcombblas2.0.0t64_2.0.0-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpython3.13-minimal riscv64 3.13.11-1+b1 [862 kB] Fetched 862 kB in 0s (12.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvs774d6c/libpython3.13-minimal_3.13.11-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gfortran-riscv64-linux-gnu riscv64 4:15.2.0-5 [1284 B] Fetched 1284 B in 0s (24.8 kB/s) dpkg-name: info: moved 'gfortran-riscv64-linux-gnu_4%3a15.2.0-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmpg99rj0ze/gfortran-riscv64-linux-gnu_15.2.0-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnghttp3-dev riscv64 1.12.0-1 [203 kB] Fetched 203 kB in 0s (8394 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp6vyyz0ca/libnghttp3-dev_1.12.0-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 tar riscv64 1.35+dfsg-3.1 [822 kB] Fetched 822 kB in 0s (22.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_rxe0nw7/tar_1.35+dfsg-3.1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmumps-5.8 riscv64 5.8.1-2 [1838 kB] Fetched 1838 kB in 0s (20.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp73jevx3_/libmumps-5.8_5.8.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpciaccess0 riscv64 0.17-3+b4 [51.4 kB] Fetched 51.4 kB in 0s (969 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_hv4sfar/libpciaccess0_0.17-3+b4_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libaec0 riscv64 1.1.4-2+b1 [24.1 kB] Fetched 24.1 kB in 0s (459 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_93zerey/libaec0_1.1.4-2+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 ocl-icd-opencl-dev riscv64 2.3.4-1 [8868 B] Fetched 8868 B in 0s (171 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbu3onqnm/ocl-icd-opencl-dev_2.3.4-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 openmpi-bin riscv64 5.0.9-1 [192 kB] Fetched 192 kB in 0s (3360 kB/s) dpkg-name: warning: 
skipping '/srv/rebuilderd/tmp/tmpw2jngex8/openmpi-bin_5.0.9-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libarchive-zip-perl all 1.68-1 [104 kB] Fetched 104 kB in 0s (4616 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa9ckajwb/libarchive-zip-perl_1.68-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libyaml-0-2 riscv64 0.2.5-2+b1 [56.2 kB] Fetched 56.2 kB in 0s (1055 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpcejzv35y/libyaml-0-2_0.2.5-2+b1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libsz2 riscv64 1.1.4-2+b1 [8148 B] Fetched 8148 B in 0s (157 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp117np4wc/libsz2_1.1.4-2+b1_riscv64.deb' Downloading dependency 245 of 392: libmumps-dev:riscv64=5.8.1-2 Downloading dependency 246 of 392: media-types:riscv64=14.0.0 Downloading dependency 247 of 392: liblapack3:riscv64=3.12.1-7+b1 Downloading dependency 248 of 392: g++-riscv64-linux-gnu:riscv64=4:15.2.0-5 Downloading dependency 249 of 392: liblapack64-3:riscv64=3.12.1-7+b1 Downloading dependency 250 of 392: mpi-default-bin:riscv64=1.20 Downloading dependency 251 of 392: libgfortran-15-dev:riscv64=15.2.0-12 Downloading dependency 252 of 392: libsepol2:riscv64=3.9-2 Downloading dependency 253 of 392: libibverbs-dev:riscv64=61.0-2 Downloading dependency 254 of 392: libssh2-1t64:riscv64=1.11.1-1+b1 Downloading dependency 255 of 392: libscotch-64i-7.0:riscv64=7.0.10-7 Downloading dependency 256 of 392: libctf-nobfd0:riscv64=2.45.50.20260119-1 Downloading dependency 257 of 392: libelf1t64:riscv64=0.194-1 Downloading dependency 258 of 392: liblzma5:riscv64=5.8.2-2 Downloading dependency 259 of 392: cpp-15:riscv64=15.2.0-12 Downloading dependency 260 of 392: libsuitesparse-mongoose3:riscv64=1:7.12.1+dfsg-1 Downloading dependency 261 of 392: libasan8:riscv64=15.2.0-12 Downloading dependency 262 of 392: groff-base:riscv64=1.23.0-10 Downloading dependency 263 of 392: libnuma-dev:riscv64=2.0.19-1+b1 Downloading dependency 264 of 392: libpetsc-complex3.24-dev:riscv64=3.24.3+dfsg1-1 Downloading dependency 265 of 392: xtrans-dev:riscv64=1.6.0-1 Downloading dependency 266 of 392: libitm1:riscv64=15.2.0-12 Downloading dependency 267 of 392: libbz2-1.0:riscv64=1.0.8-6+b1 Downloading dependency 268 of 392: libhdf5-openmpi-hl-310:riscv64=1.14.6+repack-2 Downloading dependency 269 of 392: libgmp-dev:riscv64=2:6.3.0+dfsg-5+b1 Downloading dependency 270 of 392: adduser:riscv64=3.154 Downloading dependency 271 of 392: libfftw3-bin:riscv64=3.3.10-2+b2 Downloading dependency 272 of 392: libctf0:riscv64=2.45.50.20260119-1 Downloading dependency 273 of 392: libsemanage2:riscv64=3.9-1+b1 Downloading dependency 274 of 392: m4:riscv64=1.4.20-2 Downloading dependency 275 of 392: libacl1:riscv64=2.3.2-2+b2 Downloading dependency 276 of 392: libp11-kit0:riscv64=0.25.10-1+b1 Downloading dependency 277 of 392: libptscotch-7.0c:riscv64=7.0.10-7 Downloading dependency 278 of 392: libgnutls28-dev:riscv64=3.8.11-3 Downloading dependency 279 of 392: libbrotli1:riscv64=1.1.0-2+b9 Downloading dependency 280 of 392: libgnutls30t64:riscv64=3.8.11-3 Downloading dependency 281 of 392: libuchardet0:riscv64=0.0.8-2+b1 Downloading dependency 282 of 392: xz-utils:riscv64=5.8.2-2 Downloading dependency 283 of 392: libdpkg-perl:riscv64=1.23.5 Downloading dependency 284 of 392: libuuid1:riscv64=2.41.3-3 Downloading dependency 285 of 392: libdebhelper-perl:riscv64=13.29 Downloading dependency 286 of 392: 
libpython3.13-stdlib:riscv64=3.13.11-1+b1 Downloading dependency 287 of 392: binutils-riscv64-linux-gnu:riscv64=2.45.50.20260119-1 Downloading dependency 288 of 392: libngtcp2-crypto-ossl-dev:riscv64=1.16.0-1 Downloading dependency 289 of 392: libpython3-stdlib:riscv64=3.13.9-3 Downloading dependency 290 of 392: libzstd-dev:riscv64=1.5.7+dfsg-3 Downloading dependency 291 of 392: cpp-15-riscv64-linux-gnu:riscv64=15.2.0-12 Downloading dependency 292 of 392: man-db:riscv64=2.13.1-1 Downloading dependency 293 of 392: libptscotcherr-dev:riscv64=7.0.10-7 Downloading dependency 294 of 392: libfabric1:riscv64=2.1.0-1.1+b1 Downloading dependency 295 of 392: libhdf5-openmpi-fortran-310:riscv64=1.14.6+repack-2 Downloading dependency 296 of 392: libjs-jquery-ui:riscv64=1.13.2+dfsg-1 Downloading dependency 297 of 392: debhelper:riscv64=13.29 Downloading dependency 298 of 392: libgdbm-compat4t64:riscv64=1.26-1+b1 Downloading dependency 299 of 392: libsuperlu7:riscv64=7.0.1+dfsg1-2+b1 Downloading dependency 300 of 392: libsystemd0:riscv64=259-1 Downloading dependency 301 of 392: libptscotcherr-7.0:riscv64=7.0.10-7 Downloading dependency 302 of 392: libfftw3-mpi3:riscv64=3.3.10-2+b2 Downloading dependency 303 of 392: autotools-dev:riscv64=20240727.1 Downloading dependency 304 of 392: libtinfo6:riscv64=6.6+20251231-1 Downloading dependency 305 of 392: libpetsc64-complex3.24-dev:riscv64=3.24.3+dfsg1-1 Downloading dependency 306 of 392: libxml2-16:riscv64=2.15.1+dfsg-2+b1 Downloading dependency 307 of 392: libcholmod5:riscv64=1:7.12.1+dfsg-1 Downloading dependency 308 of 392: autopoint:riscv64=0.23.2-1 Downloading dependency 309 of 392: dh-strip-nondeterminism:riscv64=1.15.0-1 Downloading dependency 310 of 392: rpcsvc-proto:riscv64=1.4.3-1+b2 Downloading dependency 311 of 392: libidn2-dev:riscv64=2.3.8-4+b1 Downloading dependency 312 of 392: libsuitesparse-dev:riscv64=1:7.12.1+dfsg-1 Downloading dependency 313 of 392: libldap-dev:riscv64=2.6.10+dfsg-1+b1 Downloading dependency 314 of 392: libkrb5-3:riscv64=1.22.1-2 Downloading dependency 315 of 392: file:riscv64=1:5.46-5+b1 Downloading dependency 316 of 392: libngtcp2-crypto-ossl0:riscv64=1.16.0-1 Downloading dependency 317 of 392: dpkg-dev:riscv64=1.23.5 Downloading dependency 318 of 392: init-system-helpers:riscv64=1.69 Downloading dependency 319 of 392: libsuperlu-dist-dev:riscv64=9.2.1+dfsg1-1 Downloading dependency 320 of 392: libxcb1-dev:riscv64=1.17.0-2+b2 Downloading dependency 321 of 392: chrpath:riscv64=0.18-1 Downloading dependency 322 of 392: ncurses-bin:riscv64=6.6+20251231-1 Downloading dependency 323 of 392: python3-magic:riscv64=2:0.4.27-3 Downloading dependency 324 of 392: libibumad3:riscv64=61.0-2 Downloading dependency 325 of 392: libpetsc64-complex3.24:riscv64=3.24.3+dfsg1-1 Downloading dependency 326 of 392: librbio4:riscv64=1:7.12.1+dfsg-1 Downloading dependency 327 of 392: libgssapi-krb5-2:riscv64=1.22.1-2 Downloading dependency 328 of 392: libaudit-common:riscv64=1:4.1.2-1 Downloading dependency 329 of 392: bsdextrautils:riscv64=2.41.3-3 Downloading dependency 330 of 392: po-debconf:riscv64=1.0.22 Downloading dependency 331 of 392: xorg-sgml-doctools:riscv64=1:1.11-1.1 Downloading dependency 332 of 392: libaudit1:riscv64=1:4.1.2-1+b1 Downloading dependency 333 of 392: gettext-base:riscv64=0.23.2-1 Downloading dependency 334 of 392: libgnutls-dane0t64:riscv64=3.8.11-3 Downloading dependency 335 of 392: sysvinit-utils:riscv64=3.15-6 Downloading dependency 336 of 392: libperl5.40:riscv64=5.40.1-7 Downloading dependency 337 of 392: 
libgcc-s1:riscv64=15.2.0-12 Downloading dependency 338 of 392: libidn2-0:riscv64=2.3.8-4+b1 Downloading dependency 339 of 392: libpetsc3.24-dev-examples:riscv64=3.24.3+dfsg1-1 Downloading dependency 340 of 392: libnl-route-3-dev:riscv64=3.12.0-2 Downloading dependency 341 of 392: libopenblas64-dev:riscv64=0.3.30+ds-3+b1 Downloading dependency 342 of 392: cpp-riscv64-linux-gnu:riscv64=4:15.2.0-5 Downloading dependency 343 of 392: libjs-mathjax:riscv64=2.7.9+dfsg-1 Downloading dependency 344 of 392: libevent-pthreads-2.1-7t64:riscv64=2.1.12-stable-10+b2 Downloading dependency 345 of 392: libx11-data:riscv64=2:1.8.12-1 Downloading dependency 346 of 392: perl-modules-5.40:riscv64=5.40.1-7 Downloading dependency 347 of 392: gcc:riscv64=4:15.2.0-5 Downloading dependency 348 of 392: libscotch-64i-dev:riscv64=7.0.10-7 Downloading dependency 349 of 392: libmagic1t64:riscv64=1:5.46-5+b1 Downloading dependency 350 of 392: gfortran:riscv64=4:15.2.0-5 Downloading dependency 351 of 392: libcombblas2.0.0t64:riscv64=2.0.0-7 Downloading dependency 352 of 392: libpython3.13-minimal:riscv64=3.13.11-1+b1 Downloading dependency 353 of 392: gfortran-riscv64-linux-gnu:riscv64=4:15.2.0-5 Downloading dependency 354 of 392: libnghttp3-dev:riscv64=1.12.0-1 Downloading dependency 355 of 392: tar:riscv64=1.35+dfsg-3.1 Downloading dependency 356 of 392: libmumps-5.8:riscv64=5.8.1-2 Downloading dependency 357 of 392: libpciaccess0:riscv64=0.17-3+b4 Downloading dependency 358 of 392: libaec0:riscv64=1.1.4-2+b1 Downloading dependency 359 of 392: ocl-icd-opencl-dev:riscv64=2.3.4-1 Downloading dependency 360 of 392: openmpi-bin:riscv64=5.0.9-1 Downloading dependency 361 of 392: libarchive-zip-perl:riscv64=1.68-1 Downloading dependency 362 of 392: libyaml-0-2:riscv64=0.2.5-2+b1 Downloading dependency 363 of 392: libsz2:riscv64=1.1.4-2+b1 Downloading dependency 364 of 392: pkgconf-bin:riscv64=1.8.1-4+b1Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 pkgconf-bin riscv64 1.8.1-4+b1 [30.0 kB] Fetched 30.0 kB in 0s (573 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpwaofxrku/pkgconf-bin_1.8.1-4+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 dpkg riscv64 1.23.5 [1528 kB] Fetched 1528 kB in 0s (29.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplb0h9mf4/dpkg_1.23.5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpam-modules riscv64 1.7.0-5+b1 [176 kB] Fetched 176 kB in 0s (3091 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9_x5ducd/libpam-modules_1.7.0-5+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfuse3-4 riscv64 3.18.1-1 [101 kB] Fetched 101 kB in 0s (1034 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmprt0mdj99/libfuse3-4_3.18.1-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 g++-15-riscv64-linux-gnu riscv64 15.2.0-12 [15.8 MB] Fetched 15.8 MB in 0s (40.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9hxq3fty/g++-15-riscv64-linux-gnu_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 liblsan0 riscv64 15.2.0-12 [1326 kB] Fetched 1326 kB in 0s (15.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0ez5k_u4/liblsan0_15.2.0-12_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-dev riscv64 2.1.12-stable-10+b2 [571 kB] Fetched 571 kB in 0s (8852 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpr8jxnp1s/libevent-dev_2.1.12-stable-10+b2_riscv64.deb' Get:1 
http://deb.debian.org/debian unstable/main riscv64 gettext riscv64 0.23.2-1 [1684 kB] Fetched 1684 kB in 0s (18.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpd69aswpk/gettext_0.23.2-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libstdc++-15-dev riscv64 15.2.0-12 [6161 kB] Fetched 6161 kB in 0s (33.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpq5mkt9tz/libstdc++-15-dev_15.2.0-12_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libc6-dev riscv64 2.42-10+b1 [3438 kB] Fetched 3438 kB in 0s (26.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpp280nv59/libc6-dev_2.42-10+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libscalapack-mpi-dev riscv64 2.2.2-5 [6796 B] Fetched 6796 B in 0s (130 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5ut2vqgd/libscalapack-mpi-dev_2.2.2-5_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20260126T022502Z sid/main riscv64 libssl-dev riscv64 3.5.4-1+b1 [6298 kB] Fetched 6298 kB in 0s (22.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpw8g68brv/libssl-dev_3.5.4-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libumfpack6 riscv64 1:7.12.1+dfsg-1 [262 kB] Fetched 262 kB in 0s (4542 kB/s) dpkg-name: info: moved 'libumfpack6_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpozawak9x/libumfpack6_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 librdmacm1t64 riscv64 61.0-2 [74.9 kB] Fetched 74.9 kB in 0s (1387 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpm3ng67mw/librdmacm1t64_61.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libkeyutils1 riscv64 1.6.3-6+b1 [9672 B] Fetched 9672 B in 0s (490 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp62g7noxh/libkeyutils1_1.6.3-6+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libx11-6 riscv64 2:1.8.12-1+b1 [821 kB] Fetched 821 kB in 0s (11.8 MB/s) dpkg-name: info: moved 'libx11-6_2%3a1.8.12-1+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpq058gj6s/libx11-6_1.8.12-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 autoconf all 2.72-3.1 [494 kB] Fetched 494 kB in 0s (16.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8c5mw85e/autoconf_2.72-3.1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhypre-3.0.0 riscv64 3.0.0-5 [1746 kB] Fetched 1746 kB in 0s (17.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp6o78xyt0/libhypre-3.0.0_3.0.0-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpam0g riscv64 1.7.0-5+b1 [70.7 kB] Fetched 70.7 kB in 0s (3327 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp78xmqqq7/libpam0g_1.7.0-5+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libspex3 riscv64 1:7.12.1+dfsg-1 [69.2 kB] Fetched 69.2 kB in 0s (1262 kB/s) dpkg-name: info: moved 'libspex3_1%3a7.12.1+dfsg-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpq7w_65h6/libspex3_7.12.1+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 diffutils riscv64 1:3.12-1 [405 kB] Fetched 405 kB in 0s (14.3 MB/s) dpkg-name: info: moved 'diffutils_1%3a3.12-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpx39k6rdc/diffutils_3.12-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libselinux1 riscv64 3.9-4+b1 [89.1 kB] Fetched 89.1 kB in 0s (4175 kB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmpu_fsfk0a/libselinux1_3.9-4+b1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-openssl-2.1-7t64 riscv64 2.1.12-stable-10+b2 [60.6 kB]
Fetched 60.6 kB in 0s (1128 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplql4x21n/libevent-openssl-2.1-7t64_2.1.12-stable-10+b2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 hostname riscv64 3.25 [10.7 kB]
Fetched 10.7 kB in 0s (545 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmppkevybg1/hostname_3.25_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libkrb5-dev riscv64 1.22.1-2 [16.2 kB]
Fetched 16.2 kB in 0s (309 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpfba0f_kw/libkrb5-dev_1.22.1-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhypre64-3.0.0 riscv64 3.0.0-5 [1677 kB]
Fetched 1677 kB in 0s (18.9 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp11l1wuo7/libhypre64-3.0.0_3.0.0-5_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-hl-fortran-310 riscv64 1.14.6+repack-2 [39.0 kB]
Fetched 39.0 kB in 0s (740 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmprl_nnwep/libhdf5-openmpi-hl-fortran-310_1.14.6+repack-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libpetsc64-real3.24-dev riscv64 3.24.3+dfsg1-1 [13.6 MB]
Fetched 13.6 MB in 1s (25.4 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3mxh3o5b/libpetsc64-real3.24-dev_3.24.3+dfsg1-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 bash riscv64 5.3-1 [1560 kB]
Fetched 1560 kB in 0s (29.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpzq7rg0h4/bash_5.3-1_riscv64.deb'
dpkg-buildpackage: info: source package debootsnap-dummy
dpkg-buildpackage: info: source version 1.0
dpkg-buildpackage: info: source distribution unstable
dpkg-buildpackage: info: source changed by Equivs Dummy Package Generator
dpkg-source --before-build .
dpkg-buildpackage: info: host architecture riscv64
debian/rules clean
dh clean
   dh_clean
debian/rules binary
dh binary
   dh_update_autotools_config
   dh_autoreconf
   create-stamp debian/debhelper-build-stamp
   dh_prep
   dh_auto_install --destdir=debian/debootsnap-dummy/
   dh_install
   dh_installdocs
   dh_installchangelogs
   dh_perl
   dh_link
   dh_strip_nondeterminism
   dh_compress
   dh_fixperms
   dh_missing
   dh_installdeb
   dh_gencontrol
   dh_md5sums
   dh_builddeb
dpkg-deb: building package 'debootsnap-dummy' in '../debootsnap-dummy_1.0_all.deb'.
dpkg-genbuildinfo --build=binary -O../debootsnap-dummy_1.0_riscv64.buildinfo
dpkg-genchanges --build=binary -O../debootsnap-dummy_1.0_riscv64.changes
dpkg-genchanges: info: binary-only upload (no source code included)
dpkg-source --after-build .
dpkg-buildpackage: info: binary-only upload (no source included)
The package has been created.
Attention, the package has been created in the /srv/rebuilderd/tmp/tmphu8z1ior/cache directory, not in ".." as indicated by the message above!
I: automatically chosen mode: unshare I: chroot architecture riscv64 is equal to the host's architecture I: using /srv/rebuilderd/tmp/mmdebstrap.bXeUMqghaW as tempdir I: running --setup-hook directly: /usr/share/mmdebstrap/hooks/maybe-merged-usr/setup00.sh /srv/rebuilderd/tmp/mmdebstrap.bXeUMqghaW 127.0.0.1 - - [31/Jan/2026 20:57:05] code 404, message File not found 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./InRelease HTTP/1.1" 404 - Ign:1 http://localhost:38093 ./ InRelease 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./Release HTTP/1.1" 200 - Get:2 http://localhost:38093 ./ Release [462 B] 127.0.0.1 - - [31/Jan/2026 20:57:05] code 404, message File not found 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./Release.gpg HTTP/1.1" 404 - Ign:3 http://localhost:38093 ./ Release.gpg 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./Packages HTTP/1.1" 200 - Get:4 http://localhost:38093 ./ Packages [502 kB] Fetched 503 kB in 0s (9130 kB/s) Reading package lists... usr-is-merged found but not real -- not running merged-usr setup hook I: skipping apt-get update because it was already run I: downloading packages with apt... 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./gcc-15-base_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libc-gconv-modules-extra_2.42-10%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libc6_2.42-10%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libgcc-s1_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./mawk_1.3.4.20250131-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./base-files_14_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libtinfo6_6.6%2b20251231-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./debianutils_5.23.2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./bash_5.3-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libacl1_2.3.2-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libattr1_2.5.2-3%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libcap2_2.75-10%2bb5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libgmp10_6.3.0%2bdfsg-5%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libpcre2-8-0_10.46-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libselinux1_3.9-4%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libzstd1_1.5.7%2bdfsg-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./zlib1g_1.3.dfsg%2breally1.3.1-1%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libssl3t64_3.5.4-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./openssl-provider-legacy_3.5.4-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libsystemd0_259-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./coreutils_9.7-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./dash_0.5.12-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./diffutils_3.12-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libbz2-1.0_1.0.8-6%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./liblzma5_5.8.2-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET 
/./libmd0_1.1.0-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./tar_1.35%2bdfsg-3.1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./dpkg_1.23.5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./findutils_4.10.0-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./grep_3.12-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./gzip_1.13-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./hostname_3.25_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./ncurses-bin_6.6%2b20251231-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./libcrypt1_4.5.1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:05] "GET /./perl-base_5.40.1-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./sed_4.9-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libaudit-common_4.1.2-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libcap-ng0_0.8.5-4%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libaudit1_4.1.2-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libdb5.3t64_5.3.28%2bdfsg2-11_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./debconf_1.5.91_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libpam0g_1.7.0-5%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libpam-modules-bin_1.7.0-5%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libpam-modules_1.7.0-5%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libpam-runtime_1.7.0-5_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libblkid1_2.41.3-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libmount1_2.41.3-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libsmartcols1_2.41.3-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libudev1_259-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libuuid1_2.41.3-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./util-linux_2.41.3-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libdebconfclient0_0.282%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./base-passwd_3.6.8_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./init-system-helpers_1.69_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./libc-bin_2.42-10%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./ncurses-base_6.6%2b20251231-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:06] "GET /./sysvinit-utils_3.15-6_riscv64.deb HTTP/1.1" 200 - I: extracting archives... 
I: running --extract-hook directly: /usr/share/mmdebstrap/hooks/maybe-merged-usr/extract00.sh /srv/rebuilderd/tmp/mmdebstrap.bXeUMqghaW 127.0.0.1 - - [31/Jan/2026 20:57:09] code 404, message File not found 127.0.0.1 - - [31/Jan/2026 20:57:09] "GET /./InRelease HTTP/1.1" 404 - Ign:1 http://localhost:38093 ./ InRelease 127.0.0.1 - - [31/Jan/2026 20:57:09] "GET /./Release HTTP/1.1" 304 - Hit:2 http://localhost:38093 ./ Release 127.0.0.1 - - [31/Jan/2026 20:57:09] code 404, message File not found 127.0.0.1 - - [31/Jan/2026 20:57:09] "GET /./Release.gpg HTTP/1.1" 404 - Ign:3 http://localhost:38093 ./ Release.gpg Reading package lists... usr-is-merged found but not real -- not running merged-usr extract hook I: installing essential packages... I: running --essential-hook directly: /usr/share/mmdebstrap/hooks/maybe-merged-usr/essential00.sh /srv/rebuilderd/tmp/mmdebstrap.bXeUMqghaW usr-is-merged was not installed in a previous hook -- not running merged-usr essential hook I: installing remaining packages inside the chroot... 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./libexpat1_2.7.3-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./libpython3.13-minimal_3.13.11-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./python3.13-minimal_3.13.11-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./python3-minimal_3.13.9-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./media-types_14.0.0_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./netbase_6.5_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./tzdata_2025c-3_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./libffi8_3.5.2-3%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./libncursesw6_6.6%2b20251231-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./readline-common_8.3-3_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./libreadline8t64_8.3-3%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./libsqlite3-0_3.46.1-9_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./libpython3.13-stdlib_3.13.11-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./python3.13_3.13.11-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./libpython3-stdlib_3.13.9-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./python3_3.13.9-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./login.defs_4.19.0-4_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./libbsd0_0.12.2-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./libsemanage-common_3.9-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:24] "GET /./libsepol2_3.9-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libsemanage2_3.9-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./passwd_4.19.0-4_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./sensible-utils_0.0.26_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./adduser_3.154_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libstdc%2b%2b6_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libuchardet0_0.0.8-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET 
/./groff-base_1.23.0-10_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./bsdextrautils_2.41.3-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libgdbm6t64_1.26-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libpipeline1_1.5.8-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./man-db_2.13.1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./bzip2_1.0.8-6%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libmagic-mgc_5.46-5%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libmagic1t64_5.46-5%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./file_5.46-5%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./gettext-base_0.23.2-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libedit2_3.1-20251016-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libcbor0.10_0.10.2-2.1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libfido2-1_1.16.0-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libkrb5support0_1.22.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libcom-err2_1.47.2-3%2bb8_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libk5crypto3_1.22.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libkeyutils1_1.6.3-6%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libkrb5-3_1.22.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libgssapi-krb5-2_1.22.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./openssh-client_10.2p1-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./perl-modules-5.40_5.40.1-7_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libgdbm-compat4t64_1.26-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libperl5.40_5.40.1-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./perl_5.40.1-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./xz-utils_5.8.2-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./m4_1.4.20-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./autoconf_2.72-3.1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./autotools-dev_20240727.1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./automake_1.18.1-3_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./autopoint_0.23.2-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libsframe3_2.45.50.20260119-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./binutils-common_2.45.50.20260119-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libbinutils_2.45.50.20260119-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libgprofng0_2.45.50.20260119-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libctf-nobfd0_2.45.50.20260119-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libctf0_2.45.50.20260119-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libjansson4_2.14-2%2bb4_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET 
/./binutils-riscv64-linux-gnu_2.45.50.20260119-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./binutils_2.45.50.20260119-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libc-dev-bin_2.42-10%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./linux-libc-dev_6.18.5-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./rpcsvc-proto_1.4.3-1%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:25] "GET /./libc6-dev_2.42-10%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./libisl23_0.27-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./libmpfr6_4.2.2-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./libmpc3_1.3.1-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./cpp-15-riscv64-linux-gnu_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./cpp-15_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./cpp-riscv64-linux-gnu_15.2.0-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./cpp_15.2.0-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./libcc1-0_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./libgomp1_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./libitm1_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./libatomic1_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./libasan8_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./liblsan0_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./libtsan2_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./libubsan1_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:26] "GET /./libgcc-15-dev_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:27] "GET /./gcc-15-riscv64-linux-gnu_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./gcc-15_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./gcc-riscv64-linux-gnu_15.2.0-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./gcc_15.2.0-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./libstdc%2b%2b-15-dev_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./g%2b%2b-15-riscv64-linux-gnu_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./g%2b%2b-15_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./g%2b%2b-riscv64-linux-gnu_15.2.0-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./g%2b%2b_15.2.0-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./make_4.4.1-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./libdpkg-perl_1.23.5_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./patch_2.8-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./dpkg-dev_1.23.5_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./build-essential_12.12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./chrpath_0.18-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET 
/./comerr-dev_2.1-1.47.2-3%2bb8_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./libdebhelper-perl_13.29_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./libtool_2.5.4-9_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./dh-autoreconf_21_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./libarchive-zip-perl_1.68-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./libfile-stripnondeterminism-perl_1.15.0-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./dh-strip-nondeterminism_1.15.0-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./libelf1t64_0.194-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./dwz_0.16-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./libunistring5_1.3-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./libxml2-16_2.15.1%2bdfsg-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:28] "GET /./gettext_0.23.2-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./intltool-debian_0.35.0%2b20060710.6_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./po-debconf_1.0.22_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./debhelper_13.29_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libblas3_3.12.1-7%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libgfortran5_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./liblapack3_3.12.1-7%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libarpack2t64_3.9.1-6%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libevent-core-2.1-7t64_2.1.12-stable-10%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libevent-pthreads-2.1-7t64_2.1.12-stable-10%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libnl-3-200_3.12.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libnl-route-3-200_3.12.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libibverbs1_61.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./ibverbs-providers_61.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./librdmacm1t64_61.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libfabric1_2.1.0-1.1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libhwloc15_2.12.2-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libmunge2_0.5.16-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libpciaccess0_0.17-3%2bb4_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libxau6_1.0.11-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libxdmcp6_1.1.5-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libxcb1_1.17.0-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libx11-data_1.8.12-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libx11-6_1.8.12-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libxext6_1.3.4-1%2bb4_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libxnvctrl0_535.171.04-1%2bb3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 
- - [31/Jan/2026 20:57:29] "GET /./ocl-icd-libopencl1_2.3.4-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libhwloc-plugins_2.12.2-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libpmix2t64_6.0.0%2breally5.0.9-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libfuse3-4_3.18.1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libibumad3_61.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libibmad5_61.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libucx0_1.20.0%2bds-4_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libopenmpi40_5.0.9-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libparpack2t64_3.9.1-6%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libbrotli1_1.1.0-2%2bb9_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libbrotli-dev_1.1.0-2%2bb9_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libsuitesparseconfig7_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libcxsparse4_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libblas64-3_3.12.1-7%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libopenblas64-0-pthread_0.3.30%2bds-3%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./liblapack64-3_3.12.1-7%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libcombblas2.0.0t64_2.0.0-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libmetis5_5.1.0.dfsg-8_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libptscotcherr-7.0_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libscotcherr-7.0_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libscotch-7.0c_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libptscotch-7.0c_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libsuperlu-dist9_9.2.1%2bdfsg1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libhypre64-3.0.0_3.0.0-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libopenblas64-0_0.3.30%2bds-3%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:29] "GET /./libopenblas64-pthread-dev_0.3.30%2bds-3%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libopenblas64-dev_0.3.30%2bds-3%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libsuperlu7_7.0.1%2bdfsg1-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libblas-dev_3.12.1-7%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libsuperlu-dev_7.0.1%2bdfsg1-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libgfortran-15-dev_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./gfortran-15-riscv64-linux-gnu_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./gfortran-15_15.2.0-12_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./gfortran-riscv64-linux-gnu_15.2.0-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./gfortran_15.2.0-5_riscv64.deb HTTP/1.1" 200 - 
127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./openmpi-common_5.0.9-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libnl-3-dev_3.12.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libnl-route-3-dev_3.12.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libibverbs-dev_61.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libnuma1_2.0.19-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libnuma-dev_2.0.19-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libltdl7_2.5.4-9_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libltdl-dev_2.5.4-9_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libhwloc-dev_2.12.2-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libevent-2.1-7t64_2.1.12-stable-10%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libevent-extra-2.1-7t64_2.1.12-stable-10%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libevent-openssl-2.1-7t64_2.1.12-stable-10%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libevent-dev_2.1.12-stable-10%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libjs-jquery_3.7.1%2bdfsg%2b%7e3.5.33-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libjs-jquery-ui_1.13.2%2bdfsg-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./zlib1g-dev_1.3.dfsg%2breally1.3.1-1%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./openmpi-bin_5.0.9-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libopenmpi-dev_5.0.9-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./mpi-default-dev_1.20_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:30] "GET /./libhypre64-dev_3.0.0-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libcamd3_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libfortran-toml-0_0.4.3-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libfortran-jonquil-0_0.3.0-4_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./fortran-fpm_0.12.0-6_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./liblapack-dev_3.12.1-7%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libarpack2-dev_3.9.1-6%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libscotch-64i-7.0_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libptscotch-64i-7.0_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libnettle8t64_3.10.2-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libhogweed6t64_3.10.2-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libidn2-0_2.3.8-4%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libp11-kit0_0.25.10-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libtasn1-6_4.21.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libgnutls30t64_3.8.11-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./librtmp1_2.4%2b20151223.gitfa8646d.1-3%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] 
"GET /./fonts-mathjax_2.7.9%2bdfsg-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./xorg-sgml-doctools_1.11-1.1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./x11proto-dev_2024.1-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libxdmcp-dev_1.1.5-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libxau-dev_1.0.11-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./dh-python_7.20260125_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libjpeg62-turbo_2.1.5-4_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libjpeg62-turbo-dev_2.1.5-4_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libjpeg-dev_2.1.5-4_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libbtf2_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libamd3_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libccolamd3_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libcolamd3_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libcholmod5_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libklu2_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libpkgconf3_1.8.1-4%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libgssrpc4t64_1.22.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libkdb5-10t64_1.22.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:31] "GET /./libjs-mathjax_2.7.9%2bdfsg-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libldl3_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libsuitesparse-mongoose3_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libumfpack6_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./librbio4_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libspqr4_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libspex3_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libparu1_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libsuitesparse-dev_7.12.1%2bdfsg-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libptscotcherr-dev_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libscotcherr-dev_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libscotch-dev_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libptscotch-dev_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./mpi-default-bin_1.20_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libscalapack-openmpi2.2_2.2.2-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libscalapack-openmpi-dev_2.2.2-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libscalapack-mpi-dev_2.2.2-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libfftw3-double3_3.3.10-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 
20:57:32] "GET /./libfftw3-long3_3.3.10-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libfftw3-single3_3.3.10-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libfftw3-bin_3.3.10-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libfftw3-dev_3.3.10-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libfftw3-mpi3_3.3.10-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libfftw3-mpi-dev_3.3.10-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:32] "GET /./libssl-dev_3.5.4-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./opencl-c-headers_3.0%7e2025.07.22-2_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./opencl-clhpp-headers_3.0%7e2025.07.22-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./ocl-icd-opencl-dev_2.3.4-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libnghttp3-9_1.12.0-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libngtcp2-16_1.16.0-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libsasl2-modules-db_2.1.28%2bdfsg1-10_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libsasl2-2_2.1.28%2bdfsg1-10_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libldap2_2.6.10%2bdfsg-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libnghttp2-14_1.64.0-1.1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libngtcp2-crypto-ossl0_1.16.0-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libpsl5t64_0.21.2-1.1%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libssh2-1t64_1.11.1-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libcurl4t64_8.18.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libaec0_1.1.4-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libsz2_1.1.4-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libhdf5-openmpi-310_1.14.6%2brepack-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libhdf5-openmpi-fortran-310_1.14.6%2brepack-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libhdf5-openmpi-hl-310_1.14.6%2brepack-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libhdf5-openmpi-hl-fortran-310_1.14.6%2brepack-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libhdf5-openmpi-cpp-310_1.14.6%2brepack-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libhdf5-openmpi-hl-cpp-310_1.14.6%2brepack-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libaec-dev_1.1.4-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libidn2-dev_2.3.8-4%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libkadm5clnt-mit12_1.22.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libkadm5srv-mit12_1.22.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./krb5-multidev_1.22.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libkrb5-dev_1.22.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET 
/./libldap-dev_2.6.10%2bdfsg-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./pkgconf-bin_1.8.1-4%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./pkgconf_1.8.1-4%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libnghttp2-dev_1.64.0-1.1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libnghttp3-dev_1.12.0-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libngtcp2-crypto-ossl-dev_1.16.0-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libngtcp2-dev_1.16.0-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libpsl-dev_0.21.2-1.1%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libgmpxx4ldbl_6.3.0%2bdfsg-5%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libgmp-dev_6.3.0%2bdfsg-5%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libunbound8_1.24.2-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libgnutls-dane0t64_3.8.11-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libgnutls-openssl27t64_3.8.11-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libp11-kit-dev_0.25.10-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libtasn1-6-dev_4.21.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./nettle-dev_3.10.2-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libgnutls28-dev_3.8.11-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./librtmp-dev_2.4%2b20151223.gitfa8646d.1-3%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libssh2-1-dev_1.11.1-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libzstd-dev_1.5.7%2bdfsg-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libcurl4-openssl-dev_8.18.0-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libhdf5-openmpi-dev_1.14.6%2brepack-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libhdf5-mpi-dev_1.14.6%2brepack-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:33] "GET /./libsuperlu-dist-dev_9.2.1%2bdfsg1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./xtrans-dev_1.6.0-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libxcb1-dev_1.17.0-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libx11-dev_1.8.12-1%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libyaml-0-2_0.2.5-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libyaml-dev_0.2.5-2%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libpetsc3.24-dev-common_3.24.3%2bdfsg1-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libhypre-3.0.0_3.0.0-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libmumps-5.8_5.8.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libpetsc-real3.24_3.24.3%2bdfsg1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libscotch-64-7.0_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libptscotch-64-7.0_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET 
/./libhypre-dev_3.0.0-5_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libmumps-headers-dev_5.8.1-2_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libmumps-64pord-5.8_5.8.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:34] "GET /./libmumps64-dev_5.8.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./python3-magic_0.4.27-3_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./python3-click_8.2.0%2b0.really.8.1.8-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./patchelf_0.18.0-1.4_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./libfile-libmagic-perl_1.23-2%2bb2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./dh-fortran_0.63_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./libparpack2-dev_3.9.1-6%2bb1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./libscotch-64-dev_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./libptscotch-64-dev_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./libscotch-64i-dev_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./libptscotch-64i-dev_7.0.10-7_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./libpetsc64-real3.24_3.24.3%2bdfsg1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./libmumps-dev_5.8.1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:35] "GET /./libpetsc-real3.24-dev_3.24.3%2bdfsg1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:36] "GET /./libpetsc-complex3.24_3.24.3%2bdfsg1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:36] "GET /./libpetsc-complex3.24-dev_3.24.3%2bdfsg1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:36] "GET /./libpetsc64-complex3.24_3.24.3%2bdfsg1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:36] "GET /./libpetsc64-complex3.24-dev_3.24.3%2bdfsg1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:37] "GET /./libpetsc3.24-dev-examples_3.24.3%2bdfsg1-1_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:37] "GET /./libpetsc64-real3.24-dev_3.24.3%2bdfsg1-1_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [31/Jan/2026 20:57:37] "GET /./debootsnap-dummy_1.0_all.deb HTTP/1.1" 200 - I: running --customize-hook directly: /srv/rebuilderd/tmp/tmphu8z1ior/apt_install.sh /srv/rebuilderd/tmp/mmdebstrap.bXeUMqghaW Reading package lists... Building dependency tree... Reading state information... openssl-provider-legacy is already the newest version (3.5.4-1+b1). mawk is already the newest version (1.3.4.20250131-2). libparpack2t64 is already the newest version (3.9.1-6+b1). libparpack2t64 set to manually installed. libxdmcp6 is already the newest version (1:1.1.5-2). libxdmcp6 set to manually installed. libzstd1 is already the newest version (1.5.7+dfsg-3). libbrotli-dev is already the newest version (1.1.0-2+b9). libbrotli-dev set to manually installed. libcxsparse4 is already the newest version (1:7.12.1+dfsg-1). libcxsparse4 set to manually installed. libxcb1 is already the newest version (1.17.0-2+b2). libxcb1 set to manually installed. libhypre64-dev is already the newest version (3.0.0-5). libhypre64-dev set to manually installed. libcom-err2 is already the newest version (1.47.2-3+b8). libcom-err2 set to manually installed. libcamd3 is already the newest version (1:7.12.1+dfsg-1). 
libcamd3 set to manually installed. libucx0 is already the newest version (1.20.0+ds-4). libucx0 set to manually installed. libgmp10 is already the newest version (2:6.3.0+dfsg-5+b1). libcbor0.10 is already the newest version (0.10.2-2.1). libcbor0.10 set to manually installed. libfortran-toml-0 is already the newest version (0.4.3-2). libfortran-toml-0 set to manually installed. fortran-fpm is already the newest version (0.12.0-6). fortran-fpm set to manually installed. libarpack2-dev is already the newest version (3.9.1-6+b1). libarpack2-dev set to manually installed. libmunge2 is already the newest version (0.5.16-1+b1). libmunge2 set to manually installed. dh-autoreconf is already the newest version (21). dh-autoreconf set to manually installed. comerr-dev is already the newest version (2.1-1.47.2-3+b8). comerr-dev set to manually installed. libptscotch-64i-7.0 is already the newest version (7.0.10-7). libptscotch-64i-7.0 set to manually installed. librtmp1 is already the newest version (2.4+20151223.gitfa8646d.1-3+b1). librtmp1 set to manually installed. gcc-15-base is already the newest version (15.2.0-12). fonts-mathjax is already the newest version (2.7.9+dfsg-1). fonts-mathjax set to manually installed. libatomic1 is already the newest version (15.2.0-12). libatomic1 set to manually installed. libxdmcp-dev is already the newest version (1:1.1.5-2). libxdmcp-dev set to manually installed. libssl3t64 is already the newest version (3.5.4-1+b1). libxau-dev is already the newest version (1:1.0.11-1+b1). libxau-dev set to manually installed. dh-python is already the newest version (7.20260125). dh-python set to manually installed. libxau6 is already the newest version (1:1.0.11-1+b1). libxau6 set to manually installed. libpipeline1 is already the newest version (1.5.8-2). libpipeline1 set to manually installed. libjpeg-dev is already the newest version (1:2.1.5-4). libjpeg-dev set to manually installed. libbtf2 is already the newest version (1:7.12.1+dfsg-1). libbtf2 set to manually installed. libklu2 is already the newest version (1:7.12.1+dfsg-1). libklu2 set to manually installed. libxnvctrl0 is already the newest version (535.171.04-1+b3). libxnvctrl0 set to manually installed. libltdl-dev is already the newest version (2.5.4-9). libltdl-dev set to manually installed. libattr1 is already the newest version (1:2.5.2-3+b1). debconf is already the newest version (1.5.91). dash is already the newest version (0.5.12-12). libopenmpi40 is already the newest version (5.0.9-1). libopenmpi40 set to manually installed. gfortran-15-riscv64-linux-gnu is already the newest version (15.2.0-12). gfortran-15-riscv64-linux-gnu set to manually installed. libk5crypto3 is already the newest version (1.22.1-2). libk5crypto3 set to manually installed. libpkgconf3 is already the newest version (1.8.1-4+b1). libpkgconf3 set to manually installed. libkdb5-10t64 is already the newest version (1.22.1-2). libkdb5-10t64 set to manually installed. libpetsc3.24-dev-common is already the newest version (3.24.3+dfsg1-1). libpetsc3.24-dev-common set to manually installed. zlib1g-dev is already the newest version (1:1.3.dfsg+really1.3.1-1+b2). zlib1g-dev set to manually installed. libnghttp2-dev is already the newest version (1.64.0-1.1+b1). libnghttp2-dev set to manually installed. libscalapack-openmpi2.2 is already the newest version (2.2.2-5). libscalapack-openmpi2.2 set to manually installed. libgdbm6t64 is already the newest version (1.26-1+b1). libgdbm6t64 set to manually installed. 
libkrb5support0 is already the newest version (1.22.1-2). libkrb5support0 set to manually installed. libcurl4t64 is already the newest version (8.18.0-2). libcurl4t64 set to manually installed. libpetsc-real3.24 is already the newest version (3.24.3+dfsg1-1). libpetsc-real3.24 set to manually installed. libblas-dev is already the newest version (3.12.1-7+b1). libblas-dev set to manually installed. libngtcp2-dev is already the newest version (1.16.0-1). libngtcp2-dev set to manually installed. libptscotch-64-7.0 is already the newest version (7.0.10-7). libptscotch-64-7.0 set to manually installed. libopenblas64-0-pthread is already the newest version (0.3.30+ds-3+b1). libopenblas64-0-pthread set to manually installed. libgmpxx4ldbl is already the newest version (2:6.3.0+dfsg-5+b1). libgmpxx4ldbl set to manually installed. libhypre-dev is already the newest version (3.0.0-5). libhypre-dev set to manually installed. opencl-clhpp-headers is already the newest version (3.0~2025.07.22-1). opencl-clhpp-headers set to manually installed. libfftw3-double3 is already the newest version (3.3.10-2+b2). libfftw3-double3 set to manually installed. libmumps64-dev is already the newest version (5.8.1-2). libmumps64-dev set to manually installed. libhdf5-openmpi-cpp-310 is already the newest version (1.14.6+repack-2). libhdf5-openmpi-cpp-310 set to manually installed. libyaml-dev is already the newest version (0.2.5-2+b1). libyaml-dev set to manually installed. base-files is already the newest version (14). libunbound8 is already the newest version (1.24.2-1). libunbound8 set to manually installed. libsasl2-2 is already the newest version (2.1.28+dfsg1-10). libsasl2-2 set to manually installed. build-essential is already the newest version (12.12). build-essential set to manually installed. libpam-modules-bin is already the newest version (1.7.0-5+b1). libsmartcols1 is already the newest version (2.41.3-3). dh-fortran is already the newest version (0.63). dh-fortran set to manually installed. libnl-3-200 is already the newest version (3.12.0-2). libnl-3-200 set to manually installed. libc-bin is already the newest version (2.42-10+b1). libparpack2-dev is already the newest version (3.9.1-6+b1). libparpack2-dev set to manually installed. zlib1g is already the newest version (1:1.3.dfsg+really1.3.1-1+b2). openmpi-common is already the newest version (5.0.9-1). openmpi-common set to manually installed. libgprofng0 is already the newest version (2.45.50.20260119-1). libgprofng0 set to manually installed. libscotch-dev is already the newest version (7.0.10-7). libscotch-dev set to manually installed. sed is already the newest version (4.9-2). libbinutils is already the newest version (2.45.50.20260119-1). libbinutils set to manually installed. base-passwd is already the newest version (3.6.8). libibverbs1 is already the newest version (61.0-2). libibverbs1 set to manually installed. libisl23 is already the newest version (0.27-1+b1). libisl23 set to manually installed. libblas64-3 is already the newest version (3.12.1-7+b1). libblas64-3 set to manually installed. libmpc3 is already the newest version (1.3.1-2+b1). libmpc3 set to manually installed. libnl-route-3-200 is already the newest version (3.12.0-2). libnl-route-3-200 set to manually installed. libsuperlu-dist9 is already the newest version (9.2.1+dfsg1-1). libsuperlu-dist9 set to manually installed. libnl-3-dev is already the newest version (3.12.0-2). libnl-3-dev set to manually installed. libkadm5srv-mit12 is already the newest version (1.22.1-2). 
libkadm5srv-mit12 set to manually installed. login.defs is already the newest version (1:4.19.0-4). login.defs set to manually installed. x11proto-dev is already the newest version (2024.1-1). x11proto-dev set to manually installed. intltool-debian is already the newest version (0.35.0+20060710.6). intltool-debian set to manually installed. openssh-client is already the newest version (1:10.2p1-3). openssh-client set to manually installed. libubsan1 is already the newest version (15.2.0-12). libubsan1 set to manually installed. libcrypt1 is already the newest version (1:4.5.1-1). gcc-15-riscv64-linux-gnu is already the newest version (15.2.0-12). gcc-15-riscv64-linux-gnu set to manually installed. libmpfr6 is already the newest version (4.2.2-2+b1). libmpfr6 set to manually installed. libaec-dev is already the newest version (1.1.4-2+b1). libaec-dev set to manually installed. opencl-c-headers is already the newest version (3.0~2025.07.22-2). opencl-c-headers set to manually installed. libscalapack-openmpi-dev is already the newest version (2.2.2-5). libscalapack-openmpi-dev set to manually installed. gcc-riscv64-linux-gnu is already the newest version (4:15.2.0-5). gcc-riscv64-linux-gnu set to manually installed. libptscotch-64-dev is already the newest version (7.0.10-7). libptscotch-64-dev set to manually installed. libhdf5-openmpi-310 is already the newest version (1.14.6+repack-2). libhdf5-openmpi-310 set to manually installed. libdebconfclient0 is already the newest version (0.282+b2). python3-click is already the newest version (8.2.0+0.really.8.1.8-1). python3-click set to manually installed. libscotch-7.0c is already the newest version (7.0.10-7). libscotch-7.0c set to manually installed. perl is already the newest version (5.40.1-7). perl set to manually installed. libcap2 is already the newest version (1:2.75-10+b5). sensible-utils is already the newest version (0.0.26). sensible-utils set to manually installed. libscotch-64-7.0 is already the newest version (7.0.10-7). libscotch-64-7.0 set to manually installed. libarpack2t64 is already the newest version (3.9.1-6+b1). libarpack2t64 set to manually installed. libc-dev-bin is already the newest version (2.42-10+b1). libc-dev-bin set to manually installed. libscotcherr-dev is already the newest version (7.0.10-7). libscotcherr-dev set to manually installed. pkgconf is already the newest version (1.8.1-4+b1). pkgconf set to manually installed. libbsd0 is already the newest version (0.12.2-2+b1). libbsd0 set to manually installed. libltdl7 is already the newest version (2.5.4-9). libltdl7 set to manually installed. python3 is already the newest version (3.13.9-3). python3 set to manually installed. libhwloc-plugins is already the newest version (2.12.2-1+b1). libhwloc-plugins set to manually installed. libunistring5 is already the newest version (1.3-2+b1). libunistring5 set to manually installed. libc-gconv-modules-extra is already the newest version (2.42-10+b1). libjansson4 is already the newest version (2.14-2+b4). libjansson4 set to manually installed. libgfortran5 is already the newest version (15.2.0-12). libgfortran5 set to manually installed. cpp is already the newest version (4:15.2.0-5). cpp set to manually installed. libptscotch-dev is already the newest version (7.0.10-7). libptscotch-dev set to manually installed. libssh2-1-dev is already the newest version (1.11.1-1+b1). libssh2-1-dev set to manually installed. libtool is already the newest version (2.5.4-9). libtool set to manually installed. 
ncurses-base is already the newest version (6.6+20251231-1). patchelf is already the newest version (0.18.0-1.4). patchelf set to manually installed. perl-base is already the newest version (5.40.1-7). binutils is already the newest version (2.45.50.20260119-1). binutils set to manually installed. libscotcherr-7.0 is already the newest version (7.0.10-7). libscotcherr-7.0 set to manually installed. libgnutls-openssl27t64 is already the newest version (3.8.11-3). libgnutls-openssl27t64 set to manually installed. libfftw3-dev is already the newest version (3.3.10-2+b2). libfftw3-dev set to manually installed. libparu1 is already the newest version (1:7.12.1+dfsg-1). libparu1 set to manually installed. libpsl5t64 is already the newest version (0.21.2-1.1+b2). libpsl5t64 set to manually installed. libudev1 is already the newest version (259-1). libptscotch-64i-dev is already the newest version (7.0.10-7). libptscotch-64i-dev set to manually installed. libfftw3-single3 is already the newest version (3.3.10-2+b2). libfftw3-single3 set to manually installed. bzip2 is already the newest version (1.0.8-6+b1). bzip2 set to manually installed. libtasn1-6 is already the newest version (4.21.0-2). libtasn1-6 set to manually installed. libpmix2t64 is already the newest version (6.0.0+really5.0.9-3). libpmix2t64 set to manually installed. libcurl4-openssl-dev is already the newest version (8.18.0-2). libcurl4-openssl-dev set to manually installed. libpetsc64-real3.24 is already the newest version (3.24.3+dfsg1-1). libpetsc64-real3.24 set to manually installed. libnghttp3-9 is already the newest version (1.12.0-1). libnghttp3-9 set to manually installed. coreutils is already the newest version (9.7-3). patch is already the newest version (2.8-2). patch set to manually installed. g++ is already the newest version (4:15.2.0-5). g++ set to manually installed. libnuma1 is already the newest version (2.0.19-1+b1). libnuma1 set to manually installed. libmumps-64pord-5.8 is already the newest version (5.8.1-2). libmumps-64pord-5.8 set to manually installed. libp11-kit-dev is already the newest version (0.25.10-1+b1). libp11-kit-dev set to manually installed. libfftw3-mpi-dev is already the newest version (3.3.10-2+b2). libfftw3-mpi-dev set to manually installed. libscotch-64-dev is already the newest version (7.0.10-7). libscotch-64-dev set to manually installed. libhwloc15 is already the newest version (2.12.2-1+b1). libhwloc15 set to manually installed. libffi8 is already the newest version (3.5.2-3+b1). libffi8 set to manually installed. readline-common is already the newest version (8.3-3). readline-common set to manually installed. grep is already the newest version (3.12-1). libncursesw6 is already the newest version (6.6+20251231-1). libncursesw6 set to manually installed. ibverbs-providers is already the newest version (61.0-2). ibverbs-providers set to manually installed. libkadm5clnt-mit12 is already the newest version (1.22.1-2). libkadm5clnt-mit12 set to manually installed. python3-minimal is already the newest version (3.13.9-3). python3-minimal set to manually installed. libhdf5-openmpi-hl-cpp-310 is already the newest version (1.14.6+repack-2). libhdf5-openmpi-hl-cpp-310 set to manually installed. libmount1 is already the newest version (2.41.3-3). debianutils is already the newest version (5.23.2). python3.13 is already the newest version (3.13.11-1+b1). python3.13 set to manually installed. libfortran-jonquil-0 is already the newest version (0.3.0-4). 
libfortran-jonquil-0 set to manually installed. libpetsc-real3.24-dev is already the newest version (3.24.3+dfsg1-1). libpetsc-real3.24-dev set to manually installed. libdb5.3t64 is already the newest version (5.3.28+dfsg2-11). gcc-15 is already the newest version (15.2.0-12). gcc-15 set to manually installed. libtsan2 is already the newest version (15.2.0-12). libtsan2 set to manually installed. libldap2 is already the newest version (2.6.10+dfsg-1+b1). libldap2 set to manually installed. libgcc-15-dev is already the newest version (15.2.0-12). libgcc-15-dev set to manually installed. liblapack-dev is already the newest version (3.12.1-7+b1). liblapack-dev set to manually installed. ocl-icd-libopencl1 is already the newest version (2.3.4-1). ocl-icd-libopencl1 set to manually installed. libspqr4 is already the newest version (1:7.12.1+dfsg-1). libspqr4 set to manually installed. libcolamd3 is already the newest version (1:7.12.1+dfsg-1). libcolamd3 set to manually installed. libopenblas64-0 is already the newest version (0.3.30+ds-3+b1). libopenblas64-0 set to manually installed. libopenblas64-pthread-dev is already the newest version (0.3.30+ds-3+b1). libopenblas64-pthread-dev set to manually installed. findutils is already the newest version (4.10.0-3). libopenmpi-dev is already the newest version (5.0.9-1). libopenmpi-dev set to manually installed. libmumps-headers-dev is already the newest version (5.8.1-2). libmumps-headers-dev set to manually installed. netbase is already the newest version (6.5). netbase set to manually installed. libxext6 is already the newest version (2:1.3.4-1+b4). libxext6 set to manually installed. libjs-jquery is already the newest version (3.7.1+dfsg+~3.5.33-1). libjs-jquery set to manually installed. libgssrpc4t64 is already the newest version (1.22.1-2). libgssrpc4t64 set to manually installed. libc6 is already the newest version (2.42-10+b1). libstdc++6 is already the newest version (15.2.0-12). libstdc++6 set to manually installed. libevent-2.1-7t64 is already the newest version (2.1.12-stable-10+b2). libevent-2.1-7t64 set to manually installed. libsuitesparseconfig7 is already the newest version (1:7.12.1+dfsg-1). libsuitesparseconfig7 set to manually installed. libedit2 is already the newest version (3.1-20251016-1). libedit2 set to manually installed. libx11-dev is already the newest version (2:1.8.12-1+b1). libx11-dev set to manually installed. libsqlite3-0 is already the newest version (3.46.1-9). libsqlite3-0 set to manually installed. libgomp1 is already the newest version (15.2.0-12). libgomp1 set to manually installed. libfftw3-long3 is already the newest version (3.3.10-2+b2). libfftw3-long3 set to manually installed. libmetis5 is already the newest version (5.1.0.dfsg-8). libmetis5 set to manually installed. libhwloc-dev is already the newest version (2.12.2-1+b1). libhwloc-dev set to manually installed. libmagic-mgc is already the newest version (1:5.46-5+b1). libmagic-mgc set to manually installed. make is already the newest version (4.4.1-3). make set to manually installed. libsemanage-common is already the newest version (3.9-1). libsemanage-common set to manually installed. gfortran-15 is already the newest version (15.2.0-12). gfortran-15 set to manually installed. libfile-stripnondeterminism-perl is already the newest version (1.15.0-1). libfile-stripnondeterminism-perl set to manually installed. libpsl-dev is already the newest version (0.21.2-1.1+b2). libpsl-dev set to manually installed. 
binutils-common is already the newest version (2.45.50.20260119-1). binutils-common set to manually installed. libreadline8t64 is already the newest version (8.3-3+b1). libreadline8t64 set to manually installed. linux-libc-dev is already the newest version (6.18.5-1). linux-libc-dev set to manually installed. libsasl2-modules-db is already the newest version (2.1.28+dfsg1-10). libsasl2-modules-db set to manually installed. libcc1-0 is already the newest version (15.2.0-12). libcc1-0 set to manually installed. gzip is already the newest version (1.13-1). libhdf5-openmpi-dev is already the newest version (1.14.6+repack-2). libhdf5-openmpi-dev set to manually installed. libpcre2-8-0 is already the newest version (10.46-1+b1). libblkid1 is already the newest version (2.41.3-3). mpi-default-dev is already the newest version (1.20). mpi-default-dev set to manually installed. libnghttp2-14 is already the newest version (1.64.0-1.1+b1). libnghttp2-14 set to manually installed. libpam-runtime is already the newest version (1.7.0-5). libblas3 is already the newest version (3.12.1-7+b1). libblas3 set to manually installed. tzdata is already the newest version (2025c-3). tzdata set to manually installed. libsuperlu-dev is already the newest version (7.0.1+dfsg1-2+b1). libsuperlu-dev set to manually installed. libamd3 is already the newest version (1:7.12.1+dfsg-1). libamd3 set to manually installed. libevent-extra-2.1-7t64 is already the newest version (2.1.12-stable-10+b2). libevent-extra-2.1-7t64 set to manually installed. librtmp-dev is already the newest version (2.4+20151223.gitfa8646d.1-3+b1). librtmp-dev set to manually installed. libpetsc-complex3.24 is already the newest version (3.24.3+dfsg1-1). libpetsc-complex3.24 set to manually installed. libevent-core-2.1-7t64 is already the newest version (2.1.12-stable-10+b2). libevent-core-2.1-7t64 set to manually installed. libsframe3 is already the newest version (2.45.50.20260119-1). libsframe3 set to manually installed. libmd0 is already the newest version (1.1.0-2+b2). nettle-dev is already the newest version (3.10.2-1). nettle-dev set to manually installed. libccolamd3 is already the newest version (1:7.12.1+dfsg-1). libccolamd3 set to manually installed. libngtcp2-16 is already the newest version (1.16.0-1). libngtcp2-16 set to manually installed. libnettle8t64 is already the newest version (3.10.2-1). libnettle8t64 set to manually installed. libfile-libmagic-perl is already the newest version (1.23-2+b2). libfile-libmagic-perl set to manually installed. libfido2-1 is already the newest version (1.16.0-2+b1). libfido2-1 set to manually installed. passwd is already the newest version (1:4.19.0-4). passwd set to manually installed. libcap-ng0 is already the newest version (0.8.5-4+b2). libhdf5-mpi-dev is already the newest version (1.14.6+repack-2). libhdf5-mpi-dev set to manually installed. libjpeg62-turbo is already the newest version (1:2.1.5-4). libjpeg62-turbo set to manually installed. libibmad5 is already the newest version (61.0-2). libibmad5 set to manually installed. libexpat1 is already the newest version (2.7.3-2). libexpat1 set to manually installed. dwz is already the newest version (0.16-2). dwz set to manually installed. automake is already the newest version (1:1.18.1-3). automake set to manually installed. krb5-multidev is already the newest version (1.22.1-2). krb5-multidev set to manually installed. libldl3 is already the newest version (1:7.12.1+dfsg-1). libldl3 set to manually installed. 
g++-15 is already the newest version (15.2.0-12). g++-15 set to manually installed. libjpeg62-turbo-dev is already the newest version (1:2.1.5-4). libjpeg62-turbo-dev set to manually installed. util-linux is already the newest version (2.41.3-3). libtasn1-6-dev is already the newest version (4.21.0-2). libtasn1-6-dev set to manually installed. python3.13-minimal is already the newest version (3.13.11-1+b1). python3.13-minimal set to manually installed. libhogweed6t64 is already the newest version (3.10.2-1). libhogweed6t64 set to manually installed. libmumps-dev is already the newest version (5.8.1-2). libmumps-dev set to manually installed. media-types is already the newest version (14.0.0). media-types set to manually installed. liblapack3 is already the newest version (3.12.1-7+b1). liblapack3 set to manually installed. g++-riscv64-linux-gnu is already the newest version (4:15.2.0-5). g++-riscv64-linux-gnu set to manually installed. liblapack64-3 is already the newest version (3.12.1-7+b1). liblapack64-3 set to manually installed. mpi-default-bin is already the newest version (1.20). mpi-default-bin set to manually installed. libgfortran-15-dev is already the newest version (15.2.0-12). libgfortran-15-dev set to manually installed. libsepol2 is already the newest version (3.9-2). libsepol2 set to manually installed. libibverbs-dev is already the newest version (61.0-2). libibverbs-dev set to manually installed. libssh2-1t64 is already the newest version (1.11.1-1+b1). libssh2-1t64 set to manually installed. libscotch-64i-7.0 is already the newest version (7.0.10-7). libscotch-64i-7.0 set to manually installed. libctf-nobfd0 is already the newest version (2.45.50.20260119-1). libctf-nobfd0 set to manually installed. libelf1t64 is already the newest version (0.194-1). libelf1t64 set to manually installed. liblzma5 is already the newest version (5.8.2-2). cpp-15 is already the newest version (15.2.0-12). cpp-15 set to manually installed. libsuitesparse-mongoose3 is already the newest version (1:7.12.1+dfsg-1). libsuitesparse-mongoose3 set to manually installed. libasan8 is already the newest version (15.2.0-12). libasan8 set to manually installed. groff-base is already the newest version (1.23.0-10). groff-base set to manually installed. libnuma-dev is already the newest version (2.0.19-1+b1). libnuma-dev set to manually installed. libpetsc-complex3.24-dev is already the newest version (3.24.3+dfsg1-1). libpetsc-complex3.24-dev set to manually installed. xtrans-dev is already the newest version (1.6.0-1). xtrans-dev set to manually installed. libitm1 is already the newest version (15.2.0-12). libitm1 set to manually installed. libbz2-1.0 is already the newest version (1.0.8-6+b1). libhdf5-openmpi-hl-310 is already the newest version (1.14.6+repack-2). libhdf5-openmpi-hl-310 set to manually installed. libgmp-dev is already the newest version (2:6.3.0+dfsg-5+b1). libgmp-dev set to manually installed. adduser is already the newest version (3.154). adduser set to manually installed. libfftw3-bin is already the newest version (3.3.10-2+b2). libfftw3-bin set to manually installed. libctf0 is already the newest version (2.45.50.20260119-1). libctf0 set to manually installed. libsemanage2 is already the newest version (3.9-1+b1). libsemanage2 set to manually installed. m4 is already the newest version (1.4.20-2). m4 set to manually installed. libacl1 is already the newest version (2.3.2-2+b2). libp11-kit0 is already the newest version (0.25.10-1+b1). libp11-kit0 set to manually installed. 
libptscotch-7.0c is already the newest version (7.0.10-7). libptscotch-7.0c set to manually installed. libgnutls28-dev is already the newest version (3.8.11-3). libgnutls28-dev set to manually installed. libbrotli1 is already the newest version (1.1.0-2+b9). libbrotli1 set to manually installed. libgnutls30t64 is already the newest version (3.8.11-3). libgnutls30t64 set to manually installed. libuchardet0 is already the newest version (0.0.8-2+b1). libuchardet0 set to manually installed. xz-utils is already the newest version (5.8.2-2). xz-utils set to manually installed. libdpkg-perl is already the newest version (1.23.5). libdpkg-perl set to manually installed. libuuid1 is already the newest version (2.41.3-3). libdebhelper-perl is already the newest version (13.29). libdebhelper-perl set to manually installed. libpython3.13-stdlib is already the newest version (3.13.11-1+b1). libpython3.13-stdlib set to manually installed. binutils-riscv64-linux-gnu is already the newest version (2.45.50.20260119-1). binutils-riscv64-linux-gnu set to manually installed. libngtcp2-crypto-ossl-dev is already the newest version (1.16.0-1). libngtcp2-crypto-ossl-dev set to manually installed. libpython3-stdlib is already the newest version (3.13.9-3). libpython3-stdlib set to manually installed. libzstd-dev is already the newest version (1.5.7+dfsg-3). libzstd-dev set to manually installed. cpp-15-riscv64-linux-gnu is already the newest version (15.2.0-12). cpp-15-riscv64-linux-gnu set to manually installed. man-db is already the newest version (2.13.1-1). man-db set to manually installed. libptscotcherr-dev is already the newest version (7.0.10-7). libptscotcherr-dev set to manually installed. libfabric1 is already the newest version (2.1.0-1.1+b1). libfabric1 set to manually installed. libhdf5-openmpi-fortran-310 is already the newest version (1.14.6+repack-2). libhdf5-openmpi-fortran-310 set to manually installed. libjs-jquery-ui is already the newest version (1.13.2+dfsg-1). libjs-jquery-ui set to manually installed. debhelper is already the newest version (13.29). debhelper set to manually installed. libgdbm-compat4t64 is already the newest version (1.26-1+b1). libgdbm-compat4t64 set to manually installed. libsuperlu7 is already the newest version (7.0.1+dfsg1-2+b1). libsuperlu7 set to manually installed. libsystemd0 is already the newest version (259-1). libptscotcherr-7.0 is already the newest version (7.0.10-7). libptscotcherr-7.0 set to manually installed. libfftw3-mpi3 is already the newest version (3.3.10-2+b2). libfftw3-mpi3 set to manually installed. autotools-dev is already the newest version (20240727.1). autotools-dev set to manually installed. libtinfo6 is already the newest version (6.6+20251231-1). libpetsc64-complex3.24-dev is already the newest version (3.24.3+dfsg1-1). libpetsc64-complex3.24-dev set to manually installed. libxml2-16 is already the newest version (2.15.1+dfsg-2+b1). libxml2-16 set to manually installed. libcholmod5 is already the newest version (1:7.12.1+dfsg-1). libcholmod5 set to manually installed. autopoint is already the newest version (0.23.2-1). autopoint set to manually installed. dh-strip-nondeterminism is already the newest version (1.15.0-1). dh-strip-nondeterminism set to manually installed. rpcsvc-proto is already the newest version (1.4.3-1+b2). rpcsvc-proto set to manually installed. libidn2-dev is already the newest version (2.3.8-4+b1). libidn2-dev set to manually installed. libsuitesparse-dev is already the newest version (1:7.12.1+dfsg-1). 
libsuitesparse-dev set to manually installed. libldap-dev is already the newest version (2.6.10+dfsg-1+b1). libldap-dev set to manually installed. libkrb5-3 is already the newest version (1.22.1-2). libkrb5-3 set to manually installed. file is already the newest version (1:5.46-5+b1). file set to manually installed. libngtcp2-crypto-ossl0 is already the newest version (1.16.0-1). libngtcp2-crypto-ossl0 set to manually installed. dpkg-dev is already the newest version (1.23.5). dpkg-dev set to manually installed. init-system-helpers is already the newest version (1.69). libsuperlu-dist-dev is already the newest version (9.2.1+dfsg1-1). libsuperlu-dist-dev set to manually installed. libxcb1-dev is already the newest version (1.17.0-2+b2). libxcb1-dev set to manually installed. chrpath is already the newest version (0.18-1). chrpath set to manually installed. ncurses-bin is already the newest version (6.6+20251231-1). python3-magic is already the newest version (2:0.4.27-3). python3-magic set to manually installed. libibumad3 is already the newest version (61.0-2). libibumad3 set to manually installed. libpetsc64-complex3.24 is already the newest version (3.24.3+dfsg1-1). libpetsc64-complex3.24 set to manually installed. librbio4 is already the newest version (1:7.12.1+dfsg-1). librbio4 set to manually installed. libgssapi-krb5-2 is already the newest version (1.22.1-2). libgssapi-krb5-2 set to manually installed. libaudit-common is already the newest version (1:4.1.2-1). bsdextrautils is already the newest version (2.41.3-3). bsdextrautils set to manually installed. po-debconf is already the newest version (1.0.22). po-debconf set to manually installed. xorg-sgml-doctools is already the newest version (1:1.11-1.1). xorg-sgml-doctools set to manually installed. libaudit1 is already the newest version (1:4.1.2-1+b1). gettext-base is already the newest version (0.23.2-1). gettext-base set to manually installed. libgnutls-dane0t64 is already the newest version (3.8.11-3). libgnutls-dane0t64 set to manually installed. sysvinit-utils is already the newest version (3.15-6). libperl5.40 is already the newest version (5.40.1-7). libperl5.40 set to manually installed. libgcc-s1 is already the newest version (15.2.0-12). libidn2-0 is already the newest version (2.3.8-4+b1). libidn2-0 set to manually installed. libpetsc3.24-dev-examples is already the newest version (3.24.3+dfsg1-1). libpetsc3.24-dev-examples set to manually installed. libnl-route-3-dev is already the newest version (3.12.0-2). libnl-route-3-dev set to manually installed. libopenblas64-dev is already the newest version (0.3.30+ds-3+b1). libopenblas64-dev set to manually installed. cpp-riscv64-linux-gnu is already the newest version (4:15.2.0-5). cpp-riscv64-linux-gnu set to manually installed. libjs-mathjax is already the newest version (2.7.9+dfsg-1). libjs-mathjax set to manually installed. libevent-pthreads-2.1-7t64 is already the newest version (2.1.12-stable-10+b2). libevent-pthreads-2.1-7t64 set to manually installed. libx11-data is already the newest version (2:1.8.12-1). libx11-data set to manually installed. perl-modules-5.40 is already the newest version (5.40.1-7). perl-modules-5.40 set to manually installed. gcc is already the newest version (4:15.2.0-5). gcc set to manually installed. libscotch-64i-dev is already the newest version (7.0.10-7). libscotch-64i-dev set to manually installed. libmagic1t64 is already the newest version (1:5.46-5+b1). libmagic1t64 set to manually installed. 
gfortran is already the newest version (4:15.2.0-5). gfortran set to manually installed. libcombblas2.0.0t64 is already the newest version (2.0.0-7). libcombblas2.0.0t64 set to manually installed. libpython3.13-minimal is already the newest version (3.13.11-1+b1). libpython3.13-minimal set to manually installed. gfortran-riscv64-linux-gnu is already the newest version (4:15.2.0-5). gfortran-riscv64-linux-gnu set to manually installed. libnghttp3-dev is already the newest version (1.12.0-1). libnghttp3-dev set to manually installed. tar is already the newest version (1.35+dfsg-3.1). libmumps-5.8 is already the newest version (5.8.1-2). libmumps-5.8 set to manually installed. libpciaccess0 is already the newest version (0.17-3+b4). libpciaccess0 set to manually installed. libaec0 is already the newest version (1.1.4-2+b1). libaec0 set to manually installed. ocl-icd-opencl-dev is already the newest version (2.3.4-1). ocl-icd-opencl-dev set to manually installed. openmpi-bin is already the newest version (5.0.9-1). openmpi-bin set to manually installed. libarchive-zip-perl is already the newest version (1.68-1). libarchive-zip-perl set to manually installed. libyaml-0-2 is already the newest version (0.2.5-2+b1). libyaml-0-2 set to manually installed. libsz2 is already the newest version (1.1.4-2+b1). libsz2 set to manually installed. pkgconf-bin is already the newest version (1.8.1-4+b1). pkgconf-bin set to manually installed. dpkg is already the newest version (1.23.5). libpam-modules is already the newest version (1.7.0-5+b1). libfuse3-4 is already the newest version (3.18.1-1). libfuse3-4 set to manually installed. g++-15-riscv64-linux-gnu is already the newest version (15.2.0-12). g++-15-riscv64-linux-gnu set to manually installed. liblsan0 is already the newest version (15.2.0-12). liblsan0 set to manually installed. libevent-dev is already the newest version (2.1.12-stable-10+b2). libevent-dev set to manually installed. gettext is already the newest version (0.23.2-1). gettext set to manually installed. libstdc++-15-dev is already the newest version (15.2.0-12). libstdc++-15-dev set to manually installed. libc6-dev is already the newest version (2.42-10+b1). libc6-dev set to manually installed. libscalapack-mpi-dev is already the newest version (2.2.2-5). libscalapack-mpi-dev set to manually installed. libssl-dev is already the newest version (3.5.4-1+b1). libssl-dev set to manually installed. libumfpack6 is already the newest version (1:7.12.1+dfsg-1). libumfpack6 set to manually installed. librdmacm1t64 is already the newest version (61.0-2). librdmacm1t64 set to manually installed. libkeyutils1 is already the newest version (1.6.3-6+b1). libkeyutils1 set to manually installed. libx11-6 is already the newest version (2:1.8.12-1+b1). libx11-6 set to manually installed. autoconf is already the newest version (2.72-3.1). autoconf set to manually installed. libhypre-3.0.0 is already the newest version (3.0.0-5). libhypre-3.0.0 set to manually installed. libpam0g is already the newest version (1.7.0-5+b1). libspex3 is already the newest version (1:7.12.1+dfsg-1). libspex3 set to manually installed. diffutils is already the newest version (1:3.12-1). libselinux1 is already the newest version (3.9-4+b1). libevent-openssl-2.1-7t64 is already the newest version (2.1.12-stable-10+b2). libevent-openssl-2.1-7t64 set to manually installed. hostname is already the newest version (3.25). libkrb5-dev is already the newest version (1.22.1-2). libkrb5-dev set to manually installed. 
libhypre64-3.0.0 is already the newest version (3.0.0-5). libhypre64-3.0.0 set to manually installed. libhdf5-openmpi-hl-fortran-310 is already the newest version (1.14.6+repack-2). libhdf5-openmpi-hl-fortran-310 set to manually installed. libpetsc64-real3.24-dev is already the newest version (3.24.3+dfsg1-1). libpetsc64-real3.24-dev set to manually installed. bash is already the newest version (5.3-1). 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. I: running --customize-hook in shell: sh -c 'chroot "$1" dpkg -r debootsnap-dummy' exec /srv/rebuilderd/tmp/mmdebstrap.bXeUMqghaW (Reading database ... 35026 files and directories currently installed.) Removing debootsnap-dummy (1.0) ... I: running --customize-hook in shell: sh -c 'chroot "$1" dpkg-query --showformat '${binary:Package}=${Version}\n' --show > "$1/pkglist"' exec /srv/rebuilderd/tmp/mmdebstrap.bXeUMqghaW I: running special hook: download /pkglist ./pkglist I: running --customize-hook in shell: sh -c 'rm "$1/pkglist"' exec /srv/rebuilderd/tmp/mmdebstrap.bXeUMqghaW I: running special hook: upload sources.list /etc/apt/sources.list I: waiting for background processes to finish... I: cleaning package lists and apt cache... I: skipping cleanup/reproducible as requested I: creating tarball... I: done I: removing tempdir /srv/rebuilderd/tmp/mmdebstrap.bXeUMqghaW... I: success in 291.4584 seconds Downloading dependency 365 of 392: dpkg:riscv64=1.23.5 Downloading dependency 366 of 392: libpam-modules:riscv64=1.7.0-5+b1 Downloading dependency 367 of 392: libfuse3-4:riscv64=3.18.1-1 Downloading dependency 368 of 392: g++-15-riscv64-linux-gnu:riscv64=15.2.0-12 Downloading dependency 369 of 392: liblsan0:riscv64=15.2.0-12 Downloading dependency 370 of 392: libevent-dev:riscv64=2.1.12-stable-10+b2 Downloading dependency 371 of 392: gettext:riscv64=0.23.2-1 Downloading dependency 372 of 392: libstdc++-15-dev:riscv64=15.2.0-12 Downloading dependency 373 of 392: libc6-dev:riscv64=2.42-10+b1 Downloading dependency 374 of 392: libscalapack-mpi-dev:riscv64=2.2.2-5 Downloading dependency 375 of 392: libssl-dev:riscv64=3.5.4-1+b1 Downloading dependency 376 of 392: libumfpack6:riscv64=1:7.12.1+dfsg-1 Downloading dependency 377 of 392: librdmacm1t64:riscv64=61.0-2 Downloading dependency 378 of 392: libkeyutils1:riscv64=1.6.3-6+b1 Downloading dependency 379 of 392: libx11-6:riscv64=2:1.8.12-1+b1 Downloading dependency 380 of 392: autoconf:riscv64=2.72-3.1 Downloading dependency 381 of 392: libhypre-3.0.0:riscv64=3.0.0-5 Downloading dependency 382 of 392: libpam0g:riscv64=1.7.0-5+b1 Downloading dependency 383 of 392: libspex3:riscv64=1:7.12.1+dfsg-1 Downloading dependency 384 of 392: diffutils:riscv64=1:3.12-1 Downloading dependency 385 of 392: libselinux1:riscv64=3.9-4+b1 Downloading dependency 386 of 392: libevent-openssl-2.1-7t64:riscv64=2.1.12-stable-10+b2 Downloading dependency 387 of 392: hostname:riscv64=3.25 Downloading dependency 388 of 392: libkrb5-dev:riscv64=1.22.1-2 Downloading dependency 389 of 392: libhypre64-3.0.0:riscv64=3.0.0-5 Downloading dependency 390 of 392: libhdf5-openmpi-hl-fortran-310:riscv64=1.14.6+repack-2 Downloading dependency 391 of 392: libpetsc64-real3.24-dev:riscv64=3.24.3+dfsg1-1 Downloading dependency 392 of 392: bash:riscv64=5.3-1 env --chdir=/srv/rebuilderd/tmp/rebuilderdnZCy1O/out DEB_BUILD_OPTIONS=parallel=4 LANG=C.UTF-8 LC_COLLATE=C.UTF-8 LC_CTYPE=C.UTF-8 SOURCE_DATE_EPOCH=1769366661 SBUILD_CONFIG=/srv/rebuilderd/tmp/debrebuildpdW6pr/debrebuild.sbuildrc.S6zyKpQRBUu5 sbuild --build=riscv64 
--host=riscv64 --no-source --arch-any --no-arch-all --chroot=/srv/rebuilderd/tmp/debrebuildpdW6pr/debrebuild.tar.u74MjYwEkaKr --chroot-mode=unshare --dist=unstable --no-run-lintian --no-run-piuparts --no-run-autopkgtest --no-apt-update --no-apt-upgrade --no-apt-distupgrade --verbose --nolog --bd-uninstallable-explainer= --build-path=/build/reproducible-path --dsc-dir=slepc-3.24.2+dfsg1 /srv/rebuilderd/tmp/rebuilderdnZCy1O/inputs/slepc_3.24.2+dfsg1-1.dsc I: consider moving your ~/.sbuildrc to /srv/rebuilderd/.config/sbuild/config.pl The Debian buildds switched to the "unshare" backend and sbuild will default to it in the future. To start using "unshare" add this to your `~/.config/sbuild/config.pl`: $chroot_mode = "unshare"; If you want to keep the old "schroot" mode even in the future, add the following to your `~/.config/sbuild/config.pl`: $chroot_mode = "schroot"; $schroot = "schroot"; sbuild (Debian sbuild) 0.89.3+deb13u4 (28 December 2025) on localhost +==============================================================================+ | slepc 3.24.2+dfsg1-1 (riscv64) Sat, 31 Jan 2026 13:01:57 +0000 | +==============================================================================+ Package: slepc Version: 3.24.2+dfsg1-1 Source Version: 3.24.2+dfsg1-1 Distribution: unstable Machine Architecture: riscv64 Host Architecture: riscv64 Build Architecture: riscv64 Build Type: any I: No tarballs found in /srv/rebuilderd/.cache/sbuild I: Unpacking /srv/rebuilderd/tmp/debrebuildpdW6pr/debrebuild.tar.u74MjYwEkaKr to /srv/rebuilderd/tmp/tmp.sbuild.TMOPfegcAU... I: Setting up the chroot... I: Creating chroot session... I: Setting up log color... I: Setting up apt archive... +------------------------------------------------------------------------------+ | Fetch source files Sat, 31 Jan 2026 13:02:50 +0000 | +------------------------------------------------------------------------------+ Local sources ------------- /srv/rebuilderd/tmp/rebuilderdnZCy1O/inputs/slepc_3.24.2+dfsg1-1.dsc exists in /srv/rebuilderd/tmp/rebuilderdnZCy1O/inputs; copying to chroot +------------------------------------------------------------------------------+ | Install package build dependencies Sat, 31 Jan 2026 13:02:55 +0000 | +------------------------------------------------------------------------------+ Setup apt archive ----------------- Merged Build-Depends: dpkg-dev (>= 1.22.5), debhelper-compat (= 13), python3, pkgconf, dh-python, gfortran, chrpath, dh-fortran, libpetsc-real3.24-dev, libpetsc-complex3.24-dev, libpetsc3.24-dev-examples, libpetsc64-real3.24-dev, libpetsc64-complex3.24-dev, libarpack2-dev, libparpack2-dev, build-essential Filtered Build-Depends: dpkg-dev (>= 1.22.5), debhelper-compat (= 13), python3, pkgconf, dh-python, gfortran, chrpath, dh-fortran, libpetsc-real3.24-dev, libpetsc-complex3.24-dev, libpetsc3.24-dev-examples, libpetsc64-real3.24-dev, libpetsc64-complex3.24-dev, libarpack2-dev, libparpack2-dev, build-essential dpkg-deb: building package 'sbuild-build-depends-main-dummy' in '/build/reproducible-path/resolver-MzdKYS/apt_archive/sbuild-build-depends-main-dummy.deb'. 
Install main build dependencies (apt-based resolver) ---------------------------------------------------- Installing build dependencies +------------------------------------------------------------------------------+ | Check architectures Sat, 31 Jan 2026 13:03:13 +0000 | +------------------------------------------------------------------------------+ Arch check ok (riscv64 included in any all) +------------------------------------------------------------------------------+ | Build environment Sat, 31 Jan 2026 13:03:14 +0000 | +------------------------------------------------------------------------------+ Kernel: Linux 6.6.87-win2030 #2025.04.20.18.43+bb0c69aea SMP Sun Apr 20 18:58:14 UTC 2025 riscv64 (riscv64) Toolchain package versions: binutils_2.45.50.20260119-1 dpkg-dev_1.23.5 g++-15_15.2.0-12 gcc-15_15.2.0-12 libc6-dev_2.42-10+b1 libstdc++-15-dev_15.2.0-12 libstdc++6_15.2.0-12 linux-libc-dev_6.18.5-1 Package versions: adduser_3.154 autoconf_2.72-3.1 automake_1:1.18.1-3 autopoint_0.23.2-1 autotools-dev_20240727.1 base-files_14 base-passwd_3.6.8 bash_5.3-1 binutils_2.45.50.20260119-1 binutils-common_2.45.50.20260119-1 binutils-riscv64-linux-gnu_2.45.50.20260119-1 bsdextrautils_2.41.3-3 build-essential_12.12 bzip2_1.0.8-6+b1 chrpath_0.18-1 comerr-dev_2.1-1.47.2-3+b8 coreutils_9.7-3 cpp_4:15.2.0-5 cpp-15_15.2.0-12 cpp-15-riscv64-linux-gnu_15.2.0-12 cpp-riscv64-linux-gnu_4:15.2.0-5 dash_0.5.12-12 debconf_1.5.91 debhelper_13.29 debianutils_5.23.2 dh-autoreconf_21 dh-fortran_0.63 dh-python_7.20260125 dh-strip-nondeterminism_1.15.0-1 diffutils_1:3.12-1 dpkg_1.23.5 dpkg-dev_1.23.5 dwz_0.16-2 file_1:5.46-5+b1 findutils_4.10.0-3 fonts-mathjax_2.7.9+dfsg-1 fortran-fpm_0.12.0-6 g++_4:15.2.0-5 g++-15_15.2.0-12 g++-15-riscv64-linux-gnu_15.2.0-12 g++-riscv64-linux-gnu_4:15.2.0-5 gcc_4:15.2.0-5 gcc-15_15.2.0-12 gcc-15-base_15.2.0-12 gcc-15-riscv64-linux-gnu_15.2.0-12 gcc-riscv64-linux-gnu_4:15.2.0-5 gettext_0.23.2-1 gettext-base_0.23.2-1 gfortran_4:15.2.0-5 gfortran-15_15.2.0-12 gfortran-15-riscv64-linux-gnu_15.2.0-12 gfortran-riscv64-linux-gnu_4:15.2.0-5 grep_3.12-1 groff-base_1.23.0-10 gzip_1.13-1 hostname_3.25 ibverbs-providers_61.0-2 init-system-helpers_1.69 intltool-debian_0.35.0+20060710.6 krb5-multidev_1.22.1-2 libacl1_2.3.2-2+b2 libaec-dev_1.1.4-2+b1 libaec0_1.1.4-2+b1 libamd3_1:7.12.1+dfsg-1 libarchive-zip-perl_1.68-1 libarpack2-dev_3.9.1-6+b1 libarpack2t64_3.9.1-6+b1 libasan8_15.2.0-12 libatomic1_15.2.0-12 libattr1_1:2.5.2-3+b1 libaudit-common_1:4.1.2-1 libaudit1_1:4.1.2-1+b1 libbinutils_2.45.50.20260119-1 libblas-dev_3.12.1-7+b1 libblas3_3.12.1-7+b1 libblas64-3_3.12.1-7+b1 libblkid1_2.41.3-3 libbrotli-dev_1.1.0-2+b9 libbrotli1_1.1.0-2+b9 libbsd0_0.12.2-2+b1 libbtf2_1:7.12.1+dfsg-1 libbz2-1.0_1.0.8-6+b1 libc-bin_2.42-10+b1 libc-dev-bin_2.42-10+b1 libc-gconv-modules-extra_2.42-10+b1 libc6_2.42-10+b1 libc6-dev_2.42-10+b1 libcamd3_1:7.12.1+dfsg-1 libcap-ng0_0.8.5-4+b2 libcap2_1:2.75-10+b5 libcbor0.10_0.10.2-2.1 libcc1-0_15.2.0-12 libccolamd3_1:7.12.1+dfsg-1 libcholmod5_1:7.12.1+dfsg-1 libcolamd3_1:7.12.1+dfsg-1 libcom-err2_1.47.2-3+b8 libcombblas2.0.0t64_2.0.0-7 libcrypt1_1:4.5.1-1 libctf-nobfd0_2.45.50.20260119-1 libctf0_2.45.50.20260119-1 libcurl4-openssl-dev_8.18.0-2 libcurl4t64_8.18.0-2 libcxsparse4_1:7.12.1+dfsg-1 libdb5.3t64_5.3.28+dfsg2-11 libdebconfclient0_0.282+b2 libdebhelper-perl_13.29 libdpkg-perl_1.23.5 libedit2_3.1-20251016-1 libelf1t64_0.194-1 libevent-2.1-7t64_2.1.12-stable-10+b2 libevent-core-2.1-7t64_2.1.12-stable-10+b2 libevent-dev_2.1.12-stable-10+b2 
libevent-extra-2.1-7t64_2.1.12-stable-10+b2 libevent-openssl-2.1-7t64_2.1.12-stable-10+b2 libevent-pthreads-2.1-7t64_2.1.12-stable-10+b2 libexpat1_2.7.3-2 libfabric1_2.1.0-1.1+b1 libffi8_3.5.2-3+b1 libfftw3-bin_3.3.10-2+b2 libfftw3-dev_3.3.10-2+b2 libfftw3-double3_3.3.10-2+b2 libfftw3-long3_3.3.10-2+b2 libfftw3-mpi-dev_3.3.10-2+b2 libfftw3-mpi3_3.3.10-2+b2 libfftw3-single3_3.3.10-2+b2 libfido2-1_1.16.0-2+b1 libfile-libmagic-perl_1.23-2+b2 libfile-stripnondeterminism-perl_1.15.0-1 libfortran-jonquil-0_0.3.0-4 libfortran-toml-0_0.4.3-2 libfuse3-4_3.18.1-1 libgcc-15-dev_15.2.0-12 libgcc-s1_15.2.0-12 libgdbm-compat4t64_1.26-1+b1 libgdbm6t64_1.26-1+b1 libgfortran-15-dev_15.2.0-12 libgfortran5_15.2.0-12 libgmp-dev_2:6.3.0+dfsg-5+b1 libgmp10_2:6.3.0+dfsg-5+b1 libgmpxx4ldbl_2:6.3.0+dfsg-5+b1 libgnutls-dane0t64_3.8.11-3 libgnutls-openssl27t64_3.8.11-3 libgnutls28-dev_3.8.11-3 libgnutls30t64_3.8.11-3 libgomp1_15.2.0-12 libgprofng0_2.45.50.20260119-1 libgssapi-krb5-2_1.22.1-2 libgssrpc4t64_1.22.1-2 libhdf5-mpi-dev_1.14.6+repack-2 libhdf5-openmpi-310_1.14.6+repack-2 libhdf5-openmpi-cpp-310_1.14.6+repack-2 libhdf5-openmpi-dev_1.14.6+repack-2 libhdf5-openmpi-fortran-310_1.14.6+repack-2 libhdf5-openmpi-hl-310_1.14.6+repack-2 libhdf5-openmpi-hl-cpp-310_1.14.6+repack-2 libhdf5-openmpi-hl-fortran-310_1.14.6+repack-2 libhogweed6t64_3.10.2-1 libhwloc-dev_2.12.2-1+b1 libhwloc-plugins_2.12.2-1+b1 libhwloc15_2.12.2-1+b1 libhypre-3.0.0_3.0.0-5 libhypre-dev_3.0.0-5 libhypre64-3.0.0_3.0.0-5 libhypre64-dev_3.0.0-5 libibmad5_61.0-2 libibumad3_61.0-2 libibverbs-dev_61.0-2 libibverbs1_61.0-2 libidn2-0_2.3.8-4+b1 libidn2-dev_2.3.8-4+b1 libisl23_0.27-1+b1 libitm1_15.2.0-12 libjansson4_2.14-2+b4 libjpeg-dev_1:2.1.5-4 libjpeg62-turbo_1:2.1.5-4 libjpeg62-turbo-dev_1:2.1.5-4 libjs-jquery_3.7.1+dfsg+~3.5.33-1 libjs-jquery-ui_1.13.2+dfsg-1 libjs-mathjax_2.7.9+dfsg-1 libk5crypto3_1.22.1-2 libkadm5clnt-mit12_1.22.1-2 libkadm5srv-mit12_1.22.1-2 libkdb5-10t64_1.22.1-2 libkeyutils1_1.6.3-6+b1 libklu2_1:7.12.1+dfsg-1 libkrb5-3_1.22.1-2 libkrb5-dev_1.22.1-2 libkrb5support0_1.22.1-2 liblapack-dev_3.12.1-7+b1 liblapack3_3.12.1-7+b1 liblapack64-3_3.12.1-7+b1 libldap-dev_2.6.10+dfsg-1+b1 libldap2_2.6.10+dfsg-1+b1 libldl3_1:7.12.1+dfsg-1 liblsan0_15.2.0-12 libltdl-dev_2.5.4-9 libltdl7_2.5.4-9 liblzma5_5.8.2-2 libmagic-mgc_1:5.46-5+b1 libmagic1t64_1:5.46-5+b1 libmd0_1.1.0-2+b2 libmetis5_5.1.0.dfsg-8 libmount1_2.41.3-3 libmpc3_1.3.1-2+b1 libmpfr6_4.2.2-2+b1 libmumps-5.8_5.8.1-2 libmumps-64pord-5.8_5.8.1-2 libmumps-dev_5.8.1-2 libmumps-headers-dev_5.8.1-2 libmumps64-dev_5.8.1-2 libmunge2_0.5.16-1+b1 libncursesw6_6.6+20251231-1 libnettle8t64_3.10.2-1 libnghttp2-14_1.64.0-1.1+b1 libnghttp2-dev_1.64.0-1.1+b1 libnghttp3-9_1.12.0-1 libnghttp3-dev_1.12.0-1 libngtcp2-16_1.16.0-1 libngtcp2-crypto-ossl-dev_1.16.0-1 libngtcp2-crypto-ossl0_1.16.0-1 libngtcp2-dev_1.16.0-1 libnl-3-200_3.12.0-2 libnl-3-dev_3.12.0-2 libnl-route-3-200_3.12.0-2 libnl-route-3-dev_3.12.0-2 libnuma-dev_2.0.19-1+b1 libnuma1_2.0.19-1+b1 libopenblas64-0_0.3.30+ds-3+b1 libopenblas64-0-pthread_0.3.30+ds-3+b1 libopenblas64-dev_0.3.30+ds-3+b1 libopenblas64-pthread-dev_0.3.30+ds-3+b1 libopenmpi-dev_5.0.9-1 libopenmpi40_5.0.9-1 libp11-kit-dev_0.25.10-1+b1 libp11-kit0_0.25.10-1+b1 libpam-modules_1.7.0-5+b1 libpam-modules-bin_1.7.0-5+b1 libpam-runtime_1.7.0-5 libpam0g_1.7.0-5+b1 libparpack2-dev_3.9.1-6+b1 libparpack2t64_3.9.1-6+b1 libparu1_1:7.12.1+dfsg-1 libpciaccess0_0.17-3+b4 libpcre2-8-0_10.46-1+b1 libperl5.40_5.40.1-7 libpetsc-complex3.24_3.24.3+dfsg1-1 
libpetsc-complex3.24-dev_3.24.3+dfsg1-1 libpetsc-real3.24_3.24.3+dfsg1-1 libpetsc-real3.24-dev_3.24.3+dfsg1-1 libpetsc3.24-dev-common_3.24.3+dfsg1-1 libpetsc3.24-dev-examples_3.24.3+dfsg1-1 libpetsc64-complex3.24_3.24.3+dfsg1-1 libpetsc64-complex3.24-dev_3.24.3+dfsg1-1 libpetsc64-real3.24_3.24.3+dfsg1-1 libpetsc64-real3.24-dev_3.24.3+dfsg1-1 libpipeline1_1.5.8-2 libpkgconf3_1.8.1-4+b1 libpmix2t64_6.0.0+really5.0.9-3 libpsl-dev_0.21.2-1.1+b2 libpsl5t64_0.21.2-1.1+b2 libptscotch-64-7.0_7.0.10-7 libptscotch-64-dev_7.0.10-7 libptscotch-64i-7.0_7.0.10-7 libptscotch-64i-dev_7.0.10-7 libptscotch-7.0c_7.0.10-7 libptscotch-dev_7.0.10-7 libptscotcherr-7.0_7.0.10-7 libptscotcherr-dev_7.0.10-7 libpython3-stdlib_3.13.9-3 libpython3.13-minimal_3.13.11-1+b1 libpython3.13-stdlib_3.13.11-1+b1 librbio4_1:7.12.1+dfsg-1 librdmacm1t64_61.0-2 libreadline8t64_8.3-3+b1 librtmp-dev_2.4+20151223.gitfa8646d.1-3+b1 librtmp1_2.4+20151223.gitfa8646d.1-3+b1 libsasl2-2_2.1.28+dfsg1-10 libsasl2-modules-db_2.1.28+dfsg1-10 libscalapack-mpi-dev_2.2.2-5 libscalapack-openmpi-dev_2.2.2-5 libscalapack-openmpi2.2_2.2.2-5 libscotch-64-7.0_7.0.10-7 libscotch-64-dev_7.0.10-7 libscotch-64i-7.0_7.0.10-7 libscotch-64i-dev_7.0.10-7 libscotch-7.0c_7.0.10-7 libscotch-dev_7.0.10-7 libscotcherr-7.0_7.0.10-7 libscotcherr-dev_7.0.10-7 libselinux1_3.9-4+b1 libsemanage-common_3.9-1 libsemanage2_3.9-1+b1 libsepol2_3.9-2 libsframe3_2.45.50.20260119-1 libsmartcols1_2.41.3-3 libspex3_1:7.12.1+dfsg-1 libspqr4_1:7.12.1+dfsg-1 libsqlite3-0_3.46.1-9 libssh2-1-dev_1.11.1-1+b1 libssh2-1t64_1.11.1-1+b1 libssl-dev_3.5.4-1+b1 libssl3t64_3.5.4-1+b1 libstdc++-15-dev_15.2.0-12 libstdc++6_15.2.0-12 libsuitesparse-dev_1:7.12.1+dfsg-1 libsuitesparse-mongoose3_1:7.12.1+dfsg-1 libsuitesparseconfig7_1:7.12.1+dfsg-1 libsuperlu-dev_7.0.1+dfsg1-2+b1 libsuperlu-dist-dev_9.2.1+dfsg1-1 libsuperlu-dist9_9.2.1+dfsg1-1 libsuperlu7_7.0.1+dfsg1-2+b1 libsystemd0_259-1 libsz2_1.1.4-2+b1 libtasn1-6_4.21.0-2 libtasn1-6-dev_4.21.0-2 libtinfo6_6.6+20251231-1 libtool_2.5.4-9 libtsan2_15.2.0-12 libubsan1_15.2.0-12 libuchardet0_0.0.8-2+b1 libucx0_1.20.0+ds-4 libudev1_259-1 libumfpack6_1:7.12.1+dfsg-1 libunbound8_1.24.2-1 libunistring5_1.3-2+b1 libuuid1_2.41.3-3 libx11-6_2:1.8.12-1+b1 libx11-data_2:1.8.12-1 libx11-dev_2:1.8.12-1+b1 libxau-dev_1:1.0.11-1+b1 libxau6_1:1.0.11-1+b1 libxcb1_1.17.0-2+b2 libxcb1-dev_1.17.0-2+b2 libxdmcp-dev_1:1.1.5-2 libxdmcp6_1:1.1.5-2 libxext6_2:1.3.4-1+b4 libxml2-16_2.15.1+dfsg-2+b1 libxnvctrl0_535.171.04-1+b3 libyaml-0-2_0.2.5-2+b1 libyaml-dev_0.2.5-2+b1 libzstd-dev_1.5.7+dfsg-3 libzstd1_1.5.7+dfsg-3 linux-libc-dev_6.18.5-1 login.defs_1:4.19.0-4 m4_1.4.20-2 make_4.4.1-3 man-db_2.13.1-1 mawk_1.3.4.20250131-2 media-types_14.0.0 mpi-default-bin_1.20 mpi-default-dev_1.20 ncurses-base_6.6+20251231-1 ncurses-bin_6.6+20251231-1 netbase_6.5 nettle-dev_3.10.2-1 ocl-icd-libopencl1_2.3.4-1 ocl-icd-opencl-dev_2.3.4-1 opencl-c-headers_3.0~2025.07.22-2 opencl-clhpp-headers_3.0~2025.07.22-1 openmpi-bin_5.0.9-1 openmpi-common_5.0.9-1 openssh-client_1:10.2p1-3 openssl-provider-legacy_3.5.4-1+b1 passwd_1:4.19.0-4 patch_2.8-2 patchelf_0.18.0-1.4 perl_5.40.1-7 perl-base_5.40.1-7 perl-modules-5.40_5.40.1-7 pkgconf_1.8.1-4+b1 pkgconf-bin_1.8.1-4+b1 po-debconf_1.0.22 python3_3.13.9-3 python3-click_8.2.0+0.really.8.1.8-1 python3-magic_2:0.4.27-3 python3-minimal_3.13.9-3 python3.13_3.13.11-1+b1 python3.13-minimal_3.13.11-1+b1 readline-common_8.3-3 rpcsvc-proto_1.4.3-1+b2 sed_4.9-2 sensible-utils_0.0.26 sysvinit-utils_3.15-6 tar_1.35+dfsg-3.1 tzdata_2025c-3 util-linux_2.41.3-3 
x11proto-dev_2024.1-1 xorg-sgml-doctools_1:1.11-1.1 xtrans-dev_1.6.0-1 xz-utils_5.8.2-2 zlib1g_1:1.3.dfsg+really1.3.1-1+b2 zlib1g-dev_1:1.3.dfsg+really1.3.1-1+b2 +------------------------------------------------------------------------------+ | Build Sat, 31 Jan 2026 13:03:14 +0000 | +------------------------------------------------------------------------------+ Unpack source ------------- -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 3.0 (quilt) Source: slepc Binary: slepc-dev, libslepc-real-dev, libslepc-complex-dev, libslepc-real3.24-dev, libslepc3.24-dev-examples, libslepc-real3.24, slepc3.24-doc, libslepc-complex3.24-dev, libslepc-complex3.24, slepc64-dev, libslepc64-real-dev, libslepc64-complex-dev, libslepc64-real3.24-dev, libslepc64-real3.24, libslepc64-complex3.24-dev, libslepc64-complex3.24 Architecture: any all Version: 3.24.2+dfsg1-1 Maintainer: Debian Science Maintainers Uploaders: "Adam C. Powell, IV" , Drew Parsons , Francesco Ballarin Homepage: http://slepc.upv.es/ Standards-Version: 4.7.3 Vcs-Browser: https://salsa.debian.org/science-team/slepc Vcs-Git: https://salsa.debian.org/science-team/slepc.git Testsuite: autopkgtest Testsuite-Triggers: @builddeps@ Build-Depends: dpkg-dev (>= 1.22.5), debhelper-compat (= 13), python3, pkgconf, dh-python, gfortran, chrpath, dh-fortran, libpetsc-real3.24-dev, libpetsc-complex3.24-dev, libpetsc3.24-dev-examples, libpetsc64-real3.24-dev, libpetsc64-complex3.24-dev, libarpack2-dev, libparpack2-dev Package-List: libslepc-complex-dev deb libdevel optional arch=any libslepc-complex3.24 deb libs optional arch=any libslepc-complex3.24-dev deb libdevel optional arch=any libslepc-real-dev deb libdevel optional arch=any libslepc-real3.24 deb libs optional arch=any libslepc-real3.24-dev deb libdevel optional arch=any libslepc3.24-dev-examples deb libdevel optional arch=all libslepc64-complex-dev deb libdevel optional arch=any libslepc64-complex3.24 deb libs optional arch=any libslepc64-complex3.24-dev deb libdevel optional arch=any libslepc64-real-dev deb libdevel optional arch=any libslepc64-real3.24 deb libs optional arch=any libslepc64-real3.24-dev deb libdevel optional arch=any slepc-dev deb libdevel optional arch=any slepc3.24-doc deb doc optional arch=all slepc64-dev deb libdevel optional arch=any Checksums-Sha1: ecf65d581bab7c1bfb30f212056b6c63c96b979d 23567516 slepc_3.24.2+dfsg1.orig.tar.xz a28ea46c5002e6f35ff78bef0f83ee8c06f87bce 21444 slepc_3.24.2+dfsg1-1.debian.tar.xz Checksums-Sha256: 8a89011a61d16fe68b092137c751c91c7c2944b5dfb9d79e0fe22043c360ce71 23567516 slepc_3.24.2+dfsg1.orig.tar.xz 381141cfdc38cdc79695642b4b3dac17f6f278662a8b48e21ba286c3649e906a 21444 slepc_3.24.2+dfsg1-1.debian.tar.xz Files: 4f81239d8a4d871cb3b01429dfc83433 23567516 slepc_3.24.2+dfsg1.orig.tar.xz f7fad975768871659f5d31c05edaa4c2 21444 slepc_3.24.2+dfsg1-1.debian.tar.xz -----BEGIN PGP SIGNATURE----- iQJIBAEBCgAyFiEEI8mpPlhYGekSbQo2Vz7x5L1aAfoFAml2ZKsUHGRwYXJzb25z QGRlYmlhbi5vcmcACgkQVz7x5L1aAfqbTBAAjXEVD9iRN3+kv6L7u7RV8YiCilfn C8UGoUgNqEV4t2FIDJ/D99ird6ZME+wsWkoLGSbTxPyKzuOJrov49dqI5iW4LB+b dapdSMPD9zg17xc6csnsa567+4SD57XW1PHQtUzkTpUG3+mj/cMjYUExNyCGYGRX mac2wcfc/AXcc5TN4AeIjmqdlwBfuPWYbME/DwFY+kE8lDk+BQW/ECuZuWxmeqdL iIglXviAIUZZZTK3hkG3cyYKwrzCymQ8hrsXHdtG5IBFJk0no3C19WGStGqW/Ziu +oYFR+DRH7USeIU3cU1pOA9TASSn1GmfYsoDVQIiRvGSwUXiFph4NKmWg4QP7Tib 4tcpFgIoSnV0IdbKbynLIejcHP8nz2mRxy6eHrEWX1oiW2//aTUDthTauVNtn2KY R5OOgHreYfpW5KKNTGkmuDZh9VitubT2A6c4p/Tml0AkoTMD9n+0hSqh3pEsFyCY fynjfjlKFcprzMDWmKxPCjksyUhkmqEnNafX7F6tAxlChPmxwnigdJ+4ol2cn3fj 
GwwWekXkIjdIE/EI0SgefaNU/YbXD5WvpQhZkDYJn4AGG1z+pMvZfNlifFowPKas iasc8Cp+CcJNbjHfYxaw58ouN47n+tMOLnk2Cv8mcPpTxIJxKU7BQ4aqrWRsNJ/4 7jKnCCxYXeL6GkU= =KWzi -----END PGP SIGNATURE----- dpkg-source: warning: cannot verify inline signature for ./slepc_3.24.2+dfsg1-1.dsc: missing OpenPGP keyrings dpkg-source: info: verifying ./slepc_3.24.2+dfsg1-1.dsc dpkg-source: info: skipping absent keyring /usr/share/keyrings/debian-keyring.pgp dpkg-source: info: skipping absent keyring /usr/share/keyrings/debian-tag2upload.pgp dpkg-source: info: skipping absent keyring /usr/share/keyrings/debian-nonupload.pgp dpkg-source: info: skipping absent keyring /usr/share/keyrings/debian-maintainers.pgp dpkg-source: info: extracting slepc in /build/reproducible-path/slepc-3.24.2+dfsg1 dpkg-source: info: unpacking slepc_3.24.2+dfsg1.orig.tar.xz dpkg-source: info: unpacking slepc_3.24.2+dfsg1-1.debian.tar.xz dpkg-source: info: using patch list from debian/patches/series dpkg-source: info: applying double_colon_patch dpkg-source: info: applying configure_python3.patch dpkg-source: info: applying build_suffix.patch dpkg-source: info: applying skip_test7f.patch dpkg-source: info: applying test_nox.patch dpkg-source: info: applying ignore_git_hidden_folder_in_config_slepc_py.patch Check disk space ---------------- Sufficient free space for build User Environment ---------------- APT_CONFIG=/var/lib/sbuild/apt.conf DEB_BUILD_OPTIONS=parallel=4 HOME=/sbuild-nonexistent LANG=C.UTF-8 LC_ALL=C.UTF-8 LC_COLLATE=C.UTF-8 LC_CTYPE=C.UTF-8 LOGNAME=sbuild PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games SHELL=/bin/sh SOURCE_DATE_EPOCH=1769366661 USER=sbuild dpkg-buildpackage ----------------- Command: dpkg-buildpackage --sanitize-env -us -uc -B dpkg-buildpackage: info: source package slepc dpkg-buildpackage: info: source version 3.24.2+dfsg1-1 dpkg-buildpackage: info: source distribution unstable dpkg-buildpackage: info: source changed by Drew Parsons dpkg-source --before-build . 
dpkg-buildpackage: info: host architecture riscv64 debian/rules clean dh clean --with python3,fortran_mod debian/rules override_dh_auto_clean make[1]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1' if [ -d installed-arch-linux2-c-opt ]; then \ dh_auto_clean -plibslepc-real3.24-dev -pslepc3.24-doc -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real; \ fi if [ -d installed-arch-linux2-c-opt-complex -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc-complex3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-complex PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex; \ fi if [ -d installed-arch-linux2-c-opt-64 -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc64-real3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-64 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real; \ fi if [ -d installed-arch-linux2-c-opt-complex-64 -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc64-complex3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-complex-64 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex; \ fi make[1]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1' debian/rules override_dh_clean make[1]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1' if [ -d installed-arch-linux2-c-opt ]; then \ dh_auto_clean -plibslepc-real3.24-dev -pslepc3.24-doc -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real; \ fi if [ -d installed-arch-linux2-c-opt-complex -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc-complex3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-complex PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex; \ fi if [ -d installed-arch-linux2-c-opt-64 -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc64-real3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-64 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real; \ fi if [ -d installed-arch-linux2-c-opt-complex-64 -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc64-complex3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-complex-64 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex; \ fi dh_clean rm -rf installed-* rm -f lib/slepc/conf/slepcvariables rm -f make.log configure.log find config -name *.pyc | xargs rm -f rm -rf installed-arch-linux2-c-opt installed-arch-linux2-c-opt-complex installed-arch-linux2-c-opt-64 installed-arch-linux2-c-opt-complex-64 make[1]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1' debian/rules binary-arch dh binary-arch --with python3,fortran_mod dh_update_autotools_config -a dh_autoreconf -a debian/rules override_dh_auto_configure 
make[1]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1' if PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real \ ./configure --prefix=/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-real \ --with-arpack=1 ; then \ : ; \ else \ err=$?; \ echo "real configure failed with exit value $err"; \ echo "===== show real configure.log ====="; \ cat installed-arch-linux2-c-opt/lib/slepc/conf/configure.log; \ echo "===== end real configure.log ====="; \ (exit $err); \ fi Checking environment... Generating Fortran bindings... done Checking PETSc installation... done Checking LAPACK library... done Checking SCALAPACK... done Checking ARPACK... done Writing various configuration files... done ================================================================================ SLEPc Configuration ================================================================================ SLEPc directory: /build/reproducible-path/slepc-3.24.2+dfsg1 SLEPc prefix directory: /usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-real PETSc directory: /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real Prefix install with double precision real numbers SCALAPACK from SCALAPACK linked by PETSc ARPACK library flags: -lparpack -larpack xxx==========================================================================xxx Configure stage complete. Now build the SLEPc library with: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real xxx==========================================================================xxx if PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex \ ./configure --prefix=/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-complex \ --with-arpack=1 ; then \ : ; \ else \ err=$?; \ echo "complex configure failed with exit value $err"; \ echo "===== show complex configure.log ====="; \ cat installed-arch-linux2-c-opt-complex/lib/slepc/conf/configure.log; \ echo "===== end complex configure.log ====="; \ (exit $err); \ fi Checking environment... Generating Fortran bindings... done Checking PETSc installation... done Checking LAPACK library... done Checking SCALAPACK... done Checking ARPACK... done Writing various configuration files... done ================================================================================ SLEPc Configuration ================================================================================ SLEPc directory: /build/reproducible-path/slepc-3.24.2+dfsg1 SLEPc prefix directory: /usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-complex PETSc directory: /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex Prefix install with double precision complex numbers SCALAPACK from SCALAPACK linked by PETSc ARPACK library flags: -lparpack -larpack xxx==========================================================================xxx Configure stage complete. 
Now build the SLEPc library with: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex xxx==========================================================================xxx if PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real \ ./configure --prefix=/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-real \ --build-suffix="64" ; then \ : ; \ else \ err=$?; \ echo "64-bit real configure failed with exit value $err"; \ echo "===== show 64-bit real configure.log ====="; \ cat installed-arch-linux2-c-opt-64/lib/slepc/conf/configure.log; \ echo "===== end 64-bit real configure.log ====="; \ (exit $err); \ fi Checking environment... Generating Fortran bindings... done Checking PETSc installation... done Checking LAPACK library... done Checking SCALAPACK... done Writing various configuration files... done ================================================================================ SLEPc Configuration ================================================================================ SLEPc directory: /build/reproducible-path/slepc-3.24.2+dfsg1 SLEPc prefix directory: /usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-real PETSc directory: /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real Prefix install with double precision real numbers SCALAPACK from SCALAPACK linked by PETSc xxx==========================================================================xxx Configure stage complete. Now build the SLEPc library with: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real xxx==========================================================================xxx if PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex \ ./configure --prefix=/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-complex \ --build-suffix="64" ; then \ : ; \ else \ err=$?; \ echo "64-bit complex configure failed with exit value $err"; \ echo "===== show 64-bit complex configure.log ====="; \ cat installed-arch-linux2-c-opt-complex-64/lib/slepc/conf/configure.log; \ echo "===== end 64-bit complex configure.log ====="; \ (exit $err); \ fi Checking environment... Generating Fortran bindings... done Checking PETSc installation... done Checking LAPACK library... done Checking SCALAPACK... done Writing various configuration files... done ================================================================================ SLEPc Configuration ================================================================================ SLEPc directory: /build/reproducible-path/slepc-3.24.2+dfsg1 SLEPc prefix directory: /usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-complex PETSc directory: /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex Prefix install with double precision complex numbers SCALAPACK from SCALAPACK linked by PETSc xxx==========================================================================xxx Configure stage complete. 
Now build the SLEPc library with: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex xxx==========================================================================xxx make[1]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1' debian/rules override_dh_auto_build make[1]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1' dh_auto_build -plibslepc-real3.24-dev -pslepc3.24-doc -- V=1 \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 \ PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt make -j4 V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.2\+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt make[2]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1' sed: -e expression #1, char 47: unknown option to `s' /usr/bin/bash: line 4: [: too many arguments make[4]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1' ========================================== Starting make run on sbuild at Sat, 31 Jan 2026 13:04:14 +0000 Machine characteristics: Linux sbuild 6.6.87-win2030 #2025.04.20.18.43+bb0c69aea SMP Sun Apr 20 18:58:14 UTC 2025 riscv64 GNU/Linux ----------------------------------------- Using SLEPc directory: /build/reproducible-path/slepc-3.24.2+dfsg1 Using PETSc directory: /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real Using PETSc arch: installed-arch-linux2-c-opt ----------------------------------------- SLEPC_VERSION_RELEASE 1 SLEPC_VERSION_MAJOR 3 SLEPC_VERSION_MINOR 24 SLEPC_VERSION_SUBMINOR 2 SLEPC_VERSION_DATE "Jan 20, 2026" SLEPC_VERSION_GIT "v3.24.2" SLEPC_VERSION_DATE_GIT "2026-01-20 12:27:56 +0100" ----------------------------------------- Using SLEPc configure options: --prefix=/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-real --with-arpack=1 Using SLEPc configuration flags: #define SLEPC_PETSC_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real" #define SLEPC_PETSC_ARCH "" #define SLEPC_DIR "/build/reproducible-path/slepc-3.24.2+dfsg1" #define SLEPC_LIB_DIR "/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-real/lib" #define SLEPC_HAVE_SCALAPACK 1 #define SLEPC_SCALAPACK_HAVE_UNDERSCORE 1 #define SLEPC_HAVE_ARPACK 1 #define SLEPC_ARPACK_HAVE_UNDERSCORE 1 #define SLEPC_HAVE_PACKAGES ":scalapack:arpack:" ----------------------------------------- PETSC_VERSION_RELEASE 1 PETSC_VERSION_MAJOR 3 PETSC_VERSION_MINOR 24 PETSC_VERSION_SUBMINOR 3 PETSC_VERSION_DATE "Jan 01, 2026" PETSC_VERSION_GIT "v3.24.3" PETSC_VERSION_DATE_GIT "2026-01-01 17:01:02 -0600" ----------------------------------------- Using PETSc configure options: --build=riscv64-linux-gnu --prefix=/usr --includedir=/include --mandir=/share/man --infodir=/share/info --sysconfdir=/etc --localstatedir=/var --with-option-checking=0 --with-silent-rules=0 --libdir=/lib/riscv64-linux-gnu --runstatedir=/run --with-maintainer-mode=0 --with-dependency-tracking=0 --with-debugging=0 --with-library-name-suffix=_real --with-shared-libraries --with-pic=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cxx-dialect=C++11 --with-opencl=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi --with-fftw=1 --with-fftw-include="[]" --with-fftw-lib="-lfftw3 -lfftw3_mpi" --with-yaml=1 --with-valgrind=1 --with-hdf5-include=/usr/include/hdf5/openmpi --with-hdf5-lib="-L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -lhdf5 
-L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi " --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch --with-ptscotch-lib="-lptesmumps -lptscotch -lscotch -lptscotcherr" --with-hypre=1 --with-hypre-include=/usr/include/hypre --with-hypre-lib=-lHYPRE --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord" --with-suitesparse=1 --with-suitesparse-include=/usr/include/suitesparse --with-suitesparse-lib="-lspqr -lumfpack -lamd -lcholmod -lklu" --with-superlu=1 --with-superlu-include=/usr/include/superlu --with-superlu-lib=-lsuperlu --with-superlu_dist=1 --with-superlu_dist-include=/usr/include/superlu-dist --with-superlu_dist-lib=-lsuperlu_dist --prefix=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real --PETSC_ARCH=riscv64-linux-gnu-real CFLAGS="-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" CXXFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" FCFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" FFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2" LDFLAGS="-Wl,-z,relro -fPIC" MAKEFLAGS= Using PETSc configuration flags: #define PETSC_ARCH "" #define PETSC_ATTRIBUTEALIGNED(size) __attribute((aligned(size))) #define PETSC_BLASLAPACK_UNDERSCORE 1 #define PETSC_CLANGUAGE_C 1 #define PETSC_CXX_RESTRICT __restrict #define PETSC_DEPRECATED_ENUM_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_FUNCTION_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_MACRO_BASE(string_literal_why) PETSC_DEPRECATED_MACRO_BASE_(GCC warning string_literal_why) #define PETSC_DEPRECATED_MACRO_BASE_(why) _Pragma(#why) #define PETSC_DEPRECATED_OBJECT_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_TYPEDEF_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real" #define PETSC_DIR_SEPARATOR '/' #define PETSC_FORTRAN_CHARLEN_T size_t #define PETSC_FUNCTION_NAME_C __func__ #define PETSC_FUNCTION_NAME_CXX __func__ #define PETSC_HAVE_ACCESS 1 #define PETSC_HAVE_ATOLL 1 #define PETSC_HAVE_ATTRIBUTEALIGNED 1 #define PETSC_HAVE_BUILTIN_EXPECT 1 #define PETSC_HAVE_BZERO 1 #define PETSC_HAVE_C99_COMPLEX 1 #define PETSC_HAVE_CLOCK 1 #define PETSC_HAVE_CXX 1 #define PETSC_HAVE_CXXABI_H 1 #define PETSC_HAVE_CXX_ATOMIC 1 #define PETSC_HAVE_CXX_COMPLEX 1 #define PETSC_HAVE_CXX_COMPLEX_FIX 1 #define PETSC_HAVE_CXX_DIALECT_CXX11 1 #define PETSC_HAVE_DLADDR 1 #define PETSC_HAVE_DLCLOSE 1 #define PETSC_HAVE_DLERROR 1 #define PETSC_HAVE_DLFCN_H 1 #define PETSC_HAVE_DLOPEN 1 #define PETSC_HAVE_DLSYM 1 #define PETSC_HAVE_DOUBLE_ALIGN_MALLOC 1 #define PETSC_HAVE_DRAND48 1 #define PETSC_HAVE_DYNAMIC_LIBRARIES 1 #define PETSC_HAVE_ERF 1 #define PETSC_HAVE_EXECUTABLE_EXPORT 1 #define PETSC_HAVE_FCNTL_H 1 #define PETSC_HAVE_FENV_H 1 #define PETSC_HAVE_FE_VALUES 1 #define PETSC_HAVE_FFTW 1 #define PETSC_HAVE_FLOAT_H 1 #define PETSC_HAVE_FORK 1 #define 
PETSC_HAVE_FORTRAN_FLUSH 1 #define PETSC_HAVE_FORTRAN_FREE_LINE_LENGTH_NONE 1 #define PETSC_HAVE_FORTRAN_TYPE_STAR 1 #define PETSC_HAVE_FORTRAN_UNDERSCORE 1 #define PETSC_HAVE_GETCWD 1 #define PETSC_HAVE_GETDOMAINNAME 1 #define PETSC_HAVE_GETHOSTBYNAME 1 #define PETSC_HAVE_GETHOSTNAME 1 #define PETSC_HAVE_GETPAGESIZE 1 #define PETSC_HAVE_GETRUSAGE 1 #define PETSC_HAVE_HDF5 1 #define PETSC_HAVE_HYPRE 1 #define PETSC_HAVE_INTTYPES_H 1 #define PETSC_HAVE_ISINF 1 #define PETSC_HAVE_ISNAN 1 #define PETSC_HAVE_ISNORMAL 1 #define PETSC_HAVE_LGAMMA 1 #define PETSC_HAVE_LINUX 1 #define PETSC_HAVE_LOG2 1 #define PETSC_HAVE_LSEEK 1 #define PETSC_HAVE_MALLOC_H 1 #define PETSC_HAVE_MEMMOVE 1 #define PETSC_HAVE_MKSTEMP 1 #define PETSC_HAVE_MPIEXEC_ENVIRONMENTAL_VARIABLE OMP #define PETSC_HAVE_MPIIO 1 #define PETSC_HAVE_MPI_COMBINER_CONTIGUOUS 1 #define PETSC_HAVE_MPI_COMBINER_DUP 1 #define PETSC_HAVE_MPI_COMBINER_NAMED 1 #define PETSC_HAVE_MPI_COUNT 1 #define PETSC_HAVE_MPI_F90MODULE 1 #define PETSC_HAVE_MPI_F90MODULE_VISIBILITY 1 #define PETSC_HAVE_MPI_FEATURE_DYNAMIC_WINDOW 1 #define PETSC_HAVE_MPI_GET_ACCUMULATE 1 #define PETSC_HAVE_MPI_GET_LIBRARY_VERSION 1 #define PETSC_HAVE_MPI_INIT_THREAD 1 #define PETSC_HAVE_MPI_INT64_T 1 #define PETSC_HAVE_MPI_LONG_DOUBLE 1 #define PETSC_HAVE_MPI_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_NONBLOCKING_COLLECTIVES 1 #define PETSC_HAVE_MPI_ONE_SIDED 1 #define PETSC_HAVE_MPI_PERSISTENT_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_PROCESS_SHARED_MEMORY 1 #define PETSC_HAVE_MPI_REDUCE_LOCAL 1 #define PETSC_HAVE_MPI_REDUCE_SCATTER_BLOCK 1 #define PETSC_HAVE_MPI_RGET 1 #define PETSC_HAVE_MPI_WIN_CREATE 1 #define PETSC_HAVE_MUMPS 1 #define PETSC_HAVE_NANOSLEEP 1 #define PETSC_HAVE_NETDB_H 1 #define PETSC_HAVE_NETINET_IN_H 1 #define PETSC_HAVE_NO_FINITE_MATH_ONLY 1 #define PETSC_HAVE_OPENCL 1 #define PETSC_HAVE_OPENMPI 1 #define PETSC_HAVE_PACKAGES ":amd:blaslapack:cholmod:fftw3:hdf5:hypre:klu:mathlib:mpi:mumps:opencl:pthread:ptscotch:regex:scalapack:spqr:superlu:superlu_dist:umfpack:x11:yaml:" #define PETSC_HAVE_POPEN 1 #define PETSC_HAVE_POSIX_MEMALIGN 1 #define PETSC_HAVE_PTHREAD 1 #define PETSC_HAVE_PTHREAD_MUTEX 1 #define PETSC_HAVE_PTSCOTCH 1 #define PETSC_HAVE_PWD_H 1 #define PETSC_HAVE_RAND 1 #define PETSC_HAVE_READLINK 1 #define PETSC_HAVE_REALPATH 1 #define PETSC_HAVE_REGEX 1 #define PETSC_HAVE_RTLD_DEFAULT 1 #define PETSC_HAVE_RTLD_GLOBAL 1 #define PETSC_HAVE_RTLD_LAZY 1 #define PETSC_HAVE_RTLD_LOCAL 1 #define PETSC_HAVE_RTLD_NOW 1 #define PETSC_HAVE_SCALAPACK 1 #define PETSC_HAVE_SETJMP_H 1 #define PETSC_HAVE_SHMGET 1 #define PETSC_HAVE_SLEEP 1 #define PETSC_HAVE_SNPRINTF 1 #define PETSC_HAVE_SOCKET 1 #define PETSC_HAVE_SO_REUSEADDR 1 #define PETSC_HAVE_STDATOMIC_H 1 #define PETSC_HAVE_STDINT_H 1 #define PETSC_HAVE_STRCASECMP 1 #define PETSC_HAVE_STRINGS_H 1 #define PETSC_HAVE_STRUCT_SIGACTION 1 #define PETSC_HAVE_SUITESPARSE 1 #define PETSC_HAVE_SUPERLU 1 #define PETSC_HAVE_SUPERLU_DIST 1 #define PETSC_HAVE_SUPERLU_DIST_SINGLE 1 #define PETSC_HAVE_SYS_PARAM_H 1 #define PETSC_HAVE_SYS_PROCFS_H 1 #define PETSC_HAVE_SYS_RESOURCE_H 1 #define PETSC_HAVE_SYS_SOCKET_H 1 #define PETSC_HAVE_SYS_TIMES_H 1 #define PETSC_HAVE_SYS_TIME_H 1 #define PETSC_HAVE_SYS_TYPES_H 1 #define PETSC_HAVE_SYS_UTSNAME_H 1 #define PETSC_HAVE_SYS_WAIT_H 1 #define PETSC_HAVE_TAU_PERFSTUBS 1 #define PETSC_HAVE_TGAMMA 1 #define PETSC_HAVE_TIME 1 #define PETSC_HAVE_TIME_H 1 #define PETSC_HAVE_UNAME 1 #define PETSC_HAVE_UNISTD_H 1 #define PETSC_HAVE_USLEEP 1 #define 
PETSC_HAVE_VA_COPY 1 #define PETSC_HAVE_VSNPRINTF 1 #define PETSC_HAVE_X 1 #define PETSC_HAVE_YAML 1 #define PETSC_HDF5_HAVE_PARALLEL 1 #define PETSC_HDF5_HAVE_SZLIB 1 #define PETSC_HDF5_HAVE_ZLIB 1 #define PETSC_INTPTR_T intptr_t #define PETSC_INTPTR_T_FMT "#" PRIxPTR #define PETSC_IS_COLORING_MAX USHRT_MAX #define PETSC_IS_COLORING_VALUE_TYPE short #define PETSC_IS_COLORING_VALUE_TYPE_F integer2 #define PETSC_LEVEL1_DCACHE_LINESIZE 64 #define PETSC_LIB_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib" #define PETSC_LIB_NAME_SUFFIX "_real" #define PETSC_MAX_PATH_LEN 4096 #define PETSC_MEMALIGN 16 #define PETSC_MPICC_SHOW "gcc -I/usr/lib/riscv64-linux-gnu/openmpi/include -I/usr/lib/riscv64-linux-gnu/openmpi/include/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi" #define PETSC_MPIU_IS_COLORING_VALUE_TYPE MPI_UNSIGNED_SHORT #define PETSC_OMAKE "/usr/bin/make --no-print-directory" #define PETSC_PREFETCH_HINT_NTA 0 #define PETSC_PREFETCH_HINT_T0 3 #define PETSC_PREFETCH_HINT_T1 2 #define PETSC_PREFETCH_HINT_T2 1 #define PETSC_PYTHON_EXE "/usr/bin/python3" #define PETSC_Prefetch(a,b,c) __builtin_prefetch((a),(b),(c)) #define PETSC_REPLACE_DIR_SEPARATOR '\\' #define PETSC_SIGNAL_CAST #define PETSC_SIZEOF_INT 4 #define PETSC_SIZEOF_LONG 8 #define PETSC_SIZEOF_LONG_LONG 8 #define PETSC_SIZEOF_SIZE_T 8 #define PETSC_SIZEOF_VOID_P 8 #define PETSC_SLSUFFIX "so" #define PETSC_UINTPTR_T uintptr_t #define PETSC_UINTPTR_T_FMT "#" PRIxPTR #define PETSC_UNUSED __attribute((unused)) #define PETSC_USE_AVX512_KERNELS 1 #define PETSC_USE_CTABLE 1 #define PETSC_USE_DEBUGGER "gdb" #define PETSC_USE_DMLANDAU_2D 1 #define PETSC_USE_FORTRAN_BINDINGS 1 #define PETSC_USE_INFO 1 #define PETSC_USE_ISATTY 1 #define PETSC_USE_LOG 1 #define PETSC_USE_MALLOC_COALESCED 1 #define PETSC_USE_PROC_FOR_SIZE 1 #define PETSC_USE_REAL_DOUBLE 1 #define PETSC_USE_SHARED_LIBRARIES 1 #define PETSC_USE_SINGLE_LIBRARY 1 #define PETSC_USE_SOCKET_VIEWER 1 #define PETSC_USE_VISIBILITY_C 1 #define PETSC_USE_VISIBILITY_CXX 1 #define PETSC_USING_64BIT_PTR 1 #define PETSC_USING_F2003 1 #define PETSC_USING_F90FREEFORM 1 #define PETSC__BSD_SOURCE 1 #define PETSC__DEFAULT_SOURCE 1 #define PETSC__GNU_SOURCE 1 ----------------------------------------- Using C/C++ include paths: -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Using C compile: mpicc -o gmakeinfo -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC C compiler version: gcc (Debian 15.2.0-12) 15.2.0 Using C++ compile: mpicxx -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -std=c++11 -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi C++ compiler version: g++ (Debian 15.2.0-12) 15.2.0 Using Fortran include/module paths: -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Using Fortran compile: mpif90 -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Fortran compiler version: GNU Fortran (Debian 15.2.0-12) 15.2.0 ----------------------------------------- Using C/C++ linker: mpicc Using C/C++ flags: -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC Using Fortran linker: mpif90 Using Fortran flags: -Wl,-z,relro -fPIC -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 ----------------------------------------- Using libraries: -L/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/lib -L/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/lib -lslepc_real -lparpack -larpack -L/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc_real -lHYPRE -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord -lscalapack-openmpi -lsuperlu_dist -lsuperlu -llapack -lblas -lptesmumps -lptscotch -lscotch -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++ ------------------------------------------ Using mpiexec: /usr/bin/mpiexec --oversubscribe ------------------------------------------ Using MAKE: /usr/bin/make Default MAKEFLAGS: MAKE_NP:4 MAKE_LOAD:4.0 MAKEFLAGS: -j4 --jobserver-auth=fifo:/tmp/GMfifo1972 --no-print-directory -- V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt ========================================== /usr/bin/make --print-directory -f gmakefile -l4.0 --output-sync=recurse V=1 slepc_libs make[5]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1' /usr/bin/python3 /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/share/petsc/examples/config/gmakegen.py --petsc-arch= --pkg-dir=/build/reproducible-path/slepc-3.24.2+dfsg1 --pkg-name=slepc --pkg-pkgs=sys,eps,svd,pep,nep,mfn,lme --pkg-arch=installed-arch-linux2-c-opt mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvbiorthogf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvbiorthogf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/impls/tensor/bvtensorf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/impls/tensor/bvtensorf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvcontourf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvcontourf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvfuncf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvglobalf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvglobalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvkrylovf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvkrylovf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvopsf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvorthogf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvorthogf.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/ftn-mod/slepcsysmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/ftn-mod/slepcsysmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/impls/hsvd/dshsvdf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/impls/hsvd/dshsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/impls/svd/dssvdf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/impls/svd/dssvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/impls/pep/dspepf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/impls/pep/dspepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/impls/nep/dsnepf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/impls/nep/dsnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/interface/dsprivf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/interface/dsprivf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/interface/dsbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/interface/dsbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/fn/impls/phi/fnphif.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/fn/impls/phi/fnphif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/fn/impls/combine/fncombinef.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/fn/impls/combine/fncombinef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/interface/dsopsf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/interface/dsopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/fn/impls/rational/fnrationalf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/fn/impls/rational/fnrationalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/rg/impls/ellipse/rgellipsef.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/rg/impls/ellipse/rgellipsef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/rg/impls/interval/rgintervalf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/rg/impls/interval/rgintervalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/fn/interface/fnbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/fn/interface/fnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/rg/impls/polygon/rgpolygonf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/rg/impls/polygon/rgpolygonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/rg/impls/ring/rgringf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/rg/impls/ring/rgringf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/impls/precond/precondf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/impls/precond/precondf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/impls/cayley/cayleyf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/impls/cayley/cayleyf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/rg/interface/rgbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/rg/interface/rgbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/impls/filter/filterf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/impls/filter/filterf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/impls/shell/shellf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/impls/shell/shellf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/interface/stslesf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/interface/stslesf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/interface/stfuncf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/interface/stfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/finitf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/finitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/mat/matstructf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/mat/matstructf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/interface/stsolvef.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/interface/stsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/interface/stsetf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/interface/stsetf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/mat/matutilf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/mat/matutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/slepcinitf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/slepcinitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/slepcutilf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/slepcutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/vec/veccompf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/vec/veccompf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/slepcscf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/slepcscf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/sys/vec/vecutilf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/vec/vecutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/contiguous/contig.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/impls/contiguous/contig.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/mat/bvmat.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/impls/mat/bvmat.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/svec/svec.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/impls/svec/svec.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/vecs/vecs.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/impls/vecs/vecs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvbiorthog.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvbiorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/tensor/bvtensor.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/impls/tensor/bvtensor.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvblas.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvblas.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvbasic.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvcontour.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvcontour.o
The same mpicc invocation, with identical options and include paths, is repeated for each of the following source files (paths relative to /build/reproducible-path/slepc-3.24.2+dfsg1/), each compiled to the corresponding object under installed-arch-linux2-c-opt/obj/ (obj/src/... for sources under src/, obj/ftn/... for sources under installed-arch-linux2-c-opt/ftn/):
src/sys/classes/bv/interface/bvfunc.c
src/sys/classes/bv/interface/bvkrylov.c
src/sys/classes/bv/interface/bvglobal.c
src/sys/classes/bv/interface/bvregis.c
src/sys/classes/bv/interface/bvlapack.c
src/sys/classes/bv/interface/bvops.c
src/sys/classes/bv/interface/bvorthog.c
src/sys/classes/ds/impls/dsutil.c
src/sys/classes/ds/impls/ghep/dsghep.c
src/sys/classes/ds/impls/ghiep/hz.c
src/sys/classes/ds/impls/ghiep/invit.c
src/sys/classes/ds/impls/ghiep/dsghiep.c
src/sys/classes/ds/impls/gnhep/dsgnhep.c
src/sys/classes/ds/impls/hep/bdc/dlaed3m.c
src/sys/classes/ds/impls/hep/bdc/dmerg2.c
src/sys/classes/ds/impls/hep/bdc/dibtdc.c
src/sys/classes/ds/impls/hep/bdc/dsbtdc.c
src/sys/classes/ds/impls/gsvd/dsgsvd.c
src/sys/classes/ds/impls/hep/bdc/dsrtdf.c
src/sys/classes/ds/impls/hsvd/dshsvd.c
src/sys/classes/ds/impls/nep/dsnep.c
src/sys/classes/ds/impls/nhep/dsnhep.c
src/sys/classes/ds/impls/hep/dshep.c
src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.c
src/sys/classes/ds/impls/nhepts/dsnhepts.c
src/sys/classes/ds/impls/pep/dspep.c
src/sys/classes/ds/interface/dsbasic.c
src/sys/classes/ds/impls/svd/dssvd.c
src/sys/classes/ds/interface/dsops.c
src/sys/classes/fn/impls/combine/fncombine.c
src/sys/classes/ds/interface/dspriv.c
src/sys/classes/fn/impls/invsqrt/fninvsqrt.c
src/sys/classes/fn/impls/fnutil.c
src/sys/classes/fn/impls/phi/fnphi.c
src/sys/classes/fn/impls/log/fnlog.c
src/sys/classes/fn/impls/rational/ftn-custom/zrational.c
src/sys/classes/fn/impls/exp/fnexp.c
src/sys/classes/fn/impls/sqrt/fnsqrt.c
src/sys/classes/fn/impls/rational/fnrational.c
src/sys/classes/fn/interface/fnregis.c
src/sys/classes/rg/impls/ellipse/rgellipse.c
src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.c
src/sys/classes/rg/impls/interval/rginterval.c
src/sys/classes/rg/impls/polygon/rgpolygon.c
src/sys/classes/fn/interface/fnbasic.c
src/sys/classes/rg/interface/rgbasic.c
src/sys/classes/rg/interface/rgregis.c
src/sys/classes/rg/impls/ring/rgring.c
src/sys/classes/st/impls/cayley/cayley.c
src/sys/classes/st/impls/filter/chebyshev.c
src/sys/classes/st/impls/precond/precond.c
src/sys/classes/st/impls/shell/ftn-custom/zshell.c
src/sys/classes/st/impls/filter/filter.c
src/sys/classes/st/impls/shell/shell.c
src/sys/classes/st/impls/shift/shift.c
src/sys/classes/st/interface/stregis.c
src/sys/classes/st/impls/sinvert/sinvert.c
src/sys/classes/st/interface/stfunc.c
src/sys/classes/st/interface/stset.c
src/sys/classes/st/interface/stshellmat.c
src/sys/classes/st/impls/filter/filtlan.c
src/sys/finit.c
src/sys/classes/st/interface/stsles.c
src/sys/dlregisslepc.c
src/sys/ftn-custom/zstart.c
src/sys/mat/matstruct.c
src/sys/classes/st/interface/stsolve.c
src/sys/slepccontour.c
src/sys/slepcutil.c
src/sys/slepcinit.c
src/sys/mat/matutil.c
src/sys/slepcsc.c
src/sys/vec/pool.c
installed-arch-linux2-c-opt/ftn/eps/impls/cg/lobpcg/lobpcgf.c
src/sys/vec/vecutil.c
installed-arch-linux2-c-opt/ftn/eps/impls/cg/rqcg/rqcgf.c
installed-arch-linux2-c-opt/ftn/eps/impls/ciss/cissf.c
installed-arch-linux2-c-opt/ftn/eps/impls/krylov/arnoldi/arnoldif.c
installed-arch-linux2-c-opt/ftn/eps/impls/davidson/gd/gdf.c
installed-arch-linux2-c-opt/ftn/eps/impls/davidson/jd/jdf.c
installed-arch-linux2-c-opt/ftn/eps/impls/krylov/lanczos/lanczosf.c
installed-arch-linux2-c-opt/ftn/eps/impls/krylov/krylovschur/krylovschurf.c
src/sys/vec/veccomp.c
installed-arch-linux2-c-opt/ftn/eps/impls/lyapii/lyapiif.c
installed-arch-linux2-c-opt/ftn/eps/impls/power/powerf.c
installed-arch-linux2-c-opt/ftn/eps/interface/dlregisepsf.c
installed-arch-linux2-c-opt/ftn/eps/interface/epsdefaultf.c
installed-arch-linux2-c-opt/ftn/eps/interface/epsbasicf.c
installed-arch-linux2-c-opt/ftn/eps/interface/epsmonf.c
installed-arch-linux2-c-opt/ftn/eps/interface/epssetupf.c
installed-arch-linux2-c-opt/ftn/eps/interface/epsoptsf.c
installed-arch-linux2-c-opt/ftn/eps/interface/epssolvef.c
installed-arch-linux2-c-opt/ftn/eps/interface/epsviewf.c
src/eps/impls/davidson/davidson.c
src/eps/impls/cg/rqcg/rqcg.c
src/eps/impls/cg/lobpcg/lobpcg.c
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdgd2.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdgd2.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdinitv.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdinitv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdcalcpairs.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdcalcpairs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/ciss/ciss.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/ciss/ciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdtestconv.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdtestconv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdschm.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdschm.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdutils.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdutils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/gd/gd.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/gd/gd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdimprovex.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdimprovex.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdupdatev.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdupdatev.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/jd/jd.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/jd/jd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/external/arpack/arpack.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/external/arpack/arpack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/external/scalapack/scalapack.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/external/scalapack/scalapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/arnoldi/arnoldi.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/arnoldi/arnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/epskrylov.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/epskrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-indef.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ks-indef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/krylovschur.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/krylovschur.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-hamilt.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ks-hamilt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-bse.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ks-bse.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-twosided.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ks-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/lapack/lapack.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/lapack/lapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/lanczos/lanczos.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/lanczos/lanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/lyapii/lyapii.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/lyapii/lyapii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/subspace/subspace.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/subspace/subspace.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-slice.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ks-slice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/dlregiseps.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/dlregiseps.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsbasic.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsdefault.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/power/power.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/power/power.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsregis.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsmon.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsopts.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epssetup.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epssetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/impls/cross/crossf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/impls/cross/crossf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/ftn-custom/zepsf.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/ftn-custom/zepsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsview.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/impls/lanczos/gklanczosf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/impls/lanczos/gklanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/impls/cyclic/cyclicf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/impls/cyclic/cyclicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epssolve.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epssolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/dlregissvdf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/dlregissvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/impls/trlanczos/trlanczosf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/impls/trlanczos/trlanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svddefaultf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svddefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdsetupf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdmonf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdoptsf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdsolvef.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdviewf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/external/scalapack/svdscalap.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/external/scalapack/svdscalap.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/cross/cross.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/cross/cross.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/lapack/svdlapack.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/lapack/svdlapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/lanczos/gklanczos.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/lanczos/gklanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/randomized/rsvd.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/randomized/rsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/dlregissvd.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/dlregissvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/cyclic/cyclic.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/cyclic/cyclic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/ftn-custom/zsvdf.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/ftn-custom/zsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdbasic.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svddefault.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svddefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdregis.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdopts.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdmon.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdsetup.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdsolve.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdview.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/krylov/qarnoldi/qarnoldif.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/krylov/qarnoldi/qarnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/jd/pjdf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/jd/pjdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/krylov/stoar/qslicef.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/krylov/stoar/qslicef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/krylov/toar/ptoarf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/krylov/toar/ptoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/krylov/stoar/stoarf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/krylov/stoar/stoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/dlregispepf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/dlregispepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/linear/linearf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/linear/linearf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepdefaultf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/trlanczos/trlanczos.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/trlanczos/trlanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepsetupf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepmonf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepoptsf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepsolvef.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepviewf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/pepkrylov.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/pepkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/qarnoldi/qarnoldi.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/qarnoldi/qarnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/stoar.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/stoar/stoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/qslice.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/stoar/qslice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/toar/ptoar.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/toar/ptoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/jd/pjd.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/jd/pjd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/peputils.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/peputils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/linear/qeplin.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/linear/qeplin.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/dlregispep.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/dlregispep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/ftn-custom/zpepf.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/ftn-custom/zpepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/toar/nrefine.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/toar/nrefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepbasic.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/linear/linear.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/linear/linear.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepmon.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepdefault.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepregis.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepsolve.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepsetup.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepopts.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/narnoldi/narnoldif.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/narnoldi/narnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/interpol/interpolf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/interpol/interpolf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/nleigs/nleigs-fullbf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/nleigs/nleigs-fullbf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/nleigs/nleigsf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/nleigs/nleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/peprefine.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/peprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/rii/riif.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/rii/riif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/slp/slpf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/slp/slpf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepview.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/dlregisnepf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/dlregisnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepdefaultf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepmonf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepoptsf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepresolvf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepresolvf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepsetupf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepsolvef.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepviewf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/ftn-custom/znleigsf.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/nleigs/ftn-custom/znleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/interpol/interpol.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/interpol/interpol.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/narnoldi/narnoldi.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/narnoldi/narnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/nleigs-fullb.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/nleigs/nleigs-fullb.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/rii/rii.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/rii/rii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nepdefl.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/nepdefl.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/slp/slp.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/slp/slp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/slp/slp-twosided.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/slp/slp-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/dlregisnep.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/dlregisnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/ftn-custom/znepf.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/ftn-custom/znepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepdefault.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepbasic.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepmon.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/nleigs.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/nleigs/nleigs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepregis.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepopts.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepresolv.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepresolv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepsetup.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepsolve.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/neprefine.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/neprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/dlregismfnf.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/dlregismfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/mfnbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/mfnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/mfnmonf.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/mfnmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepview.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/mfnoptsf.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/mfnoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/mfnsolvef.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/mfnsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/mfnsetupf.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/mfnsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/impls/krylov/mfnkrylov.c -o installed-arch-linux2-c-opt/obj/src/mfn/impls/krylov/mfnkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/impls/expokit/mfnexpokit.c -o installed-arch-linux2-c-opt/obj/src/mfn/impls/expokit/mfnexpokit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/ftn-custom/zmfnf.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/ftn-custom/zmfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/dlregismfn.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/dlregismfn.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnbasic.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnmon.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnopts.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnregis.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnsetup.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/dlregislmef.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/dlregislmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnsolve.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmedensef.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmedensef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmebasicf.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmebasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmemonf.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmemonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmesolvef.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmesolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmesetupf.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmesetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmeoptsf.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmeoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/ftn-custom/zlmef.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/ftn-custom/zlmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/dlregislme.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/dlregislme.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/impls/krylov/lmekrylov.c -o installed-arch-linux2-c-opt/obj/src/lme/impls/krylov/lmekrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmebasic.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmebasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmemon.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmemon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmeregis.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmeregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmedense.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmedense.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmeopts.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmeopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmesetup.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmesetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmesolve.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmesolve.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/ftn-mod/slepcbvmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/ftn-mod/slepcbvmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/ftn-mod/slepcfnmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/ftn-mod/slepcfnmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/ftn-mod/slepcrgmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/classes/rg/ftn-mod/slepcrgmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/ftn-mod/slepclmemod.F90 -o installed-arch-linux2-c-opt/obj/src/lme/ftn-mod/slepclmemod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/ftn-mod/slepcdsmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/ftn-mod/slepcdsmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/ftn-mod/slepcmfnmod.F90 -o installed-arch-linux2-c-opt/obj/src/mfn/ftn-mod/slepcmfnmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/ftn-mod/slepcstmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/ftn-mod/slepcstmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/ftn-mod/slepcepsmod.F90 -o installed-arch-linux2-c-opt/obj/src/eps/ftn-mod/slepcepsmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/ftn-mod/slepcpepmod.F90 -o installed-arch-linux2-c-opt/obj/src/pep/ftn-mod/slepcpepmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/ftn-mod/slepcsvdmod.F90 -o installed-arch-linux2-c-opt/obj/src/svd/ftn-mod/slepcsvdmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/ftn-mod/slepcnepmod.F90 -o installed-arch-linux2-c-opt/obj/src/nep/ftn-mod/slepcnepmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/include mpicc -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-z,relro -fPIC -shared -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-soname,libslepc_real.so.3.24 -o installed-arch-linux2-c-opt/lib/libslepc_real.so.3.24.2 @installed-arch-linux2-c-opt/lib/libslepc_real.so.3.24.2.args -lparpack -larpack -L/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc_real -lHYPRE -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord -lscalapack-openmpi -lsuperlu_dist -lsuperlu -llapack -lblas -lptesmumps -lptscotch -lscotch -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++ make[5]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1' make[4]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1' ========================================= Now to install the library do: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real install ========================================= make[2]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1' dh_auto_build -plibslepc-complex3.24-dev -- V=1 \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 \ PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex make -j4 V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.2\+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex make[2]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1' sed: -e expression #1, char 47: unknown option to `s' /usr/bin/bash: line 4: [: too many arguments make[4]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1' ========================================== Starting make run on sbuild at Sat, 31 Jan 2026 13:08:58 +0000 Machine characteristics: Linux sbuild 6.6.87-win2030 #2025.04.20.18.43+bb0c69aea SMP Sun Apr 20 18:58:14 UTC 2025 riscv64 GNU/Linux ----------------------------------------- Using SLEPc directory: /build/reproducible-path/slepc-3.24.2+dfsg1 Using PETSc directory: /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex Using PETSc arch: installed-arch-linux2-c-opt-complex ----------------------------------------- SLEPC_VERSION_RELEASE 1 SLEPC_VERSION_MAJOR 3 
SLEPC_VERSION_MINOR 24 SLEPC_VERSION_SUBMINOR 2 SLEPC_VERSION_DATE "Jan 20, 2026" SLEPC_VERSION_GIT "v3.24.2" SLEPC_VERSION_DATE_GIT "2026-01-20 12:27:56 +0100" ----------------------------------------- Using SLEPc configure options: --prefix=/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-complex --with-arpack=1 Using SLEPc configuration flags: #define SLEPC_PETSC_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex" #define SLEPC_PETSC_ARCH "" #define SLEPC_DIR "/build/reproducible-path/slepc-3.24.2+dfsg1" #define SLEPC_LIB_DIR "/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-complex/lib" #define SLEPC_HAVE_SCALAPACK 1 #define SLEPC_SCALAPACK_HAVE_UNDERSCORE 1 #define SLEPC_HAVE_ARPACK 1 #define SLEPC_ARPACK_HAVE_UNDERSCORE 1 #define SLEPC_HAVE_PACKAGES ":scalapack:arpack:" ----------------------------------------- PETSC_VERSION_RELEASE 1 PETSC_VERSION_MAJOR 3 PETSC_VERSION_MINOR 24 PETSC_VERSION_SUBMINOR 3 PETSC_VERSION_DATE "Jan 01, 2026" PETSC_VERSION_GIT "v3.24.3" PETSC_VERSION_DATE_GIT "2026-01-01 17:01:02 -0600" ----------------------------------------- Using PETSc configure options: --build=riscv64-linux-gnu --prefix=/usr --includedir=/include --mandir=/share/man --infodir=/share/info --sysconfdir=/etc --localstatedir=/var --with-option-checking=0 --with-silent-rules=0 --libdir=/lib/riscv64-linux-gnu --runstatedir=/run --with-maintainer-mode=0 --with-dependency-tracking=0 --with-debugging=0 --with-scalar-type=complex --with-library-name-suffix=_complex --with-shared-libraries --with-pic=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cxx-dialect=C++11 --with-opencl=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi --with-fftw=1 --with-fftw-include="[]" --with-fftw-lib="-lfftw3 -lfftw3_mpi" --with-yaml=1 --with-valgrind=1 --with-hdf5-include=/usr/include/hdf5/openmpi --with-hdf5-lib="-L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -lhdf5 -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi " --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch --with-ptscotch-lib="-lptesmumps -lptscotch -lscotch -lptscotcherr" --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord" --with-suitesparse=1 --with-suitesparse-include=/usr/include/suitesparse --with-suitesparse-lib="-lspqr -lumfpack -lamd -lcholmod -lklu" --with-superlu=1 --with-superlu-include=/usr/include/superlu --with-superlu-lib=-lsuperlu --with-superlu_dist=1 --with-superlu_dist-include=/usr/include/superlu-dist --with-superlu_dist-lib=-lsuperlu_dist --prefix=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex --PETSC_ARCH=riscv64-linux-gnu-complex CFLAGS="-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" CXXFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" FCFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" FFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0" CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2" LDFLAGS="-Wl,-z,relro -fPIC" MAKEFLAGS= Using PETSc configuration flags: #define PETSC_ARCH "" #define PETSC_ATTRIBUTEALIGNED(size) __attribute((aligned(size))) #define PETSC_BLASLAPACK_UNDERSCORE 1 #define PETSC_CLANGUAGE_C 1 #define PETSC_CXX_RESTRICT __restrict #define PETSC_DEPRECATED_ENUM_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_FUNCTION_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_MACRO_BASE(string_literal_why) PETSC_DEPRECATED_MACRO_BASE_(GCC warning string_literal_why) #define PETSC_DEPRECATED_MACRO_BASE_(why) _Pragma(#why) #define PETSC_DEPRECATED_OBJECT_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_TYPEDEF_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex" #define PETSC_DIR_SEPARATOR '/' #define PETSC_FORTRAN_CHARLEN_T size_t #define PETSC_FUNCTION_NAME_C __func__ #define PETSC_FUNCTION_NAME_CXX __func__ #define PETSC_HAVE_ACCESS 1 #define PETSC_HAVE_ATOLL 1 #define PETSC_HAVE_ATTRIBUTEALIGNED 1 #define PETSC_HAVE_BUILTIN_EXPECT 1 #define PETSC_HAVE_BZERO 1 #define PETSC_HAVE_C99_COMPLEX 1 #define PETSC_HAVE_CLOCK 1 #define PETSC_HAVE_CXX 1 #define PETSC_HAVE_CXXABI_H 1 #define PETSC_HAVE_CXX_ATOMIC 1 #define PETSC_HAVE_CXX_COMPLEX 1 #define PETSC_HAVE_CXX_COMPLEX_FIX 1 #define PETSC_HAVE_CXX_DIALECT_CXX11 1 #define PETSC_HAVE_DLADDR 1 #define PETSC_HAVE_DLCLOSE 1 #define PETSC_HAVE_DLERROR 1 #define PETSC_HAVE_DLFCN_H 1 #define PETSC_HAVE_DLOPEN 1 #define PETSC_HAVE_DLSYM 1 #define PETSC_HAVE_DOUBLE_ALIGN_MALLOC 1 #define PETSC_HAVE_DRAND48 1 #define PETSC_HAVE_DYNAMIC_LIBRARIES 1 #define PETSC_HAVE_ERF 1 #define PETSC_HAVE_EXECUTABLE_EXPORT 1 #define PETSC_HAVE_FCNTL_H 1 #define PETSC_HAVE_FENV_H 1 #define PETSC_HAVE_FE_VALUES 1 #define PETSC_HAVE_FFTW 1 #define PETSC_HAVE_FLOAT_H 1 #define PETSC_HAVE_FORK 1 #define PETSC_HAVE_FORTRAN_FLUSH 1 #define PETSC_HAVE_FORTRAN_FREE_LINE_LENGTH_NONE 1 #define PETSC_HAVE_FORTRAN_TYPE_STAR 1 #define PETSC_HAVE_FORTRAN_UNDERSCORE 1 #define PETSC_HAVE_GETCWD 1 #define PETSC_HAVE_GETDOMAINNAME 1 #define PETSC_HAVE_GETHOSTBYNAME 1 #define PETSC_HAVE_GETHOSTNAME 1 #define PETSC_HAVE_GETPAGESIZE 1 #define PETSC_HAVE_GETRUSAGE 1 #define PETSC_HAVE_HDF5 1 #define PETSC_HAVE_INTTYPES_H 1 #define PETSC_HAVE_ISINF 1 #define PETSC_HAVE_ISNAN 1 #define PETSC_HAVE_ISNORMAL 1 #define PETSC_HAVE_LGAMMA 1 #define PETSC_HAVE_LINUX 1 #define PETSC_HAVE_LOG2 1 #define PETSC_HAVE_LSEEK 1 #define PETSC_HAVE_MALLOC_H 1 #define PETSC_HAVE_MEMMOVE 1 #define PETSC_HAVE_MKSTEMP 1 #define PETSC_HAVE_MPIEXEC_ENVIRONMENTAL_VARIABLE OMP #define PETSC_HAVE_MPIIO 1 #define PETSC_HAVE_MPI_COMBINER_CONTIGUOUS 1 #define PETSC_HAVE_MPI_COMBINER_DUP 1 #define PETSC_HAVE_MPI_COMBINER_NAMED 1 #define PETSC_HAVE_MPI_COUNT 1 #define PETSC_HAVE_MPI_F90MODULE 1 #define PETSC_HAVE_MPI_F90MODULE_VISIBILITY 1 #define PETSC_HAVE_MPI_FEATURE_DYNAMIC_WINDOW 1 #define PETSC_HAVE_MPI_GET_ACCUMULATE 1 #define PETSC_HAVE_MPI_GET_LIBRARY_VERSION 1 #define PETSC_HAVE_MPI_INIT_THREAD 1 #define PETSC_HAVE_MPI_INT64_T 1 #define PETSC_HAVE_MPI_LONG_DOUBLE 1 #define PETSC_HAVE_MPI_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_NONBLOCKING_COLLECTIVES 1 #define PETSC_HAVE_MPI_ONE_SIDED 1 #define PETSC_HAVE_MPI_PERSISTENT_NEIGHBORHOOD_COLLECTIVES 1 
#define PETSC_HAVE_MPI_PROCESS_SHARED_MEMORY 1 #define PETSC_HAVE_MPI_REDUCE_LOCAL 1 #define PETSC_HAVE_MPI_REDUCE_SCATTER_BLOCK 1 #define PETSC_HAVE_MPI_RGET 1 #define PETSC_HAVE_MPI_WIN_CREATE 1 #define PETSC_HAVE_MUMPS 1 #define PETSC_HAVE_NANOSLEEP 1 #define PETSC_HAVE_NETDB_H 1 #define PETSC_HAVE_NETINET_IN_H 1 #define PETSC_HAVE_NO_FINITE_MATH_ONLY 1 #define PETSC_HAVE_OPENCL 1 #define PETSC_HAVE_OPENMPI 1 #define PETSC_HAVE_PACKAGES ":amd:blaslapack:cholmod:fftw3:hdf5:klu:mathlib:mpi:mumps:opencl:pthread:ptscotch:regex:scalapack:spqr:superlu:superlu_dist:umfpack:x11:yaml:" #define PETSC_HAVE_POPEN 1 #define PETSC_HAVE_POSIX_MEMALIGN 1 #define PETSC_HAVE_PTHREAD 1 #define PETSC_HAVE_PTHREAD_MUTEX 1 #define PETSC_HAVE_PTSCOTCH 1 #define PETSC_HAVE_PWD_H 1 #define PETSC_HAVE_RAND 1 #define PETSC_HAVE_READLINK 1 #define PETSC_HAVE_REALPATH 1 #define PETSC_HAVE_REGEX 1 #define PETSC_HAVE_RTLD_DEFAULT 1 #define PETSC_HAVE_RTLD_GLOBAL 1 #define PETSC_HAVE_RTLD_LAZY 1 #define PETSC_HAVE_RTLD_LOCAL 1 #define PETSC_HAVE_RTLD_NOW 1 #define PETSC_HAVE_SCALAPACK 1 #define PETSC_HAVE_SETJMP_H 1 #define PETSC_HAVE_SHMGET 1 #define PETSC_HAVE_SLEEP 1 #define PETSC_HAVE_SNPRINTF 1 #define PETSC_HAVE_SOCKET 1 #define PETSC_HAVE_SO_REUSEADDR 1 #define PETSC_HAVE_STDATOMIC_H 1 #define PETSC_HAVE_STDINT_H 1 #define PETSC_HAVE_STRCASECMP 1 #define PETSC_HAVE_STRINGS_H 1 #define PETSC_HAVE_STRUCT_SIGACTION 1 #define PETSC_HAVE_SUITESPARSE 1 #define PETSC_HAVE_SUPERLU 1 #define PETSC_HAVE_SUPERLU_DIST 1 #define PETSC_HAVE_SYS_PARAM_H 1 #define PETSC_HAVE_SYS_PROCFS_H 1 #define PETSC_HAVE_SYS_RESOURCE_H 1 #define PETSC_HAVE_SYS_SOCKET_H 1 #define PETSC_HAVE_SYS_TIMES_H 1 #define PETSC_HAVE_SYS_TIME_H 1 #define PETSC_HAVE_SYS_TYPES_H 1 #define PETSC_HAVE_SYS_UTSNAME_H 1 #define PETSC_HAVE_SYS_WAIT_H 1 #define PETSC_HAVE_TAU_PERFSTUBS 1 #define PETSC_HAVE_TGAMMA 1 #define PETSC_HAVE_TIME 1 #define PETSC_HAVE_TIME_H 1 #define PETSC_HAVE_UNAME 1 #define PETSC_HAVE_UNISTD_H 1 #define PETSC_HAVE_USLEEP 1 #define PETSC_HAVE_VA_COPY 1 #define PETSC_HAVE_VSNPRINTF 1 #define PETSC_HAVE_X 1 #define PETSC_HAVE_YAML 1 #define PETSC_HDF5_HAVE_PARALLEL 1 #define PETSC_HDF5_HAVE_SZLIB 1 #define PETSC_HDF5_HAVE_ZLIB 1 #define PETSC_INTPTR_T intptr_t #define PETSC_INTPTR_T_FMT "#" PRIxPTR #define PETSC_IS_COLORING_MAX USHRT_MAX #define PETSC_IS_COLORING_VALUE_TYPE short #define PETSC_IS_COLORING_VALUE_TYPE_F integer2 #define PETSC_LEVEL1_DCACHE_LINESIZE 64 #define PETSC_LIB_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/lib" #define PETSC_LIB_NAME_SUFFIX "_complex" #define PETSC_MAX_PATH_LEN 4096 #define PETSC_MEMALIGN 16 #define PETSC_MPICC_SHOW "gcc -I/usr/lib/riscv64-linux-gnu/openmpi/include -I/usr/lib/riscv64-linux-gnu/openmpi/include/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi" #define PETSC_MPIU_IS_COLORING_VALUE_TYPE MPI_UNSIGNED_SHORT #define PETSC_OMAKE "/usr/bin/make --no-print-directory" #define PETSC_PREFETCH_HINT_NTA 0 #define PETSC_PREFETCH_HINT_T0 3 #define PETSC_PREFETCH_HINT_T1 2 #define PETSC_PREFETCH_HINT_T2 1 #define PETSC_PYTHON_EXE "/usr/bin/python3" #define PETSC_Prefetch(a,b,c) __builtin_prefetch((a),(b),(c)) #define PETSC_REPLACE_DIR_SEPARATOR '\\' #define PETSC_SIGNAL_CAST #define PETSC_SIZEOF_INT 4 #define PETSC_SIZEOF_LONG 8 #define PETSC_SIZEOF_LONG_LONG 8 #define PETSC_SIZEOF_SIZE_T 8 #define PETSC_SIZEOF_VOID_P 8 #define PETSC_SLSUFFIX "so" #define PETSC_UINTPTR_T uintptr_t #define PETSC_UINTPTR_T_FMT "#" PRIxPTR #define PETSC_UNUSED __attribute((unused)) #define 
PETSC_USE_AVX512_KERNELS 1 #define PETSC_USE_COMPLEX 1 #define PETSC_USE_CTABLE 1 #define PETSC_USE_DEBUGGER "gdb" #define PETSC_USE_DMLANDAU_2D 1 #define PETSC_USE_FORTRAN_BINDINGS 1 #define PETSC_USE_INFO 1 #define PETSC_USE_ISATTY 1 #define PETSC_USE_LOG 1 #define PETSC_USE_MALLOC_COALESCED 1 #define PETSC_USE_PROC_FOR_SIZE 1 #define PETSC_USE_REAL_DOUBLE 1 #define PETSC_USE_SHARED_LIBRARIES 1 #define PETSC_USE_SINGLE_LIBRARY 1 #define PETSC_USE_SOCKET_VIEWER 1 #define PETSC_USE_VISIBILITY_C 1 #define PETSC_USE_VISIBILITY_CXX 1 #define PETSC_USING_64BIT_PTR 1 #define PETSC_USING_F2003 1 #define PETSC_USING_F90FREEFORM 1 #define PETSC__BSD_SOURCE 1 #define PETSC__DEFAULT_SOURCE 1 #define PETSC__GNU_SOURCE 1 ----------------------------------------- Using C/C++ include paths: -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Using C compile: mpicc -o gmakeinfo -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC C compiler version: gcc (Debian 15.2.0-12) 15.2.0 Using C++ compile: mpicxx -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -std=c++11 -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi C++ compiler version: g++ (Debian 15.2.0-12) 15.2.0 Using Fortran include/module paths: -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Using Fortran compile: mpif90 -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Fortran compiler version: GNU Fortran (Debian 15.2.0-12) 15.2.0 ----------------------------------------- Using C/C++ linker: mpicc Using C/C++ flags: -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC Using Fortran linker: mpif90 Using Fortran flags: -Wl,-z,relro -fPIC -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 ----------------------------------------- Using libraries: -L/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/lib -L/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/lib -lslepc_complex -lparpack -larpack -L/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc_complex -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord -lscalapack-openmpi -lsuperlu_dist -lsuperlu -llapack -lblas -lptesmumps -lptscotch -lscotch -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++ ------------------------------------------ Using mpiexec: /usr/bin/mpiexec --oversubscribe ------------------------------------------ Using MAKE: /usr/bin/make Default MAKEFLAGS: MAKE_NP:4 MAKE_LOAD:4.0 MAKEFLAGS: -j4 --jobserver-auth=fifo:/tmp/GMfifo3833 --no-print-directory -- V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex ========================================== /usr/bin/make --print-directory -f gmakefile -l4.0 --output-sync=recurse V=1 slepc_libs make[5]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1' /usr/bin/python3 /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/share/petsc/examples/config/gmakegen.py --petsc-arch= --pkg-dir=/build/reproducible-path/slepc-3.24.2+dfsg1 --pkg-name=slepc --pkg-pkgs=sys,eps,svd,pep,nep,mfn,lme --pkg-arch=installed-arch-linux2-c-opt-complex mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvbiorthogf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvbiorthogf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/impls/tensor/bvtensorf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/impls/tensor/bvtensorf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvcontourf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvcontourf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvfuncf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvkrylovf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvkrylovf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvopsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvglobalf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvglobalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/impls/hsvd/dshsvdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/impls/hsvd/dshsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvorthogf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvorthogf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/impls/svd/dssvdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/impls/svd/dssvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/impls/pep/dspepf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/impls/pep/dspepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/impls/nep/dsnepf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/impls/nep/dsnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/interface/dsprivf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/interface/dsprivf.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/ftn-mod/slepcsysmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/ftn-mod/slepcsysmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/interface/dsbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/interface/dsbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/interface/dsopsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/interface/dsopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/fn/impls/phi/fnphif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/fn/impls/phi/fnphif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/fn/impls/combine/fncombinef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/fn/impls/combine/fncombinef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/fn/impls/rational/fnrationalf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/fn/impls/rational/fnrationalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/fn/interface/fnbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/fn/interface/fnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/rg/impls/interval/rgintervalf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/rg/impls/interval/rgintervalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/rg/impls/polygon/rgpolygonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/rg/impls/polygon/rgpolygonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/rg/impls/ellipse/rgellipsef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/rg/impls/ellipse/rgellipsef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/rg/impls/ring/rgringf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/rg/impls/ring/rgringf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/rg/interface/rgbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/rg/interface/rgbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/impls/shell/shellf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/impls/shell/shellf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/impls/precond/precondf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/impls/precond/precondf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/impls/cayley/cayleyf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/impls/cayley/cayleyf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/impls/filter/filterf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/impls/filter/filterf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/interface/stslesf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/interface/stslesf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/interface/stsetf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/interface/stsetf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/interface/stsolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/interface/stsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/interface/stfuncf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/interface/stfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/mat/matstructf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/mat/matstructf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/finitf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/finitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/mat/matutilf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/mat/matutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/slepcinitf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/slepcinitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/slepcscf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/slepcscf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/vec/veccompf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/vec/veccompf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/slepcutilf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/slepcutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/vec/vecutilf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/vec/vecutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/contiguous/contig.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/impls/contiguous/contig.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/mat/bvmat.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/impls/mat/bvmat.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/svec/svec.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/impls/svec/svec.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/vecs/vecs.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/impls/vecs/vecs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvbiorthog.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvbiorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/tensor/bvtensor.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/impls/tensor/bvtensor.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvfunc.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvfunc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvblas.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvblas.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvkrylov.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvcontour.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvcontour.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvlapack.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvlapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvglobal.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvglobal.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvops.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvops.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvregis.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvorthog.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/dsutil.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/dsutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghep/dsghep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/ghep/dsghep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghiep/hz.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/ghiep/hz.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghiep/invit.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/ghiep/invit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/gnhep/dsgnhep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/gnhep/dsgnhep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghiep/dsghiep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/ghiep/dsghiep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/gsvd/dsgsvd.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/gsvd/dsgsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/hsvd/dshsvd.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/hsvd/dshsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/hep/dshep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/hep/dshep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/nhepts/dsnhepts.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/nhepts/dsnhepts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/nhep/dsnhep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/nhep/dsnhep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/nep/dsnep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/nep/dsnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/pep/dspep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/pep/dspep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/interface/dsbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/interface/dsbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/interface/dsops.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/interface/dsops.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/interface/dspriv.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/interface/dspriv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/svd/dssvd.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/svd/dssvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/combine/fncombine.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/combine/fncombine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/invsqrt/fninvsqrt.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/invsqrt/fninvsqrt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/fnutil.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/fnutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/log/fnlog.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/log/fnlog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/phi/fnphi.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/phi/fnphi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/rational/ftn-custom/zrational.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/rational/ftn-custom/zrational.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/rational/fnrational.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/rational/fnrational.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/exp/fnexp.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/exp/fnexp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/interface/fnregis.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/interface/fnregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/sqrt/fnsqrt.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/sqrt/fnsqrt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/ellipse/rgellipse.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/impls/ellipse/rgellipse.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/interval/rginterval.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/impls/interval/rginterval.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/interface/fnbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/interface/fnbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/polygon/rgpolygon.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/impls/polygon/rgpolygon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/ring/rgring.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/impls/ring/rgring.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/interface/rgregis.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/interface/rgregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/interface/rgbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/interface/rgbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/cayley/cayley.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/cayley/cayley.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/filter/chebyshev.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/filter/chebyshev.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/filter/filter.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/filter/filter.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/precond/precond.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/precond/precond.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/shell/ftn-custom/zshell.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/shell/ftn-custom/zshell.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/shell/shell.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/shell/shell.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/sinvert/sinvert.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/sinvert/sinvert.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/shift/shift.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/shift/shift.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stregis.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stset.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stset.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stfunc.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stfunc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/filter/filtlan.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/filter/filtlan.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/dlregisslepc.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/dlregisslepc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stshellmat.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stshellmat.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/finit.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/finit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/ftn-custom/zstart.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/ftn-custom/zstart.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stsles.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stsles.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/mat/matstruct.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/mat/matstruct.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stsolve.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepccontour.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/slepccontour.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepcinit.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/slepcinit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepcsc.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/slepcsc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepcutil.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/slepcutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/mat/matutil.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/mat/matutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/vec/pool.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/vec/pool.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/cg/rqcg/rqcgf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/cg/rqcg/rqcgf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/cg/lobpcg/lobpcgf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/cg/lobpcg/lobpcgf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/vec/vecutil.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/vec/vecutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/davidson/gd/gdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/davidson/gd/gdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/ciss/cissf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/ciss/cissf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/krylov/arnoldi/arnoldif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/krylov/arnoldi/arnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/davidson/jd/jdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/davidson/jd/jdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/krylov/krylovschur/krylovschurf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/krylov/krylovschur/krylovschurf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/vec/veccomp.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/vec/veccomp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/krylov/lanczos/lanczosf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/krylov/lanczos/lanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/lyapii/lyapiif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/lyapii/lyapiif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/power/powerf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/power/powerf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/dlregisepsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/dlregisepsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epsdefaultf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epsdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epsbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epsbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epssetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epssetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epsmonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epsmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epsoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epsoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epssolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epssolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epsviewf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epsviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/davidson.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/davidson.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/cg/rqcg/rqcg.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/cg/rqcg/rqcg.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/cg/lobpcg/lobpcg.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/cg/lobpcg/lobpcg.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdgd2.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdgd2.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdinitv.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdinitv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdcalcpairs.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdcalcpairs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdtestconv.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdtestconv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdschm.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdschm.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/ciss/ciss.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/ciss/ciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdutils.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdutils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdimprovex.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdimprovex.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdupdatev.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdupdatev.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/gd/gd.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/gd/gd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/jd/jd.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/jd/jd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/external/arpack/arpack.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/external/arpack/arpack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/external/scalapack/scalapack.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/external/scalapack/scalapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/arnoldi/arnoldi.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/arnoldi/arnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/epskrylov.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/epskrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-indef.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ks-indef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/krylovschur.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/krylovschur.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-hamilt.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ks-hamilt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-twosided.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ks-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-bse.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ks-bse.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/lapack/lapack.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/lapack/lapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/lanczos/lanczos.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/lanczos/lanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/lyapii/lyapii.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/lyapii/lyapii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/dlregiseps.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/dlregiseps.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/subspace/subspace.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/subspace/subspace.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-slice.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ks-slice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsmon.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsdefault.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/power/power.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/power/power.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsregis.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsopts.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epssetup.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epssetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epssolve.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epssolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/impls/cross/crossf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/impls/cross/crossf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsview.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/ftn-custom/zepsf.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/ftn-custom/zepsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/impls/lanczos/gklanczosf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/impls/lanczos/gklanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/impls/cyclic/cyclicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/impls/cyclic/cyclicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/impls/trlanczos/trlanczosf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/impls/trlanczos/trlanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/dlregissvdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/dlregissvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svddefaultf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svddefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdsetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdmonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdsolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdviewf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/external/scalapack/svdscalap.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/external/scalapack/svdscalap.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/lanczos/gklanczos.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/lanczos/gklanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/cross/cross.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/cross/cross.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/lapack/svdlapack.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/lapack/svdlapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/randomized/rsvd.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/randomized/rsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/dlregissvd.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/dlregissvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/cyclic/cyclic.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/cyclic/cyclic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/ftn-custom/zsvdf.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/ftn-custom/zsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svddefault.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svddefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdmon.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdregis.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdopts.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdsetup.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdsolve.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/ciss/pcissf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/ciss/pcissf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/jd/pjdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/jd/pjdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/krylov/qarnoldi/qarnoldif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/krylov/qarnoldi/qarnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/krylov/stoar/qslicef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/krylov/stoar/qslicef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdview.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/krylov/toar/ptoarf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/krylov/toar/ptoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/krylov/stoar/stoarf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/krylov/stoar/stoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/dlregispepf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/dlregispepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/linear/linearf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/linear/linearf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepdefaultf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/trlanczos/trlanczos.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/trlanczos/trlanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepsetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepmonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepsolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepviewf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/pepkrylov.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/pepkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/qarnoldi/qarnoldi.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/qarnoldi/qarnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/ciss/pciss.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/ciss/pciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/stoar.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/stoar/stoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/jd/pjd.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/jd/pjd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/qslice.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/stoar/qslice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/linear/qeplin.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/linear/qeplin.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/peputils.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/peputils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/toar/ptoar.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/toar/ptoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/linear/linear.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/linear/linear.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/dlregispep.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/dlregispep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/ftn-custom/zpepf.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/ftn-custom/zpepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepdefault.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepmon.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/toar/nrefine.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/toar/nrefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepregis.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepsolve.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepsetup.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepopts.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/interpol/interpolf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/interpol/interpolf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/ciss/ncissf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/ciss/ncissf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/narnoldi/narnoldif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/narnoldi/narnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/nleigs/nleigs-fullbf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/nleigs/nleigs-fullbf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/nleigs/nleigsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/nleigs/nleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepview.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/rii/riif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/rii/riif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/peprefine.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/peprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/dlregisnepf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/dlregisnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/slp/slpf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/slp/slpf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepdefaultf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepresolvf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepresolvf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepmonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepsetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepsolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepviewf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/interpol/interpol.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/interpol/interpol.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/narnoldi/narnoldi.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/narnoldi/narnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/ftn-custom/znleigsf.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/nleigs/ftn-custom/znleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/ciss/nciss.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/ciss/nciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/nleigs-fullb.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/nleigs/nleigs-fullb.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nepdefl.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/nepdefl.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/rii/rii.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/rii/rii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/slp/slp.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/slp/slp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/dlregisnep.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/dlregisnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/slp/slp-twosided.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/slp/slp-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/ftn-custom/znepf.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/ftn-custom/znepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepdefault.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepmon.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepopts.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepregis.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepresolv.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepresolv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/nleigs.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/nleigs/nleigs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepsetup.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/dlregismfnf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/dlregismfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepsolve.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/mfnbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/mfnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/neprefine.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/neprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/mfnmonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/mfnmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/mfnoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/mfnoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/mfnsetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/mfnsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepview.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/mfnsolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/mfnsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/dlregismfn.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/dlregismfn.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/impls/krylov/mfnkrylov.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/impls/krylov/mfnkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/impls/expokit/mfnexpokit.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/impls/expokit/mfnexpokit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/ftn-custom/zmfnf.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/ftn-custom/zmfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnregis.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnmon.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnopts.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnsolve.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnsetup.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/dlregislmef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/dlregislmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmebasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmebasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmemonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmemonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmedensef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmedensef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmesetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmesetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmesolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmesolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmeoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmeoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/dlregislme.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/dlregislme.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/ftn-custom/zlmef.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/ftn-custom/zlmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/impls/krylov/lmekrylov.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/impls/krylov/lmekrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmebasic.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmebasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmemon.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmemon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmedense.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmedense.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmeopts.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmeopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmeregis.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmeregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmesetup.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmesetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmesolve.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmesolve.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/ftn-mod/slepcbvmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/ftn-mod/slepcbvmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/ftn-mod/slepcrgmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/ftn-mod/slepcrgmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/ftn-mod/slepcfnmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/ftn-mod/slepcfnmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/ftn-mod/slepclmemod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/lme/ftn-mod/slepclmemod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/ftn-mod/slepcdsmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/ftn-mod/slepcdsmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/ftn-mod/slepcmfnmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/mfn/ftn-mod/slepcmfnmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/ftn-mod/slepcstmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/ftn-mod/slepcstmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/ftn-mod/slepcepsmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/eps/ftn-mod/slepcepsmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/ftn-mod/slepcsvdmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/svd/ftn-mod/slepcsvdmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/ftn-mod/slepcpepmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/pep/ftn-mod/slepcpepmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/ftn-mod/slepcnepmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/nep/ftn-mod/slepcnepmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/include
mpicc -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-z,relro -fPIC -shared -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-soname,libslepc_complex.so.3.24 -o installed-arch-linux2-c-opt-complex/lib/libslepc_complex.so.3.24.2 @installed-arch-linux2-c-opt-complex/lib/libslepc_complex.so.3.24.2.args -lparpack -larpack -L/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc_complex -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord -lscalapack-openmpi -lsuperlu_dist -lsuperlu -llapack -lblas -lptesmumps -lptscotch -lscotch -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++
make[5]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
make[4]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
=========================================
Now to install the library do:
make SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex install
=========================================
make[2]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
dh_auto_build -plibslepc64-real3.24-dev -- V=1 \
	SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 \
	PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt-64
make -j4 V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.2\+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt-64
make[2]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
sed: -e expression #1, char 47: unknown option to `s'
/usr/bin/bash: line 4: [: too many arguments
make[4]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
==========================================
Starting make run on sbuild at Sat, 31 Jan 2026 13:13:42 +0000
Machine characteristics: Linux sbuild 6.6.87-win2030 #2025.04.20.18.43+bb0c69aea SMP Sun Apr 20 18:58:14 UTC 2025 riscv64 GNU/Linux
-----------------------------------------
Using SLEPc directory: /build/reproducible-path/slepc-3.24.2+dfsg1
Using PETSc directory: /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real
Using PETSc arch: installed-arch-linux2-c-opt-64
-----------------------------------------
SLEPC_VERSION_RELEASE 1
SLEPC_VERSION_MAJOR 3
SLEPC_VERSION_MINOR 24
SLEPC_VERSION_SUBMINOR 2
SLEPC_VERSION_DATE "Jan 20, 2026"
SLEPC_VERSION_GIT "v3.24.2"
SLEPC_VERSION_DATE_GIT "2026-01-20 12:27:56 +0100"
-----------------------------------------
Using SLEPc configure options: --prefix=/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-real --build-suffix=64
Using SLEPc configuration flags:
#define SLEPC_PETSC_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real"
#define SLEPC_PETSC_ARCH ""
#define SLEPC_DIR "/build/reproducible-path/slepc-3.24.2+dfsg1"
#define SLEPC_LIB_DIR "/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-real/lib"
#define SLEPC_HAVE_SCALAPACK 1
#define SLEPC_SCALAPACK_HAVE_UNDERSCORE 1
#define SLEPC_HAVE_PACKAGES ":scalapack:"
-----------------------------------------
PETSC_VERSION_RELEASE 1
PETSC_VERSION_MAJOR 3
PETSC_VERSION_MINOR 24
PETSC_VERSION_SUBMINOR 3
PETSC_VERSION_DATE "Jan 01, 2026"
PETSC_VERSION_GIT "v3.24.3"
PETSC_VERSION_DATE_GIT "2026-01-01 17:01:02 -0600"
-----------------------------------------
Using PETSc configure options: --build=riscv64-linux-gnu --prefix=/usr --includedir=/include --mandir=/share/man --infodir=/share/info --sysconfdir=/etc --localstatedir=/var --with-option-checking=0 --with-silent-rules=0 --libdir=/lib/riscv64-linux-gnu --runstatedir=/run --with-maintainer-mode=0 --with-dependency-tracking=0 --with-64-bit-indices --with-debugging=0 --with-library-name-suffix=64_real --with-shared-libraries --with-pic=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cxx-dialect=C++11 --with-opencl=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi --with-fftw=1 --with-fftw-include="[]" --with-fftw-lib="-lfftw3 -lfftw3_mpi" --with-yaml=1 --with-valgrind=1 --with-hdf5-include=/usr/include/hdf5/openmpi --with-hdf5-lib="-L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -lhdf5 -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi " --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch_64i --with-ptscotch-lib="-lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr" --with-hypre=1 --with-hypre-include=/usr/include/hypre64 --with-hypre-lib=-lHYPRE64 --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64" --with-suitesparse=1 --with-suitesparse-include=/usr/include/suitesparse --with-suitesparse-lib="-lspqr -lumfpack -lamd -lcholmod -lklu" --prefix=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real --PETSC_ARCH=riscv64-linux-gnu-real-64 CFLAGS="-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" CXXFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" FCFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" FFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=.
-fstack-protector-strong -fPIC -ffree-line-length-0" CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2" LDFLAGS="-Wl,-z,relro -fPIC" MAKEFLAGS= Using PETSc configuration flags: #define PETSC_ARCH "" #define PETSC_ATTRIBUTEALIGNED(size) __attribute((aligned(size))) #define PETSC_BLASLAPACK_UNDERSCORE 1 #define PETSC_CLANGUAGE_C 1 #define PETSC_CXX_RESTRICT __restrict #define PETSC_DEPRECATED_ENUM_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_FUNCTION_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_MACRO_BASE(string_literal_why) PETSC_DEPRECATED_MACRO_BASE_(GCC warning string_literal_why) #define PETSC_DEPRECATED_MACRO_BASE_(why) _Pragma(#why) #define PETSC_DEPRECATED_OBJECT_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_TYPEDEF_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real" #define PETSC_DIR_SEPARATOR '/' #define PETSC_FORTRAN_CHARLEN_T size_t #define PETSC_FUNCTION_NAME_C __func__ #define PETSC_FUNCTION_NAME_CXX __func__ #define PETSC_HAVE_ACCESS 1 #define PETSC_HAVE_ATOLL 1 #define PETSC_HAVE_ATTRIBUTEALIGNED 1 #define PETSC_HAVE_BUILTIN_EXPECT 1 #define PETSC_HAVE_BZERO 1 #define PETSC_HAVE_C99_COMPLEX 1 #define PETSC_HAVE_CLOCK 1 #define PETSC_HAVE_CXX 1 #define PETSC_HAVE_CXXABI_H 1 #define PETSC_HAVE_CXX_ATOMIC 1 #define PETSC_HAVE_CXX_COMPLEX 1 #define PETSC_HAVE_CXX_COMPLEX_FIX 1 #define PETSC_HAVE_CXX_DIALECT_CXX11 1 #define PETSC_HAVE_DLADDR 1 #define PETSC_HAVE_DLCLOSE 1 #define PETSC_HAVE_DLERROR 1 #define PETSC_HAVE_DLFCN_H 1 #define PETSC_HAVE_DLOPEN 1 #define PETSC_HAVE_DLSYM 1 #define PETSC_HAVE_DOUBLE_ALIGN_MALLOC 1 #define PETSC_HAVE_DRAND48 1 #define PETSC_HAVE_DYNAMIC_LIBRARIES 1 #define PETSC_HAVE_ERF 1 #define PETSC_HAVE_EXECUTABLE_EXPORT 1 #define PETSC_HAVE_FCNTL_H 1 #define PETSC_HAVE_FENV_H 1 #define PETSC_HAVE_FE_VALUES 1 #define PETSC_HAVE_FFTW 1 #define PETSC_HAVE_FLOAT_H 1 #define PETSC_HAVE_FORK 1 #define PETSC_HAVE_FORTRAN_FLUSH 1 #define PETSC_HAVE_FORTRAN_FREE_LINE_LENGTH_NONE 1 #define PETSC_HAVE_FORTRAN_TYPE_STAR 1 #define PETSC_HAVE_FORTRAN_UNDERSCORE 1 #define PETSC_HAVE_GETCWD 1 #define PETSC_HAVE_GETDOMAINNAME 1 #define PETSC_HAVE_GETHOSTBYNAME 1 #define PETSC_HAVE_GETHOSTNAME 1 #define PETSC_HAVE_GETPAGESIZE 1 #define PETSC_HAVE_GETRUSAGE 1 #define PETSC_HAVE_HDF5 1 #define PETSC_HAVE_HYPRE 1 #define PETSC_HAVE_INTTYPES_H 1 #define PETSC_HAVE_ISINF 1 #define PETSC_HAVE_ISNAN 1 #define PETSC_HAVE_ISNORMAL 1 #define PETSC_HAVE_LGAMMA 1 #define PETSC_HAVE_LINUX 1 #define PETSC_HAVE_LOG2 1 #define PETSC_HAVE_LSEEK 1 #define PETSC_HAVE_MALLOC_H 1 #define PETSC_HAVE_MEMMOVE 1 #define PETSC_HAVE_MKSTEMP 1 #define PETSC_HAVE_MPIEXEC_ENVIRONMENTAL_VARIABLE OMP #define PETSC_HAVE_MPIIO 1 #define PETSC_HAVE_MPI_COMBINER_CONTIGUOUS 1 #define PETSC_HAVE_MPI_COMBINER_DUP 1 #define PETSC_HAVE_MPI_COMBINER_NAMED 1 #define PETSC_HAVE_MPI_COUNT 1 #define PETSC_HAVE_MPI_F90MODULE 1 #define PETSC_HAVE_MPI_F90MODULE_VISIBILITY 1 #define PETSC_HAVE_MPI_FEATURE_DYNAMIC_WINDOW 1 #define PETSC_HAVE_MPI_GET_ACCUMULATE 1 #define PETSC_HAVE_MPI_GET_LIBRARY_VERSION 1 #define PETSC_HAVE_MPI_INIT_THREAD 1 #define PETSC_HAVE_MPI_INT64_T 1 #define PETSC_HAVE_MPI_LONG_DOUBLE 1 #define PETSC_HAVE_MPI_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_NONBLOCKING_COLLECTIVES 1 #define PETSC_HAVE_MPI_ONE_SIDED 1 #define 
PETSC_HAVE_MPI_PERSISTENT_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_PROCESS_SHARED_MEMORY 1 #define PETSC_HAVE_MPI_REDUCE_LOCAL 1 #define PETSC_HAVE_MPI_REDUCE_SCATTER_BLOCK 1 #define PETSC_HAVE_MPI_RGET 1 #define PETSC_HAVE_MPI_WIN_CREATE 1 #define PETSC_HAVE_MUMPS 1 #define PETSC_HAVE_NANOSLEEP 1 #define PETSC_HAVE_NETDB_H 1 #define PETSC_HAVE_NETINET_IN_H 1 #define PETSC_HAVE_NO_FINITE_MATH_ONLY 1 #define PETSC_HAVE_OPENCL 1 #define PETSC_HAVE_OPENMPI 1 #define PETSC_HAVE_PACKAGES ":amd:blaslapack:cholmod:fftw3:hdf5:hypre:klu:mathlib:mpi:mumps:opencl:pthread:ptscotch:regex:scalapack:spqr:umfpack:x11:yaml:" #define PETSC_HAVE_POPEN 1 #define PETSC_HAVE_POSIX_MEMALIGN 1 #define PETSC_HAVE_PTHREAD 1 #define PETSC_HAVE_PTHREAD_MUTEX 1 #define PETSC_HAVE_PTSCOTCH 1 #define PETSC_HAVE_PWD_H 1 #define PETSC_HAVE_RAND 1 #define PETSC_HAVE_READLINK 1 #define PETSC_HAVE_REALPATH 1 #define PETSC_HAVE_REGEX 1 #define PETSC_HAVE_RTLD_DEFAULT 1 #define PETSC_HAVE_RTLD_GLOBAL 1 #define PETSC_HAVE_RTLD_LAZY 1 #define PETSC_HAVE_RTLD_LOCAL 1 #define PETSC_HAVE_RTLD_NOW 1 #define PETSC_HAVE_SCALAPACK 1 #define PETSC_HAVE_SETJMP_H 1 #define PETSC_HAVE_SHMGET 1 #define PETSC_HAVE_SLEEP 1 #define PETSC_HAVE_SNPRINTF 1 #define PETSC_HAVE_SOCKET 1 #define PETSC_HAVE_SO_REUSEADDR 1 #define PETSC_HAVE_STDATOMIC_H 1 #define PETSC_HAVE_STDINT_H 1 #define PETSC_HAVE_STRCASECMP 1 #define PETSC_HAVE_STRINGS_H 1 #define PETSC_HAVE_STRUCT_SIGACTION 1 #define PETSC_HAVE_SUITESPARSE 1 #define PETSC_HAVE_SYS_PARAM_H 1 #define PETSC_HAVE_SYS_PROCFS_H 1 #define PETSC_HAVE_SYS_RESOURCE_H 1 #define PETSC_HAVE_SYS_SOCKET_H 1 #define PETSC_HAVE_SYS_TIMES_H 1 #define PETSC_HAVE_SYS_TIME_H 1 #define PETSC_HAVE_SYS_TYPES_H 1 #define PETSC_HAVE_SYS_UTSNAME_H 1 #define PETSC_HAVE_SYS_WAIT_H 1 #define PETSC_HAVE_TAU_PERFSTUBS 1 #define PETSC_HAVE_TGAMMA 1 #define PETSC_HAVE_TIME 1 #define PETSC_HAVE_TIME_H 1 #define PETSC_HAVE_UNAME 1 #define PETSC_HAVE_UNISTD_H 1 #define PETSC_HAVE_USLEEP 1 #define PETSC_HAVE_VA_COPY 1 #define PETSC_HAVE_VSNPRINTF 1 #define PETSC_HAVE_X 1 #define PETSC_HAVE_YAML 1 #define PETSC_HDF5_HAVE_PARALLEL 1 #define PETSC_HDF5_HAVE_SZLIB 1 #define PETSC_HDF5_HAVE_ZLIB 1 #define PETSC_INTPTR_T intptr_t #define PETSC_INTPTR_T_FMT "#" PRIxPTR #define PETSC_IS_COLORING_MAX USHRT_MAX #define PETSC_IS_COLORING_VALUE_TYPE short #define PETSC_IS_COLORING_VALUE_TYPE_F integer2 #define PETSC_LEVEL1_DCACHE_LINESIZE 64 #define PETSC_LIB_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/lib" #define PETSC_LIB_NAME_SUFFIX "64_real" #define PETSC_MAX_PATH_LEN 4096 #define PETSC_MEMALIGN 16 #define PETSC_MPICC_SHOW "gcc -I/usr/lib/riscv64-linux-gnu/openmpi/include -I/usr/lib/riscv64-linux-gnu/openmpi/include/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi" #define PETSC_MPIU_IS_COLORING_VALUE_TYPE MPI_UNSIGNED_SHORT #define PETSC_OMAKE "/usr/bin/make --no-print-directory" #define PETSC_PREFETCH_HINT_NTA 0 #define PETSC_PREFETCH_HINT_T0 3 #define PETSC_PREFETCH_HINT_T1 2 #define PETSC_PREFETCH_HINT_T2 1 #define PETSC_PYTHON_EXE "/usr/bin/python3" #define PETSC_Prefetch(a,b,c) __builtin_prefetch((a),(b),(c)) #define PETSC_REPLACE_DIR_SEPARATOR '\\' #define PETSC_SIGNAL_CAST #define PETSC_SIZEOF_INT 4 #define PETSC_SIZEOF_LONG 8 #define PETSC_SIZEOF_LONG_LONG 8 #define PETSC_SIZEOF_SIZE_T 8 #define PETSC_SIZEOF_VOID_P 8 #define PETSC_SLSUFFIX "so" #define PETSC_UINTPTR_T uintptr_t #define PETSC_UINTPTR_T_FMT "#" PRIxPTR #define PETSC_UNUSED __attribute((unused)) #define PETSC_USE_64BIT_INDICES 1 #define 
PETSC_USE_AVX512_KERNELS 1 #define PETSC_USE_CTABLE 1 #define PETSC_USE_DEBUGGER "gdb" #define PETSC_USE_DMLANDAU_2D 1 #define PETSC_USE_FORTRAN_BINDINGS 1 #define PETSC_USE_INFO 1 #define PETSC_USE_ISATTY 1 #define PETSC_USE_LOG 1 #define PETSC_USE_MALLOC_COALESCED 1 #define PETSC_USE_PROC_FOR_SIZE 1 #define PETSC_USE_REAL_DOUBLE 1 #define PETSC_USE_SHARED_LIBRARIES 1 #define PETSC_USE_SINGLE_LIBRARY 1 #define PETSC_USE_SOCKET_VIEWER 1 #define PETSC_USE_VISIBILITY_C 1 #define PETSC_USE_VISIBILITY_CXX 1 #define PETSC_USING_64BIT_PTR 1 #define PETSC_USING_F2003 1 #define PETSC_USING_F90FREEFORM 1 #define PETSC__BSD_SOURCE 1 #define PETSC__DEFAULT_SOURCE 1 #define PETSC__GNU_SOURCE 1
-----------------------------------------
Using C/C++ include paths: -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi
Using C compile: mpicc -o gmakeinfo -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC
C compiler version: gcc (Debian 15.2.0-12) 15.2.0
Using C++ compile: mpicxx -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -std=c++11 -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi
C++ compiler version: g++ (Debian 15.2.0-12) 15.2.0
Using Fortran include/module paths: -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi
Using Fortran compile: mpif90 -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi
Fortran compiler version: GNU Fortran (Debian 15.2.0-12) 15.2.0
-----------------------------------------
Using C/C++ linker: mpicc
Using C/C++ flags: -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC
Using Fortran linker: mpif90
Using Fortran flags: -Wl,-z,relro -fPIC -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0
-----------------------------------------
Using libraries: -L/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/lib -L/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/lib -lslepc64_real -L/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc64_real -lHYPRE64 -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64 -lscalapack-openmpi -llapack -lblas -lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++
------------------------------------------
Using mpiexec: /usr/bin/mpiexec --oversubscribe
------------------------------------------
Using MAKE: /usr/bin/make
Default MAKEFLAGS: MAKE_NP:4 MAKE_LOAD:4.0 MAKEFLAGS: -j4 --jobserver-auth=fifo:/tmp/GMfifo5696 --no-print-directory -- V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt-64
==========================================
/usr/bin/make --print-directory -f gmakefile -l4.0 --output-sync=recurse V=1 slepc_libs
make[5]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
/usr/bin/python3 /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/share/petsc/examples/config/gmakegen.py --petsc-arch= --pkg-dir=/build/reproducible-path/slepc-3.24.2+dfsg1 --pkg-name=slepc --pkg-pkgs=sys,eps,svd,pep,nep,mfn,lme --pkg-arch=installed-arch-linux2-c-opt-64
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvbiorthogf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvbiorthogf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/impls/tensor/bvtensorf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/impls/tensor/bvtensorf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvcontourf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvcontourf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvfuncf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvkrylovf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvkrylovf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvopsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvglobalf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvglobalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvorthogf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvorthogf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/impls/hsvd/dshsvdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/impls/hsvd/dshsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/impls/nep/dsnepf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/impls/nep/dsnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/impls/svd/dssvdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/impls/svd/dssvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/impls/pep/dspepf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/impls/pep/dspepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/interface/dsprivf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/interface/dsprivf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/interface/dsbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/interface/dsbasicf.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/ftn-mod/slepcsysmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/ftn-mod/slepcsysmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/fn/impls/phi/fnphif.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/fn/impls/phi/fnphif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/fn/impls/combine/fncombinef.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/fn/impls/combine/fncombinef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/interface/dsopsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/interface/dsopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/fn/impls/rational/fnrationalf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/fn/impls/rational/fnrationalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/rg/impls/ellipse/rgellipsef.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/rg/impls/ellipse/rgellipsef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/rg/impls/interval/rgintervalf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/rg/impls/interval/rgintervalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/rg/impls/polygon/rgpolygonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/rg/impls/polygon/rgpolygonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/fn/interface/fnbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/fn/interface/fnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/rg/impls/ring/rgringf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/rg/impls/ring/rgringf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/impls/cayley/cayleyf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/impls/cayley/cayleyf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/impls/precond/precondf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/impls/precond/precondf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/impls/filter/filterf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/impls/filter/filterf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/rg/interface/rgbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/rg/interface/rgbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/impls/shell/shellf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/impls/shell/shellf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/interface/stsetf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/interface/stsetf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/interface/stfuncf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/interface/stfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/interface/stslesf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/interface/stslesf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/interface/stsolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/interface/stsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/finitf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/finitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/mat/matstructf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/mat/matstructf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/mat/matutilf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/mat/matutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/slepcinitf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/slepcinitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/slepcutilf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/slepcutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/vec/veccompf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/vec/veccompf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/vec/vecutilf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/vec/vecutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/slepcscf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/slepcscf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/contiguous/contig.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/impls/contiguous/contig.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/mat/bvmat.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/impls/mat/bvmat.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/svec/svec.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/impls/svec/svec.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/vecs/vecs.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/impls/vecs/vecs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/tensor/bvtensor.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/impls/tensor/bvtensor.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvbiorthog.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvbiorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvbasic.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvblas.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvblas.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvfunc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvfunc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvcontour.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvcontour.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvkrylov.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvglobal.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvglobal.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvlapack.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvlapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvops.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvops.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvregis.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/dsutil.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/dsutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghep/dsghep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/ghep/dsghep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvorthog.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghiep/hz.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/ghiep/hz.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghiep/dsghiep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/ghiep/dsghiep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/gnhep/dsgnhep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/gnhep/dsgnhep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/gsvd/dsgsvd.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/gsvd/dsgsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghiep/invit.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/ghiep/invit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/hep/bdc/dlaed3m.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/bdc/dlaed3m.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/hep/bdc/dibtdc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/bdc/dibtdc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/hep/bdc/dsbtdc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/bdc/dsbtdc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/hep/bdc/dmerg2.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/bdc/dmerg2.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/hep/bdc/dsrtdf.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/bdc/dsrtdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/nep/dsnep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/nep/dsnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/hsvd/dshsvd.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hsvd/dshsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/hep/dshep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/dshep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/nhep/dsnhep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/nhep/dsnhep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/nhepts/dsnhepts.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/nhepts/dsnhepts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/pep/dspep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/pep/dspep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/interface/dsbasic.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/interface/dsbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/interface/dsops.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/interface/dsops.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/interface/dspriv.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/interface/dspriv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/svd/dssvd.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/svd/dssvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/combine/fncombine.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/combine/fncombine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/fnutil.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/fnutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/invsqrt/fninvsqrt.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/invsqrt/fninvsqrt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/phi/fnphi.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/phi/fnphi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/log/fnlog.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/log/fnlog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/rational/fnrational.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/rational/fnrational.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/rational/ftn-custom/zrational.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/rational/ftn-custom/zrational.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/exp/fnexp.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/exp/fnexp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/interface/fnregis.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/interface/fnregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/ellipse/rgellipse.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/impls/ellipse/rgellipse.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/sqrt/fnsqrt.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/sqrt/fnsqrt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/interface/fnbasic.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/interface/fnbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/interval/rginterval.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/impls/interval/rginterval.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/interface/rgregis.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/interface/rgregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/ring/rgring.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/impls/ring/rgring.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/polygon/rgpolygon.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/impls/polygon/rgpolygon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/interface/rgbasic.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/interface/rgbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/cayley/cayley.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/cayley/cayley.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/filter/chebyshev.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/filter/chebyshev.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/filter/filter.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/filter/filter.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/shell/ftn-custom/zshell.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/shell/ftn-custom/zshell.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/precond/precond.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/precond/precond.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/shell/shell.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/shell/shell.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/sinvert/sinvert.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/sinvert/sinvert.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/shift/shift.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/shift/shift.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stregis.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stset.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stset.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stfunc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stfunc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stshellmat.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stshellmat.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/filter/filtlan.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/filter/filtlan.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stsles.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stsles.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/finit.c -o installed-arch-linux2-c-opt-64/obj/src/sys/finit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/dlregisslepc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/dlregisslepc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/ftn-custom/zstart.c -o installed-arch-linux2-c-opt-64/obj/src/sys/ftn-custom/zstart.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/mat/matstruct.c -o installed-arch-linux2-c-opt-64/obj/src/sys/mat/matstruct.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepccontour.c -o installed-arch-linux2-c-opt-64/obj/src/sys/slepccontour.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stsolve.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/mat/matutil.c -o installed-arch-linux2-c-opt-64/obj/src/sys/mat/matutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepcinit.c -o installed-arch-linux2-c-opt-64/obj/src/sys/slepcinit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/vec/pool.c -o installed-arch-linux2-c-opt-64/obj/src/sys/vec/pool.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepcutil.c -o installed-arch-linux2-c-opt-64/obj/src/sys/slepcutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepcsc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/slepcsc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/cg/rqcg/rqcgf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/cg/rqcg/rqcgf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/cg/lobpcg/lobpcgf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/cg/lobpcg/lobpcgf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/vec/vecutil.c -o installed-arch-linux2-c-opt-64/obj/src/sys/vec/vecutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/davidson/gd/gdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/davidson/gd/gdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/ciss/cissf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/ciss/cissf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/krylov/arnoldi/arnoldif.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/krylov/arnoldi/arnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/davidson/jd/jdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/davidson/jd/jdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/krylov/krylovschur/krylovschurf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/krylov/krylovschur/krylovschurf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/krylov/lanczos/lanczosf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/krylov/lanczos/lanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/lyapii/lyapiif.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/lyapii/lyapiif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/power/powerf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/power/powerf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/dlregisepsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/dlregisepsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/vec/veccomp.c -o installed-arch-linux2-c-opt-64/obj/src/sys/vec/veccomp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epsdefaultf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epsdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epsbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epsbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epssetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epssetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epsmonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epsmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epssolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epssolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epsoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epsoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epsviewf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epsviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/cg/rqcg/rqcg.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/cg/rqcg/rqcg.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/davidson.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/davidson.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/cg/lobpcg/lobpcg.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/cg/lobpcg/lobpcg.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdgd2.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdgd2.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdinitv.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdinitv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdcalcpairs.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdcalcpairs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdschm.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdschm.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/ciss/ciss.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/ciss/ciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdtestconv.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdtestconv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdutils.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdutils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdupdatev.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdupdatev.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdimprovex.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdimprovex.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/gd/gd.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/gd/gd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/jd/jd.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/jd/jd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/external/scalapack/scalapack.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/external/scalapack/scalapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/arnoldi/arnoldi.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/arnoldi/arnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/epskrylov.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/epskrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-indef.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ks-indef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-hamilt.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ks-hamilt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/krylovschur.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/krylovschur.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-twosided.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ks-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-bse.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ks-bse.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/lapack/lapack.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/lapack/lapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/lanczos/lanczos.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/lanczos/lanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/lyapii/lyapii.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/lyapii/lyapii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-slice.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ks-slice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/dlregiseps.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/dlregiseps.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsbasic.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/subspace/subspace.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/subspace/subspace.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/power/power.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/power/power.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsdefault.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsmon.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsregis.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsopts.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epssetup.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epssetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/ftn-custom/zepsf.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/ftn-custom/zepsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/impls/cyclic/cyclicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/impls/cyclic/cyclicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/impls/cross/crossf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/impls/cross/crossf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/impls/lanczos/gklanczosf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/impls/lanczos/gklanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epssolve.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epssolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/dlregissvdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/dlregissvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsview.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/impls/trlanczos/trlanczosf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/impls/trlanczos/trlanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svddefaultf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svddefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdsetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdsolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdmonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdviewf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/external/scalapack/svdscalap.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/external/scalapack/svdscalap.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/cross/cross.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/cross/cross.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/randomized/rsvd.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/randomized/rsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/lanczos/gklanczos.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/lanczos/gklanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/lapack/svdlapack.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/lapack/svdlapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/dlregissvd.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/dlregissvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/cyclic/cyclic.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/cyclic/cyclic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/ftn-custom/zsvdf.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/ftn-custom/zsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdbasic.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svddefault.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svddefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdmon.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdopts.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdregis.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdsolve.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdsetup.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/jd/pjdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/jd/pjdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/krylov/qarnoldi/qarnoldif.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/krylov/qarnoldi/qarnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdview.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/krylov/stoar/qslicef.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/krylov/stoar/qslicef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/krylov/stoar/stoarf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/krylov/stoar/stoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/krylov/toar/ptoarf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/krylov/toar/ptoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/dlregispepf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/dlregispepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/linear/linearf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/linear/linearf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepdefaultf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/trlanczos/trlanczos.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/trlanczos/trlanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepmonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepsetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepviewf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepsolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/pepkrylov.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/pepkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/qarnoldi/qarnoldi.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/qarnoldi/qarnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/stoar.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/stoar/stoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/qslice.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/stoar/qslice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/jd/pjd.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/jd/pjd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/toar/ptoar.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/toar/ptoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/linear/qeplin.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/linear/qeplin.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/dlregispep.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/dlregispep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/peputils.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/peputils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/ftn-custom/zpepf.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/ftn-custom/zpepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/toar/nrefine.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/toar/nrefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/linear/linear.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/linear/linear.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepbasic.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepdefault.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepregis.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepmon.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepopts.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepsetup.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/interpol/interpolf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/interpol/interpolf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepsolve.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/narnoldi/narnoldif.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/narnoldi/narnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/nleigs/nleigs-fullbf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/nleigs/nleigs-fullbf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/peprefine.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/peprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepview.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/dlregisnepf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/dlregisnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/slp/slpf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/slp/slpf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/rii/riif.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/rii/riif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/nleigs/nleigsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/nleigs/nleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepdefaultf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepmonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepresolvf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepresolvf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepsetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepsolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepviewf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/interpol/interpol.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/interpol/interpol.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/narnoldi/narnoldi.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/narnoldi/narnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/ftn-custom/znleigsf.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/nleigs/ftn-custom/znleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/nleigs-fullb.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/nleigs/nleigs-fullb.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/rii/rii.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/rii/rii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nepdefl.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/nepdefl.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/slp/slp.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/slp/slp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/slp/slp-twosided.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/slp/slp-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/dlregisnep.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/dlregisnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/ftn-custom/znepf.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/ftn-custom/znepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepdefault.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepbasic.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepmon.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/nleigs.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/nleigs/nleigs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepopts.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepregis.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepresolv.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepresolv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepsetup.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/dlregismfnf.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/dlregismfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/mfnbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/mfnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepsolve.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/mfnmonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/mfnmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/mfnsetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/mfnsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepview.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/neprefine.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/neprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/mfnoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/mfnoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/mfnsolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/mfnsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/dlregismfn.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/dlregismfn.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/impls/krylov/mfnkrylov.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/impls/krylov/mfnkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/ftn-custom/zmfnf.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/ftn-custom/zmfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/impls/expokit/mfnexpokit.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/impls/expokit/mfnexpokit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnbasic.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnmon.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnopts.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnregis.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnsetup.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnsolve.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/dlregislmef.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/dlregislmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmemonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmemonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmebasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmebasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmedensef.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmedensef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmesetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmesetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmesolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmesolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmeoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmeoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/ftn-custom/zlmef.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/ftn-custom/zlmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/dlregislme.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/dlregislme.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/impls/krylov/lmekrylov.c -o installed-arch-linux2-c-opt-64/obj/src/lme/impls/krylov/lmekrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmebasic.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmebasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmemon.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmemon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmeopts.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmeopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmeregis.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmeregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmedense.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmedense.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmesetup.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmesetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmesolve.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmesolve.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/ftn-mod/slepcbvmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/ftn-mod/slepcbvmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/ftn-mod/slepcrgmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/ftn-mod/slepcrgmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/ftn-mod/slepcfnmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/ftn-mod/slepcfnmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/ftn-mod/slepclmemod.F90 -o installed-arch-linux2-c-opt-64/obj/src/lme/ftn-mod/slepclmemod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/ftn-mod/slepcdsmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/ftn-mod/slepcdsmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/ftn-mod/slepcmfnmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/mfn/ftn-mod/slepcmfnmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/ftn-mod/slepcstmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/ftn-mod/slepcstmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/ftn-mod/slepcepsmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/eps/ftn-mod/slepcepsmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/ftn-mod/slepcsvdmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/svd/ftn-mod/slepcsvdmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/ftn-mod/slepcpepmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/pep/ftn-mod/slepcpepmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/ftn-mod/slepcnepmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/nep/ftn-mod/slepcnepmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/include
mpicc -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-z,relro -fPIC -shared -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-soname,libslepc64_real.so.3.24 -o installed-arch-linux2-c-opt-64/lib/libslepc64_real.so.3.24.2 @installed-arch-linux2-c-opt-64/lib/libslepc64_real.so.3.24.2.args -L/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc64_real -lHYPRE64 -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64 -lscalapack-openmpi -llapack -lblas -lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++
make[5]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
make[4]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
=========================================
Now to install the library do:
make SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real install
=========================================
make[2]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
dh_auto_build -plibslepc64-complex3.24-dev -- V=1 \
SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 \
PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex-64
make -j4 V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.2\+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex-64
make[2]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
sed: -e expression #1, char 47: unknown option to `s'
/usr/bin/bash: line 4: [: too many arguments
make[4]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
==========================================
Starting make run on sbuild at Sat, 31 Jan 2026 13:18:33 +0000
Machine characteristics: Linux sbuild 6.6.87-win2030 #2025.04.20.18.43+bb0c69aea SMP Sun Apr 20 18:58:14 UTC 2025 riscv64 GNU/Linux
-----------------------------------------
Using SLEPc directory: /build/reproducible-path/slepc-3.24.2+dfsg1
Using PETSc directory: /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex
Using PETSc arch: installed-arch-linux2-c-opt-complex-64
-----------------------------------------
SLEPC_VERSION_RELEASE 1
SLEPC_VERSION_MAJOR 3
SLEPC_VERSION_MINOR 24 SLEPC_VERSION_SUBMINOR 2 SLEPC_VERSION_DATE "Jan 20, 2026" SLEPC_VERSION_GIT "v3.24.2" SLEPC_VERSION_DATE_GIT "2026-01-20 12:27:56 +0100" ----------------------------------------- Using SLEPc configure options: --prefix=/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-complex --build-suffix=64 Using SLEPc configuration flags: #define SLEPC_PETSC_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex" #define SLEPC_PETSC_ARCH "" #define SLEPC_DIR "/build/reproducible-path/slepc-3.24.2+dfsg1" #define SLEPC_LIB_DIR "/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-complex/lib" #define SLEPC_HAVE_SCALAPACK 1 #define SLEPC_SCALAPACK_HAVE_UNDERSCORE 1 #define SLEPC_HAVE_PACKAGES ":scalapack:" ----------------------------------------- PETSC_VERSION_RELEASE 1 PETSC_VERSION_MAJOR 3 PETSC_VERSION_MINOR 24 PETSC_VERSION_SUBMINOR 3 PETSC_VERSION_DATE "Jan 01, 2026" PETSC_VERSION_GIT "v3.24.3" PETSC_VERSION_DATE_GIT "2026-01-01 17:01:02 -0600" ----------------------------------------- Using PETSc configure options: --build=riscv64-linux-gnu --prefix=/usr --includedir=/include --mandir=/share/man --infodir=/share/info --sysconfdir=/etc --localstatedir=/var --with-option-checking=0 --with-silent-rules=0 --libdir=/lib/riscv64-linux-gnu --runstatedir=/run --with-maintainer-mode=0 --with-dependency-tracking=0 --with-64-bit-indices --with-debugging=0 --with-scalar-type=complex --with-library-name-suffix=64_complex --with-shared-libraries --with-pic=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cxx-dialect=C++11 --with-opencl=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi --with-fftw=1 --with-fftw-include="[]" --with-fftw-lib="-lfftw3 -lfftw3_mpi" --with-yaml=1 --with-valgrind=1 --with-hdf5-include=/usr/include/hdf5/openmpi --with-hdf5-lib="-L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -lhdf5 -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi " --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch_64i --with-ptscotch-lib="-lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr" --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64" --with-suitesparse=1 --with-suitesparse-include=/usr/include/suitesparse --with-suitesparse-lib="-lspqr -lumfpack -lamd -lcholmod -lklu" --prefix=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex --PETSC_ARCH=riscv64-linux-gnu-complex-64 CFLAGS="-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" CXXFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" FCFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" FFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0" CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2" LDFLAGS="-Wl,-z,relro -fPIC" MAKEFLAGS= Using PETSc configuration flags: #define PETSC_ARCH "" #define PETSC_ATTRIBUTEALIGNED(size) __attribute((aligned(size))) #define PETSC_BLASLAPACK_UNDERSCORE 1 #define PETSC_CLANGUAGE_C 1 #define PETSC_CXX_RESTRICT __restrict #define PETSC_DEPRECATED_ENUM_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_FUNCTION_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_MACRO_BASE(string_literal_why) PETSC_DEPRECATED_MACRO_BASE_(GCC warning string_literal_why) #define PETSC_DEPRECATED_MACRO_BASE_(why) _Pragma(#why) #define PETSC_DEPRECATED_OBJECT_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_TYPEDEF_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex" #define PETSC_DIR_SEPARATOR '/' #define PETSC_FORTRAN_CHARLEN_T size_t #define PETSC_FUNCTION_NAME_C __func__ #define PETSC_FUNCTION_NAME_CXX __func__ #define PETSC_HAVE_ACCESS 1 #define PETSC_HAVE_ATOLL 1 #define PETSC_HAVE_ATTRIBUTEALIGNED 1 #define PETSC_HAVE_BUILTIN_EXPECT 1 #define PETSC_HAVE_BZERO 1 #define PETSC_HAVE_C99_COMPLEX 1 #define PETSC_HAVE_CLOCK 1 #define PETSC_HAVE_CXX 1 #define PETSC_HAVE_CXXABI_H 1 #define PETSC_HAVE_CXX_ATOMIC 1 #define PETSC_HAVE_CXX_COMPLEX 1 #define PETSC_HAVE_CXX_COMPLEX_FIX 1 #define PETSC_HAVE_CXX_DIALECT_CXX11 1 #define PETSC_HAVE_DLADDR 1 #define PETSC_HAVE_DLCLOSE 1 #define PETSC_HAVE_DLERROR 1 #define PETSC_HAVE_DLFCN_H 1 #define PETSC_HAVE_DLOPEN 1 #define PETSC_HAVE_DLSYM 1 #define PETSC_HAVE_DOUBLE_ALIGN_MALLOC 1 #define PETSC_HAVE_DRAND48 1 #define PETSC_HAVE_DYNAMIC_LIBRARIES 1 #define PETSC_HAVE_ERF 1 #define PETSC_HAVE_EXECUTABLE_EXPORT 1 #define PETSC_HAVE_FCNTL_H 1 #define PETSC_HAVE_FENV_H 1 #define PETSC_HAVE_FE_VALUES 1 #define PETSC_HAVE_FFTW 1 #define PETSC_HAVE_FLOAT_H 1 #define PETSC_HAVE_FORK 1 #define PETSC_HAVE_FORTRAN_FLUSH 1 #define PETSC_HAVE_FORTRAN_FREE_LINE_LENGTH_NONE 1 #define PETSC_HAVE_FORTRAN_TYPE_STAR 1 #define PETSC_HAVE_FORTRAN_UNDERSCORE 1 #define PETSC_HAVE_GETCWD 1 #define PETSC_HAVE_GETDOMAINNAME 1 #define PETSC_HAVE_GETHOSTBYNAME 1 #define PETSC_HAVE_GETHOSTNAME 1 #define PETSC_HAVE_GETPAGESIZE 1 #define PETSC_HAVE_GETRUSAGE 1 #define PETSC_HAVE_HDF5 1 #define PETSC_HAVE_INTTYPES_H 1 #define PETSC_HAVE_ISINF 1 #define PETSC_HAVE_ISNAN 1 #define PETSC_HAVE_ISNORMAL 1 #define PETSC_HAVE_LGAMMA 1 #define PETSC_HAVE_LINUX 1 #define PETSC_HAVE_LOG2 1 #define PETSC_HAVE_LSEEK 1 #define PETSC_HAVE_MALLOC_H 1 #define PETSC_HAVE_MEMMOVE 1 #define PETSC_HAVE_MKSTEMP 1 #define PETSC_HAVE_MPIEXEC_ENVIRONMENTAL_VARIABLE OMP #define PETSC_HAVE_MPIIO 1 #define PETSC_HAVE_MPI_COMBINER_CONTIGUOUS 1 #define PETSC_HAVE_MPI_COMBINER_DUP 1 #define PETSC_HAVE_MPI_COMBINER_NAMED 1 #define PETSC_HAVE_MPI_COUNT 1 #define PETSC_HAVE_MPI_F90MODULE 1 #define PETSC_HAVE_MPI_F90MODULE_VISIBILITY 1 #define PETSC_HAVE_MPI_FEATURE_DYNAMIC_WINDOW 1 #define PETSC_HAVE_MPI_GET_ACCUMULATE 1 #define PETSC_HAVE_MPI_GET_LIBRARY_VERSION 1 #define PETSC_HAVE_MPI_INIT_THREAD 1 #define PETSC_HAVE_MPI_INT64_T 1 #define PETSC_HAVE_MPI_LONG_DOUBLE 1 #define PETSC_HAVE_MPI_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_NONBLOCKING_COLLECTIVES 1 #define PETSC_HAVE_MPI_ONE_SIDED 1 #define PETSC_HAVE_MPI_PERSISTENT_NEIGHBORHOOD_COLLECTIVES 
1 #define PETSC_HAVE_MPI_PROCESS_SHARED_MEMORY 1 #define PETSC_HAVE_MPI_REDUCE_LOCAL 1 #define PETSC_HAVE_MPI_REDUCE_SCATTER_BLOCK 1 #define PETSC_HAVE_MPI_RGET 1 #define PETSC_HAVE_MPI_WIN_CREATE 1 #define PETSC_HAVE_MUMPS 1 #define PETSC_HAVE_NANOSLEEP 1 #define PETSC_HAVE_NETDB_H 1 #define PETSC_HAVE_NETINET_IN_H 1 #define PETSC_HAVE_NO_FINITE_MATH_ONLY 1 #define PETSC_HAVE_OPENCL 1 #define PETSC_HAVE_OPENMPI 1 #define PETSC_HAVE_PACKAGES ":amd:blaslapack:cholmod:fftw3:hdf5:klu:mathlib:mpi:mumps:opencl:pthread:ptscotch:regex:scalapack:spqr:umfpack:x11:yaml:" #define PETSC_HAVE_POPEN 1 #define PETSC_HAVE_POSIX_MEMALIGN 1 #define PETSC_HAVE_PTHREAD 1 #define PETSC_HAVE_PTHREAD_MUTEX 1 #define PETSC_HAVE_PTSCOTCH 1 #define PETSC_HAVE_PWD_H 1 #define PETSC_HAVE_RAND 1 #define PETSC_HAVE_READLINK 1 #define PETSC_HAVE_REALPATH 1 #define PETSC_HAVE_REGEX 1 #define PETSC_HAVE_RTLD_DEFAULT 1 #define PETSC_HAVE_RTLD_GLOBAL 1 #define PETSC_HAVE_RTLD_LAZY 1 #define PETSC_HAVE_RTLD_LOCAL 1 #define PETSC_HAVE_RTLD_NOW 1 #define PETSC_HAVE_SCALAPACK 1 #define PETSC_HAVE_SETJMP_H 1 #define PETSC_HAVE_SHMGET 1 #define PETSC_HAVE_SLEEP 1 #define PETSC_HAVE_SNPRINTF 1 #define PETSC_HAVE_SOCKET 1 #define PETSC_HAVE_SO_REUSEADDR 1 #define PETSC_HAVE_STDATOMIC_H 1 #define PETSC_HAVE_STDINT_H 1 #define PETSC_HAVE_STRCASECMP 1 #define PETSC_HAVE_STRINGS_H 1 #define PETSC_HAVE_STRUCT_SIGACTION 1 #define PETSC_HAVE_SUITESPARSE 1 #define PETSC_HAVE_SYS_PARAM_H 1 #define PETSC_HAVE_SYS_PROCFS_H 1 #define PETSC_HAVE_SYS_RESOURCE_H 1 #define PETSC_HAVE_SYS_SOCKET_H 1 #define PETSC_HAVE_SYS_TIMES_H 1 #define PETSC_HAVE_SYS_TIME_H 1 #define PETSC_HAVE_SYS_TYPES_H 1 #define PETSC_HAVE_SYS_UTSNAME_H 1 #define PETSC_HAVE_SYS_WAIT_H 1 #define PETSC_HAVE_TAU_PERFSTUBS 1 #define PETSC_HAVE_TGAMMA 1 #define PETSC_HAVE_TIME 1 #define PETSC_HAVE_TIME_H 1 #define PETSC_HAVE_UNAME 1 #define PETSC_HAVE_UNISTD_H 1 #define PETSC_HAVE_USLEEP 1 #define PETSC_HAVE_VA_COPY 1 #define PETSC_HAVE_VSNPRINTF 1 #define PETSC_HAVE_X 1 #define PETSC_HAVE_YAML 1 #define PETSC_HDF5_HAVE_PARALLEL 1 #define PETSC_HDF5_HAVE_SZLIB 1 #define PETSC_HDF5_HAVE_ZLIB 1 #define PETSC_INTPTR_T intptr_t #define PETSC_INTPTR_T_FMT "#" PRIxPTR #define PETSC_IS_COLORING_MAX USHRT_MAX #define PETSC_IS_COLORING_VALUE_TYPE short #define PETSC_IS_COLORING_VALUE_TYPE_F integer2 #define PETSC_LEVEL1_DCACHE_LINESIZE 64 #define PETSC_LIB_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/lib" #define PETSC_LIB_NAME_SUFFIX "64_complex" #define PETSC_MAX_PATH_LEN 4096 #define PETSC_MEMALIGN 16 #define PETSC_MPICC_SHOW "gcc -I/usr/lib/riscv64-linux-gnu/openmpi/include -I/usr/lib/riscv64-linux-gnu/openmpi/include/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi" #define PETSC_MPIU_IS_COLORING_VALUE_TYPE MPI_UNSIGNED_SHORT #define PETSC_OMAKE "/usr/bin/make --no-print-directory" #define PETSC_PREFETCH_HINT_NTA 0 #define PETSC_PREFETCH_HINT_T0 3 #define PETSC_PREFETCH_HINT_T1 2 #define PETSC_PREFETCH_HINT_T2 1 #define PETSC_PYTHON_EXE "/usr/bin/python3" #define PETSC_Prefetch(a,b,c) __builtin_prefetch((a),(b),(c)) #define PETSC_REPLACE_DIR_SEPARATOR '\\' #define PETSC_SIGNAL_CAST #define PETSC_SIZEOF_INT 4 #define PETSC_SIZEOF_LONG 8 #define PETSC_SIZEOF_LONG_LONG 8 #define PETSC_SIZEOF_SIZE_T 8 #define PETSC_SIZEOF_VOID_P 8 #define PETSC_SLSUFFIX "so" #define PETSC_UINTPTR_T uintptr_t #define PETSC_UINTPTR_T_FMT "#" PRIxPTR #define PETSC_UNUSED __attribute((unused)) #define PETSC_USE_64BIT_INDICES 1 #define PETSC_USE_AVX512_KERNELS 1 #define 
PETSC_USE_COMPLEX 1 #define PETSC_USE_CTABLE 1 #define PETSC_USE_DEBUGGER "gdb" #define PETSC_USE_DMLANDAU_2D 1 #define PETSC_USE_FORTRAN_BINDINGS 1 #define PETSC_USE_INFO 1 #define PETSC_USE_ISATTY 1 #define PETSC_USE_LOG 1 #define PETSC_USE_MALLOC_COALESCED 1 #define PETSC_USE_PROC_FOR_SIZE 1 #define PETSC_USE_REAL_DOUBLE 1 #define PETSC_USE_SHARED_LIBRARIES 1 #define PETSC_USE_SINGLE_LIBRARY 1 #define PETSC_USE_SOCKET_VIEWER 1 #define PETSC_USE_VISIBILITY_C 1 #define PETSC_USE_VISIBILITY_CXX 1 #define PETSC_USING_64BIT_PTR 1 #define PETSC_USING_F2003 1 #define PETSC_USING_F90FREEFORM 1 #define PETSC__BSD_SOURCE 1 #define PETSC__DEFAULT_SOURCE 1 #define PETSC__GNU_SOURCE 1 ----------------------------------------- Using C/C++ include paths: -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi Using C compile: mpicc -o gmakeinfo -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC C compiler version: gcc (Debian 15.2.0-12) 15.2.0 Using C++ compile: mpicxx -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -std=c++11 -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi C++ compiler version: g++ (Debian 15.2.0-12) 15.2.0 Using Fortran include/module paths: -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi Using Fortran compile: mpif90 -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi Fortran compiler version: GNU Fortran (Debian 15.2.0-12) 15.2.0 ----------------------------------------- Using C/C++ linker: mpicc Using C/C++ flags: -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC Using Fortran linker: mpif90 Using Fortran flags: -Wl,-z,relro -fPIC -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 ----------------------------------------- Using libraries: -L/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/lib -L/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/lib -lslepc64_complex -L/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc64_complex -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64 -lscalapack-openmpi -llapack -lblas -lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++ ------------------------------------------ Using mpiexec: /usr/bin/mpiexec --oversubscribe ------------------------------------------ Using MAKE: /usr/bin/make Default MAKEFLAGS: MAKE_NP:4 MAKE_LOAD:4.0 MAKEFLAGS: -j4 --jobserver-auth=fifo:/tmp/GMfifo7551 --no-print-directory -- V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex-64 ========================================== /usr/bin/make --print-directory -f gmakefile -l4.0 --output-sync=recurse V=1 slepc_libs make[5]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1' /usr/bin/python3 /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/share/petsc/examples/config/gmakegen.py --petsc-arch= --pkg-dir=/build/reproducible-path/slepc-3.24.2+dfsg1 --pkg-name=slepc --pkg-pkgs=sys,eps,svd,pep,nep,mfn,lme --pkg-arch=installed-arch-linux2-c-opt-complex-64 mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvbiorthogf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvbiorthogf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/impls/tensor/bvtensorf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/impls/tensor/bvtensorf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvcontourf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvcontourf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvglobalf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvglobalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvfuncf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvkrylovf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvkrylovf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvopsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvorthogf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvorthogf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/impls/hsvd/dshsvdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/impls/hsvd/dshsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/impls/nep/dsnepf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/impls/nep/dsnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/impls/svd/dssvdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/impls/svd/dssvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/impls/pep/dspepf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/impls/pep/dspepf.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/ftn-mod/slepcsysmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/ftn-mod/slepcsysmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/interface/dsprivf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/interface/dsprivf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/interface/dsbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/interface/dsbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/fn/impls/combine/fncombinef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/fn/impls/combine/fncombinef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/interface/dsopsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/interface/dsopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/fn/impls/rational/fnrationalf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/fn/impls/rational/fnrationalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/fn/impls/phi/fnphif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/fn/impls/phi/fnphif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/rg/impls/polygon/rgpolygonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/rg/impls/polygon/rgpolygonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/rg/impls/ellipse/rgellipsef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/rg/impls/ellipse/rgellipsef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/fn/interface/fnbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/fn/interface/fnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/rg/impls/interval/rgintervalf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/rg/impls/interval/rgintervalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/rg/impls/ring/rgringf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/rg/impls/ring/rgringf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/impls/cayley/cayleyf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/impls/cayley/cayleyf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/impls/filter/filterf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/impls/filter/filterf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/impls/shell/shellf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/impls/shell/shellf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/impls/precond/precondf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/impls/precond/precondf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/rg/interface/rgbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/rg/interface/rgbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/interface/stsetf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/interface/stsetf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/interface/stfuncf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/interface/stfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/finitf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/finitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/interface/stslesf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/interface/stslesf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/interface/stsolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/interface/stsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/mat/matstructf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/mat/matstructf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/mat/matutilf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/mat/matutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/slepcinitf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/slepcinitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/slepcutilf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/slepcutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/slepcscf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/slepcscf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/vec/veccompf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/vec/veccompf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/vec/vecutilf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/vec/vecutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/contiguous/contig.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/impls/contiguous/contig.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/mat/bvmat.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/impls/mat/bvmat.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/svec/svec.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/impls/svec/svec.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvbiorthog.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvbiorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/tensor/bvtensor.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/impls/tensor/bvtensor.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/impls/vecs/vecs.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/impls/vecs/vecs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvblas.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvblas.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvfunc.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvfunc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvkrylov.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvcontour.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvcontour.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvops.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvops.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvglobal.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvglobal.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvorthog.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvlapack.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvlapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/interface/bvregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/dsutil.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/dsutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghep/dsghep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/ghep/dsghep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghiep/hz.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/ghiep/hz.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/gnhep/dsgnhep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/gnhep/dsgnhep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghiep/invit.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/ghiep/invit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/ghiep/dsghiep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/ghiep/dsghiep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/gsvd/dsgsvd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/gsvd/dsgsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/hsvd/dshsvd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/hsvd/dshsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/hep/dshep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/hep/dshep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/nhepts/dsnhepts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/nhepts/dsnhepts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/pep/dspep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/pep/dspep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/nhep/dsnhep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/nhep/dsnhep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/nep/dsnep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/nep/dsnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/interface/dsbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/interface/dsbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/interface/dsops.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/interface/dsops.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/combine/fncombine.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/combine/fncombine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/impls/svd/dssvd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/svd/dssvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/interface/dspriv.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/interface/dspriv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/invsqrt/fninvsqrt.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/invsqrt/fninvsqrt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/fnutil.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/fnutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/phi/fnphi.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/phi/fnphi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/log/fnlog.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/log/fnlog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/rational/ftn-custom/zrational.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/rational/ftn-custom/zrational.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/rational/fnrational.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/rational/fnrational.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/interface/fnregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/interface/fnregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/sqrt/fnsqrt.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/sqrt/fnsqrt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/impls/exp/fnexp.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/exp/fnexp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/ellipse/rgellipse.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/impls/ellipse/rgellipse.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/interval/rginterval.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/impls/interval/rginterval.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/interface/fnbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/interface/fnbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/interface/rgregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/interface/rgregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/polygon/rgpolygon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/impls/polygon/rgpolygon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/impls/ring/rgring.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/impls/ring/rgring.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/interface/rgbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/interface/rgbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/cayley/cayley.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/cayley/cayley.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/filter/chebyshev.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/filter/chebyshev.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/filter/filter.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/filter/filter.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/shell/ftn-custom/zshell.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/shell/ftn-custom/zshell.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/precond/precond.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/precond/precond.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/shell/shell.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/shell/shell.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/shift/shift.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/shift/shift.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/sinvert/sinvert.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/sinvert/sinvert.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stfunc.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stfunc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stset.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stset.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/impls/filter/filtlan.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/filter/filtlan.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/dlregisslepc.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/dlregisslepc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/finit.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/finit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stsles.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stsles.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stshellmat.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stshellmat.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/ftn-custom/zstart.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/ftn-custom/zstart.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/mat/matstruct.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/mat/matstruct.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepcinit.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/slepcinit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/interface/stsolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepccontour.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/slepccontour.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/mat/matutil.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/mat/matutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepcsc.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/slepcsc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/slepcutil.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/slepcutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/vec/pool.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/vec/pool.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/cg/lobpcg/lobpcgf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/cg/lobpcg/lobpcgf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/vec/vecutil.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/vec/vecutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/cg/rqcg/rqcgf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/cg/rqcg/rqcgf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/ciss/cissf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/ciss/cissf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/krylov/arnoldi/arnoldif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/krylov/arnoldi/arnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/davidson/gd/gdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/davidson/gd/gdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/davidson/jd/jdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/davidson/jd/jdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/vec/veccomp.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/vec/veccomp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/krylov/lanczos/lanczosf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/krylov/lanczos/lanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/krylov/krylovschur/krylovschurf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/krylov/krylovschur/krylovschurf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/dlregisepsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/dlregisepsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/lyapii/lyapiif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/lyapii/lyapiif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/power/powerf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/power/powerf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epsdefaultf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epsdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epsbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epsbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epsmonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epsmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epsoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epsoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epssetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epssetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epssolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epssolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epsviewf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epsviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/davidson.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/davidson.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/cg/rqcg/rqcg.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/cg/rqcg/rqcg.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/cg/lobpcg/lobpcg.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/cg/lobpcg/lobpcg.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdgd2.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdgd2.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/ciss/ciss.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/ciss/ciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdcalcpairs.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdcalcpairs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdschm.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdschm.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdimprovex.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdimprovex.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdinitv.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdinitv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdtestconv.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdtestconv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdutils.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdutils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/dvdupdatev.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdupdatev.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/gd/gd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/gd/gd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/davidson/jd/jd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/jd/jd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/arnoldi/arnoldi.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/arnoldi/arnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/external/scalapack/scalapack.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/external/scalapack/scalapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/epskrylov.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/epskrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-indef.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ks-indef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-hamilt.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ks-hamilt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/krylovschur.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/krylovschur.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-bse.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ks-bse.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-twosided.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ks-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/lapack/lapack.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/lapack/lapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/lanczos/lanczos.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/lanczos/lanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/lyapii/lyapii.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/lyapii/lyapii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/power/power.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/power/power.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/dlregiseps.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/dlregiseps.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/krylov/krylovschur/ks-slice.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ks-slice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/impls/subspace/subspace.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/subspace/subspace.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsdefault.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsmon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epssolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epssolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epssetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epssetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/ftn-custom/zepsf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/ftn-custom/zepsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/impls/cross/crossf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/impls/cross/crossf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/impls/cyclic/cyclicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/impls/cyclic/cyclicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/interface/epsview.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/impls/lanczos/gklanczosf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/impls/lanczos/gklanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/impls/trlanczos/trlanczosf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/impls/trlanczos/trlanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/dlregissvdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/dlregissvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svddefaultf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svddefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdmonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdsetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdsolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdviewf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/external/scalapack/svdscalap.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/external/scalapack/svdscalap.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/cross/cross.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/cross/cross.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/lapack/svdlapack.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/lapack/svdlapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/randomized/rsvd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/randomized/rsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/lanczos/gklanczos.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/lanczos/gklanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/dlregissvd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/dlregissvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/ftn-custom/zsvdf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/ftn-custom/zsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/cyclic/cyclic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/cyclic/cyclic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svddefault.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svddefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdmon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdsetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdsolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/interface/svdview.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/krylov/qarnoldi/qarnoldif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/krylov/qarnoldi/qarnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/jd/pjdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/jd/pjdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/ciss/pcissf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/ciss/pcissf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/krylov/stoar/qslicef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/krylov/stoar/qslicef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/krylov/stoar/stoarf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/krylov/stoar/stoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/krylov/toar/ptoarf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/krylov/toar/ptoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/linear/linearf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/linear/linearf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/dlregispepf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/dlregispepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepdefaultf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepmonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepsetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepsolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepviewf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/impls/trlanczos/trlanczos.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/trlanczos/trlanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/pepkrylov.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/pepkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/qarnoldi/qarnoldi.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/qarnoldi/qarnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/ciss/pciss.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/ciss/pciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/stoar.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/stoar/stoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/jd/pjd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/jd/pjd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/stoar/qslice.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/stoar/qslice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/linear/qeplin.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/linear/qeplin.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/peputils.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/peputils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/dlregispep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/dlregispep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/toar/ptoar.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/toar/ptoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/linear/linear.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/linear/linear.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/ftn-custom/zpepf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/ftn-custom/zpepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepdefault.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepmon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/impls/krylov/toar/nrefine.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/toar/nrefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepsolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepsetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/peprefine.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/peprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/interpol/interpolf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/interpol/interpolf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/ciss/ncissf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/ciss/ncissf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/narnoldi/narnoldif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/narnoldi/narnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/nleigs/nleigs-fullbf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/nleigs/nleigs-fullbf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/nleigs/nleigsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/nleigs/nleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/rii/riif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/rii/riif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/slp/slpf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/slp/slpf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/interface/pepview.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/dlregisnepf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/dlregisnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepdefaultf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepmonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepresolvf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepresolvf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepsetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepsolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepviewf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/interpol/interpol.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/interpol/interpol.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/narnoldi/narnoldi.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/narnoldi/narnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/ftn-custom/znleigsf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/nleigs/ftn-custom/znleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/ciss/nciss.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/ciss/nciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/nleigs-fullb.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/nleigs/nleigs-fullb.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/rii/rii.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/rii/rii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nepdefl.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/nepdefl.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/slp/slp.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/slp/slp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/dlregisnep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/dlregisnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/slp/slp-twosided.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/slp/slp-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/ftn-custom/znepf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/ftn-custom/znepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepdefault.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepmon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/impls/nleigs/nleigs.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/nleigs/nleigs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepresolv.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepresolv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepsetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/dlregismfnf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/dlregismfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/mfnbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/mfnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepsolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/mfnmonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/mfnmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/neprefine.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/neprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/interface/nepview.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/mfnsetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/mfnsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/mfnoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/mfnoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/mfnsolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/mfnsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/dlregismfn.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/dlregismfn.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/ftn-custom/zmfnf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/ftn-custom/zmfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/impls/expokit/mfnexpokit.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/impls/expokit/mfnexpokit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/impls/krylov/mfnkrylov.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/impls/krylov/mfnkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnmon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnsetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/dlregislmef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/dlregislmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/interface/mfnsolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmedensef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmedensef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmebasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmebasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmemonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmemonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmesetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmesetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmeoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmeoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmesolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmesolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/ftn-custom/zlmef.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/ftn-custom/zlmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/dlregislme.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/dlregislme.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/impls/krylov/lmekrylov.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/impls/krylov/lmekrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmebasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmebasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmemon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmemon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmeregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmeregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmeopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmeopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmesetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmesetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmedense.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmedense.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/interface/lmesolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmesolve.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/fn/ftn-mod/slepcfnmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/ftn-mod/slepcfnmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/bv/ftn-mod/slepcbvmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/ftn-mod/slepcbvmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/rg/ftn-mod/slepcrgmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/ftn-mod/slepcrgmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/lme/ftn-mod/slepclmemod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/ftn-mod/slepclmemod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/ds/ftn-mod/slepcdsmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/ftn-mod/slepcdsmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/mfn/ftn-mod/slepcmfnmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/ftn-mod/slepcmfnmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/sys/classes/st/ftn-mod/slepcstmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/ftn-mod/slepcstmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/eps/ftn-mod/slepcepsmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/ftn-mod/slepcepsmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/svd/ftn-mod/slepcsvdmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/ftn-mod/slepcsvdmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/pep/ftn-mod/slepcpepmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/ftn-mod/slepcpepmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.2+dfsg1/include -I/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.2+dfsg1/src/nep/ftn-mod/slepcnepmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/ftn-mod/slepcnepmod.o -J/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/include mpicc -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-z,relro -fPIC -shared -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.3+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-soname,libslepc64_complex.so.3.24 -o installed-arch-linux2-c-opt-complex-64/lib/libslepc64_complex.so.3.24.2 @installed-arch-linux2-c-opt-complex-64/lib/libslepc64_complex.so.3.24.2.args -L/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc64_complex -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64 -lscalapack-openmpi -llapack -lblas -lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++
make[5]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
make[4]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
=========================================
Now to install the library do:
make SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex install
=========================================
make[2]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
make[1]: Leaving directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
debian/rules override_dh_auto_test
make[1]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
set -e; \
if [ "yes" = "no" ]; then \
  echo Tests have been disabled on riscv64; \
else \
  dh_auto_test -plibslepc-real3.24-dev -- \
    SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 \
    PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt \
    LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/lib \
    OMP_NUM_THREADS=1 \
    MPIEXEC="mpiexec --oversubscribe --allow-run-as-root"; \
  dh_auto_test -plibslepc-complex3.24-dev -- \
    SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 \
    PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex \
    LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex/lib \
    OMP_NUM_THREADS=1 \
    MPIEXEC="mpiexec --oversubscribe --allow-run-as-root" ; \
  dh_auto_test -plibslepc64-real3.24-dev -- \
    SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 \
    PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt-64 \
    LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-64/lib \
    OMP_NUM_THREADS=1 \
    MPIEXEC="mpiexec --oversubscribe --allow-run-as-root"; \
  dh_auto_test -plibslepc64-complex3.24-dev -- \
    SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 \
    PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex-64 \
    LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt-complex-64/lib \
    OMP_NUM_THREADS=1 \
    MPIEXEC="mpiexec --oversubscribe --allow-run-as-root" ; \
fi
make -j4 test TESTSUITEFLAGS="-j4 --verbose" VERBOSE=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.2\+dfsg1
PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.2\+dfsg1/installed-arch-linux2-c-opt/lib OMP_NUM_THREADS=1 MPIEXEC="mpiexec --oversubscribe --allow-run-as-root"
make[2]: Entering directory '/build/reproducible-path/slepc-3.24.2+dfsg1'
Using MAKEFLAGS: -j4 --jobserver-auth=fifo:/tmp/GMfifo9662 -- MPIEXEC=mpiexec --oversubscribe --allow-run-as-root OMP_NUM_THREADS=1 LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.2+dfsg1/installed-arch-linux2-c-opt/lib PETSC_ARCH=installed-arch-linux2-c-opt PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real SLEPC_DIR=/build/reproducible-path/slepc-3.24.2+dfsg1 VERBOSE=1 TESTSUITEFLAGS=-j4 --verbose
Use "/usr/bin/make V=1" to see verbose compile lines, "/usr/bin/make V=0" to suppress.
RM test-rm-sys.cu
RM test-rm-sys.cxx
RM test-rm-sys.F
FC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test1f.o
FC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test14f.o
FC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test1f.o
FC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test7f.o
FC installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test1f.o
RM test-rm-sys.kokkos.cxx
RM test-rm-sys.hip.cpp
RM test-rm-sys.sycl.cxx
RM test-rm-sys.raja.cxx
RM test-rm-eps.cu
RM test-rm-eps.cxx
RM test-rm-eps.F
FC installed-arch-linux2-c-opt/tests/eps/tests/test14f.o
FC installed-arch-linux2-c-opt/tests/eps/tests/test15f.o
FC installed-arch-linux2-c-opt/tests/eps/tests/test17f.o
FC installed-arch-linux2-c-opt/tests/eps/tests/test7f.o
FC installed-arch-linux2-c-opt/tests/eps/tutorials/ex10f.o
FC installed-arch-linux2-c-opt/tests/eps/tutorials/ex1f.o
FC installed-arch-linux2-c-opt/tests/eps/tutorials/ex6f.o
RM test-rm-eps.kokkos.cxx
RM test-rm-eps.hip.cpp
RM test-rm-eps.sycl.cxx
RM test-rm-eps.raja.cxx
RM test-rm-svd.cu
RM test-rm-svd.cxx
RM test-rm-svd.F
FC installed-arch-linux2-c-opt/tests/svd/tests/test4f.o
FC installed-arch-linux2-c-opt/tests/svd/tutorials/ex15f.o
RM test-rm-svd.kokkos.cxx
RM test-rm-svd.hip.cpp
RM test-rm-svd.sycl.cxx
RM test-rm-svd.raja.cxx
RM test-rm-pep.cu
RM test-rm-pep.cxx
RM test-rm-pep.F
FC installed-arch-linux2-c-opt/tests/pep/tests/test3f.o
FC installed-arch-linux2-c-opt/tests/pep/tutorials/ex16f.o
RM test-rm-pep.kokkos.cxx
RM test-rm-pep.hip.cpp
RM test-rm-pep.sycl.cxx
RM test-rm-pep.raja.cxx
RM test-rm-nep.cu
RM test-rm-nep.cxx
RM test-rm-nep.F
FC installed-arch-linux2-c-opt/tests/nep/tests/test2f.o
FC installed-arch-linux2-c-opt/tests/nep/tutorials/ex20f.o
FC installed-arch-linux2-c-opt/tests/nep/tutorials/ex22f.o
FC installed-arch-linux2-c-opt/tests/nep/tutorials/ex27f.o
FC installed-arch-linux2-c-opt/tests/nep/tutorials/ex54f.o
RM test-rm-nep.kokkos.cxx
RM test-rm-nep.hip.cpp
RM test-rm-nep.sycl.cxx
RM test-rm-nep.raja.cxx
RM test-rm-mfn.cu
RM test-rm-mfn.cxx
RM test-rm-mfn.F
FC installed-arch-linux2-c-opt/tests/mfn/tests/test3f.o
FC installed-arch-linux2-c-opt/tests/mfn/tutorials/ex23f.o
RM test-rm-mfn.kokkos.cxx
RM test-rm-mfn.hip.cpp
RM test-rm-mfn.sycl.cxx
RM test-rm-mfn.raja.cxx
RM test-rm-lme.cu
RM test-rm-lme.cxx
RM test-rm-lme.F
RM test-rm-lme.F90
RM test-rm-lme.kokkos.cxx
RM test-rm-lme.hip.cpp
RM test-rm-lme.sycl.cxx
RM test-rm-lme.raja.cxx
CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test1.o
CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test10.o
CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test11.o
CC
installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test12.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test13.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test14.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test15.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test16.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test17.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test18.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test19.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test2.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test3.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test4.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test5.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test6.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test7.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test8.o CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test9.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test1.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test12.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test13.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test15.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test16.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test17.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test18.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test19.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test2.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test20.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test21.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test22.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test23.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test24.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test25.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test26.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test27.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test3.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test4.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test5.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test6.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test7.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test8.o CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test9.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test1.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test10.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test11.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test12.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test13.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test2.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test3.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test4.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test5.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test6.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test7.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test8.o CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test9.o CC 
installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test1.o CC installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test2.o CC installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test3.o CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test1.o CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test2.o CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test3.o CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test4.o CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test5.o CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test6.o CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test7.o CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test8.o CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test9.o CC installed-arch-linux2-c-opt/tests/sys/mat/tests/test1.o CC installed-arch-linux2-c-opt/tests/sys/tests/test1.o CC installed-arch-linux2-c-opt/tests/sys/tests/test3.o CC installed-arch-linux2-c-opt/tests/sys/tests/test4.o CC installed-arch-linux2-c-opt/tests/sys/tutorials/ex33.o CC installed-arch-linux2-c-opt/tests/sys/vec/tests/test1.o FLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test1f FLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test14f FLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test1f FLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test7f FLINKER installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test1f CC installed-arch-linux2-c-opt/tests/eps/tests/test1.o CC installed-arch-linux2-c-opt/tests/eps/tests/test10.o CC installed-arch-linux2-c-opt/tests/eps/tests/test11.o CC installed-arch-linux2-c-opt/tests/eps/tests/test12.o CC installed-arch-linux2-c-opt/tests/eps/tests/test13.o CC installed-arch-linux2-c-opt/tests/eps/tests/test14.o CC installed-arch-linux2-c-opt/tests/eps/tests/test16.o CC installed-arch-linux2-c-opt/tests/eps/tests/test17.o CC installed-arch-linux2-c-opt/tests/eps/tests/test18.o CC installed-arch-linux2-c-opt/tests/eps/tests/test19.o CC installed-arch-linux2-c-opt/tests/eps/tests/test2.o CC installed-arch-linux2-c-opt/tests/eps/tests/test20.o CC installed-arch-linux2-c-opt/tests/eps/tests/test21.o CC installed-arch-linux2-c-opt/tests/eps/tests/test22.o CC installed-arch-linux2-c-opt/tests/eps/tests/test23.o CC installed-arch-linux2-c-opt/tests/eps/tests/test24.o CC installed-arch-linux2-c-opt/tests/eps/tests/test25.o CC installed-arch-linux2-c-opt/tests/eps/tests/test26.o CC installed-arch-linux2-c-opt/tests/eps/tests/test27.o CC installed-arch-linux2-c-opt/tests/eps/tests/test28.o CC installed-arch-linux2-c-opt/tests/eps/tests/test29.o CC installed-arch-linux2-c-opt/tests/eps/tests/test3.o CC installed-arch-linux2-c-opt/tests/eps/tests/test30.o CC installed-arch-linux2-c-opt/tests/eps/tests/test31.o CC installed-arch-linux2-c-opt/tests/eps/tests/test32.o CC installed-arch-linux2-c-opt/tests/eps/tests/test37.o CC installed-arch-linux2-c-opt/tests/eps/tests/test38.o CC installed-arch-linux2-c-opt/tests/eps/tests/test39.o CC installed-arch-linux2-c-opt/tests/eps/tests/test4.o CC installed-arch-linux2-c-opt/tests/eps/tests/test40.o CC installed-arch-linux2-c-opt/tests/eps/tests/test44.o CC installed-arch-linux2-c-opt/tests/eps/tests/test5.o CC installed-arch-linux2-c-opt/tests/eps/tests/test6.o CC installed-arch-linux2-c-opt/tests/eps/tests/test8.o CC installed-arch-linux2-c-opt/tests/eps/tests/test9.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex10.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex11.o CC 
installed-arch-linux2-c-opt/tests/eps/tutorials/ex12.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex13.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex18.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex19.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex2.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex24.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex25.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex29.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex3.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex30.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex31.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex34.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex35.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex36.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex4.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex41.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex43.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex44.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex46.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex47.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex49.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex5.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex55.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex56.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex57.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex7.o CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex9.o FLINKER installed-arch-linux2-c-opt/tests/eps/tests/test14f FLINKER installed-arch-linux2-c-opt/tests/eps/tests/test15f FLINKER installed-arch-linux2-c-opt/tests/eps/tests/test17f FLINKER installed-arch-linux2-c-opt/tests/eps/tests/test7f FLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex10f FLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex1f FLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex6f CC installed-arch-linux2-c-opt/tests/svd/tests/test1.o CC installed-arch-linux2-c-opt/tests/svd/tests/test10.o CC installed-arch-linux2-c-opt/tests/svd/tests/test11.o CC installed-arch-linux2-c-opt/tests/svd/tests/test12.o CC installed-arch-linux2-c-opt/tests/svd/tests/test14.o CC installed-arch-linux2-c-opt/tests/svd/tests/test15.o CC installed-arch-linux2-c-opt/tests/svd/tests/test16.o CC installed-arch-linux2-c-opt/tests/svd/tests/test18.o CC installed-arch-linux2-c-opt/tests/svd/tests/test19.o CC installed-arch-linux2-c-opt/tests/svd/tests/test2.o CC installed-arch-linux2-c-opt/tests/svd/tests/test20.o CC installed-arch-linux2-c-opt/tests/svd/tests/test3.o CC installed-arch-linux2-c-opt/tests/svd/tests/test4.o CC installed-arch-linux2-c-opt/tests/svd/tests/test5.o CC installed-arch-linux2-c-opt/tests/svd/tests/test6.o CC installed-arch-linux2-c-opt/tests/svd/tests/test7.o CC installed-arch-linux2-c-opt/tests/svd/tests/test8.o CC installed-arch-linux2-c-opt/tests/svd/tests/test9.o CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex14.o CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex15.o CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex45.o CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex48.o CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex51.o CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex52.o CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex53.o CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex8.o CC installed-arch-linux2-c-opt/tests/svd/tutorials/cnetwork/network.o FLINKER 
installed-arch-linux2-c-opt/tests/svd/tests/test4f FLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex15f CC installed-arch-linux2-c-opt/tests/pep/tests/test1.o CC installed-arch-linux2-c-opt/tests/pep/tests/test10.o CC installed-arch-linux2-c-opt/tests/pep/tests/test11.o CC installed-arch-linux2-c-opt/tests/pep/tests/test12.o CC installed-arch-linux2-c-opt/tests/pep/tests/test2.o CC installed-arch-linux2-c-opt/tests/pep/tests/test3.o CC installed-arch-linux2-c-opt/tests/pep/tests/test4.o CC installed-arch-linux2-c-opt/tests/pep/tests/test5.o CC installed-arch-linux2-c-opt/tests/pep/tests/test6.o CC installed-arch-linux2-c-opt/tests/pep/tests/test7.o CC installed-arch-linux2-c-opt/tests/pep/tests/test8.o CC installed-arch-linux2-c-opt/tests/pep/tests/test9.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex16.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex17.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex28.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex38.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex40.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex50.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/acoustic_wave_1d.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/acoustic_wave_2d.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/butterfly.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/damped_beam.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/loaded_string.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/planar_waveguide.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/sleeper.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/spring.o CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/wiresaw.o FLINKER installed-arch-linux2-c-opt/tests/pep/tests/test3f FLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex16f CC installed-arch-linux2-c-opt/tests/nep/tests/test1.o CC installed-arch-linux2-c-opt/tests/nep/tests/test10.o CC installed-arch-linux2-c-opt/tests/nep/tests/test12.o CC installed-arch-linux2-c-opt/tests/nep/tests/test13.o CC installed-arch-linux2-c-opt/tests/nep/tests/test14.o CC installed-arch-linux2-c-opt/tests/nep/tests/test15.o CC installed-arch-linux2-c-opt/tests/nep/tests/test16.o CC installed-arch-linux2-c-opt/tests/nep/tests/test17.o CC installed-arch-linux2-c-opt/tests/nep/tests/test2.o CC installed-arch-linux2-c-opt/tests/nep/tests/test3.o CC installed-arch-linux2-c-opt/tests/nep/tests/test4.o CC installed-arch-linux2-c-opt/tests/nep/tests/test5.o CC installed-arch-linux2-c-opt/tests/nep/tests/test6.o CC installed-arch-linux2-c-opt/tests/nep/tests/test7.o CC installed-arch-linux2-c-opt/tests/nep/tests/test8.o CC installed-arch-linux2-c-opt/tests/nep/tests/test9.o CC installed-arch-linux2-c-opt/tests/nep/tutorials/ex20.o CC installed-arch-linux2-c-opt/tests/nep/tutorials/ex21.o CC installed-arch-linux2-c-opt/tests/nep/tutorials/ex22.o CC installed-arch-linux2-c-opt/tests/nep/tutorials/ex27.o CC installed-arch-linux2-c-opt/tests/nep/tutorials/ex42.o CC installed-arch-linux2-c-opt/tests/nep/tutorials/nlevp/loaded_string.o FLINKER installed-arch-linux2-c-opt/tests/nep/tests/test2f FLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex20f FLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex22f FLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex27f FLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex54f CC installed-arch-linux2-c-opt/tests/mfn/tests/test1.o CC installed-arch-linux2-c-opt/tests/mfn/tests/test2.o 
CC installed-arch-linux2-c-opt/tests/mfn/tests/test3.o CC installed-arch-linux2-c-opt/tests/mfn/tests/test4.o CC installed-arch-linux2-c-opt/tests/mfn/tests/test5.o CC installed-arch-linux2-c-opt/tests/mfn/tutorials/ex23.o CC installed-arch-linux2-c-opt/tests/mfn/tutorials/ex26.o CC installed-arch-linux2-c-opt/tests/mfn/tutorials/ex37.o CC installed-arch-linux2-c-opt/tests/mfn/tutorials/ex39.o FLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test3f FLINKER installed-arch-linux2-c-opt/tests/mfn/tutorials/ex23f CC installed-arch-linux2-c-opt/tests/lme/tests/test1.o CC installed-arch-linux2-c-opt/tests/lme/tests/test2.o CC installed-arch-linux2-c-opt/tests/lme/tutorials/ex32.o CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test1 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test10 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test11 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test12 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test13 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test14 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test15 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test16 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test17 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test18 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test19 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test5 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test6 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test7 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test8 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test9 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test1 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test12 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test13 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test15 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test16 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test17 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test18 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test19 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test20 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test21 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test22 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test23 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test24 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test25 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test26 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test27 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test5 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test6 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test7 CLINKER 
installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test8 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test9 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test1 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test10 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test11 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test12 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test13 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test5 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test6 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test7 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test8 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test9 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test1 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test1 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test5 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test6 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test7 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test8 CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test9 CLINKER installed-arch-linux2-c-opt/tests/sys/mat/tests/test1 CLINKER installed-arch-linux2-c-opt/tests/sys/tests/test1 CLINKER installed-arch-linux2-c-opt/tests/sys/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/sys/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/sys/tutorials/ex33 CLINKER installed-arch-linux2-c-opt/tests/sys/vec/tests/test1 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_1_bv_type-vecs.counts not ok sys_classes_bv_tests-test1f_1_bv_type-vecs # Error code: 14 # [sbuild:11242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11242] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbdd2b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11245] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11245] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test1f_1_bv_type-vecs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_1_bv_type-contiguous.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_1_bv_type-svec.counts not ok sys_classes_bv_tests-test1f_1_bv_type-contiguous # Error code: 14 # [sbuild:11273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11273] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9b82e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11276] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11276] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test1f_1_bv_type-contiguous # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_1_bv_type-mat.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_2_bv_type-vecs.counts not ok sys_classes_bv_tests-test1f_1_bv_type-svec # Error code: 14 # [sbuild:11284] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11284] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11284] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11284] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11284] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11284] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11284] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f89313000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11289] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11289] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test1f_1_bv_type-svec # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_2_bv_type-contiguous.counts not ok sys_classes_bv_tests-test1f_1_bv_type-mat # Error code: 14 # [sbuild:11314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11314] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb8098000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11319] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11319] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test1f_1_bv_type-mat # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_2_bv_type-svec.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_2_bv_type-mat.counts not ok sys_classes_bv_tests-test1f_2_bv_type-svec # Error code: 14 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb68ac000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11397] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11391] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:11395] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11397] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11395] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-11391@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test1f_2_bv_type-svec # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test14f_1.counts not ok sys_classes_bv_tests-test1f_2_bv_type-vecs # Error code: 14 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb2452000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11354] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11340] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:11353] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11353] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11354] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-11340@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test1f_2_bv_type-vecs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test1f_1.counts not ok sys_classes_ds_tests-test14f_1 # Error code: 14 # [sbuild:11444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11444] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb5d12000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11465] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11465] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test14f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7f_1.counts not ok sys_classes_fn_tests-test1f_1 # Error code: 14 # [sbuild:11472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11472] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f85da5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11475] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11475] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test1f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test1f_1.counts not ok sys_classes_bv_tests-test1f_2_bv_type-contiguous # Error code: 14 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f88448000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11362] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11356] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:11361] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11362] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11361] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-11356@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test1f_2_bv_type-contiguous # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test1 not ok sys_classes_fn_tests-test7f_1+fn_method-0 # Error code: 14 # [sbuild:11509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11509] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8af5b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11527] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11527] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7f_1 # SKIP Command failed so no diff not ok sys_classes_rg_tests-test1f_1 # Error code: 14 # [sbuild:11532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11532] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8f844000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11539] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11539] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test1f_1 # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test10 not ok sys_classes_bv_tests-test1f_2_bv_type-mat # Error code: 14 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb90b1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11407] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:11415] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:11414] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11415] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11414] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-11407@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test1f_2_bv_type-mat # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test11 not ok sys_classes_fn_tests-test7f_1+fn_method-1 # Error code: 14 # [sbuild:11572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11572] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fac8bd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11580] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11580] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7f_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test7f_1+fn_method-2 # Error code: 14 # [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8b7a4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11634] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11634] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- ok sys_classes_fn_tests-test7f_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test7f_1+fn_method-3 # Error code: 14 # [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort.
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f82465000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11651] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11651] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test7f_1 # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test12 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test13 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test14 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test16 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test17 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test18 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test19 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test20 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test21 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test22 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test23 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test24 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test25 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test26 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test27 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test28 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test29 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test30 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test31 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test32 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test37 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test38 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test39 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test40 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test44 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test5 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test6 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test8 CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test9 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex10 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex11 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex12 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex13 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex18 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex19 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex2 
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex24 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex25 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex29 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex3 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex30 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex31 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex34 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex35 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex36 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex4 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex41 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex43 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex44 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex46 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex47 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex49 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex5 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex55 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex56 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex57 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex7 CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex9 TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test14f_1.counts not ok eps_tests-test14f_1 # Error code: 14 # [sbuild:11922] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11922] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11922] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11922] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11922] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11922] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11922] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fa9f35000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11925] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11925] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test14f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test15f_1.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test17f_1.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test7f_1.counts not ok eps_tests-test15f_1 # Error code: 14 # [sbuild:11953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11953] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f83d44000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11963] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11963] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test15f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10f_1_sinvert.counts not ok eps_tests-test7f_1 # Error code: 14 # [sbuild:11974] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:11974] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:11974] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:11974] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:11974] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:11974] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:11974] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9022f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:11981] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:11981] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test7f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10f_1_shell.counts not ok eps_tutorials-ex10f_1_sinvert # Error code: 14 # [sbuild:12011] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12011] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12011] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12011] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12011] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12011] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12011] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f98c3e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12020] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12020] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex10f_1_sinvert # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex1f_1.counts not ok eps_tutorials-ex10f_1_shell+eps_two_sided-0 # Error code: 14 # [sbuild:12039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12039] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fafa08000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12043] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12043] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex10f_1_shell # SKIP Command failed so no diff not ok eps_tutorials-ex1f_1 # Error code: 14 # [sbuild:12069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12069] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8a7ef000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12074] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12074] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex1f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex6f_1.counts not ok eps_tutorials-ex10f_1_shell+eps_two_sided-1 # Error code: 14 # [sbuild:12086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12086] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa7264000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12089] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12089] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex10f_1_shell # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex6f_1_ts.counts CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test1 not ok eps_tutorials-ex6f_1 # Error code: 14 # [sbuild:12118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12118] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f965d9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12138] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12138] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex6f_1 # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test10 not ok eps_tutorials-ex6f_1_ts # Error code: 14 # [sbuild:12146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12146] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f92ef8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12149] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12149] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex6f_1_ts # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test11 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test12 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test14 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test15 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test16 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test18 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test19 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test20 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test5 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test6 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test7 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test8 CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test9 CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex14 CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex15 CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex45 CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex48 CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex51 CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex52 CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex53 CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex8 CC installed-arch-linux2-c-opt/tests/svd/tutorials/cnetwork/embedgsvd.o TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4f_1.counts not ok svd_tests-test4f_1+svd_type-lanczos # Error code: 14 # [sbuild:12303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12303] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f945d9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12306] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12306] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test4f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex15f_1.counts CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test1 not ok svd_tests-test4f_1+svd_type-trlanczos # Error code: 14 # [sbuild:12320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12320] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb0573000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12323] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12323] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test4f_1 # SKIP Command failed so no diff not ok svd_tutorials-ex15f_1 # Error code: 14 # [sbuild:12335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12335] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f91a85000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12338] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12338] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tutorials-ex15f_1 # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test10 not ok svd_tests-test4f_1+svd_type-cross # Error code: 14 # [sbuild:12352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12352] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9c37c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12375] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12375] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test4f_1 # SKIP Command failed so no diff not ok svd_tests-test4f_1+svd_type-cyclic # Error code: 14 # [sbuild:12392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12392] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb744c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12395] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12395] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test4f_1 # SKIP Command failed so no diff not ok svd_tests-test4f_1+svd_type-randomized # Error code: 14 # [sbuild:12409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12409] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fab81d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12412] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12412] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test4f_1 # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test11 CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test12 CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test5 CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test6 CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test7 CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test8 CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test9 CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex16 CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex17 CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex28 CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex38 CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex40 CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex50 CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/acoustic_wave_1d CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/acoustic_wave_2d CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/butterfly CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/damped_beam CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/loaded_string CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/planar_waveguide CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/sleeper CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/spring CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/wiresaw TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test3f_1.counts not ok pep_tests-test3f_1 # Error code: 14 # [sbuild:12539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12539] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb12a3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12542] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12542] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test3f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16f_1.counts CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test1 not ok pep_tutorials-ex16f_1 # Error code: 14 # [sbuild:12573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12573] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f90295000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12576] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12576] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials-ex16f_1 # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test10 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test12 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test13 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test14 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test15 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test16 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test17 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test5 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test6 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test7 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test8 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test9 CLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex20 CLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex21 CLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex22 CLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex27 CLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex42 CLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/nlevp/loaded_string TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test2f_1.counts not ok nep_tests-test2f_1 # Error code: 14 # [sbuild:12687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12687] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb5beb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12690] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12690] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tests-test2f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex20f_1.counts TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22f_1.counts not ok nep_tutorials-ex20f_1 # Error code: 14 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb8422000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12722] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12722] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tutorials-ex20f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27f_1.counts not ok nep_tutorials-ex22f_1 # Error code: 14 # [sbuild:12728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12728] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb9893000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12731] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12731] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27f_2.counts not ok nep_tutorials-ex27f_1 # Error code: 14 # [sbuild:12765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12765] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f85f18000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12783] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12783] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex54f_1_slp.counts not ok nep_tutorials-ex27f_2 # Error code: 14 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f98b00000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12791] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12791] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27f_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex54f_1_nleigs.counts not ok nep_tutorials-ex54f_1_slp # Error code: 14 # [sbuild:12826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12826] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbee8e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12843] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12843] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex54f_1_slp # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test1 not ok nep_tutorials-ex54f_1_nleigs # Error code: 14 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f91223000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12851] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12851] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex54f_1_nleigs # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test5 CLINKER installed-arch-linux2-c-opt/tests/mfn/tutorials/ex23 CLINKER installed-arch-linux2-c-opt/tests/mfn/tutorials/ex26 CLINKER installed-arch-linux2-c-opt/tests/mfn/tutorials/ex37 CLINKER installed-arch-linux2-c-opt/tests/mfn/tutorials/ex39 TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test3f_1.counts TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex23f_1.counts CLINKER installed-arch-linux2-c-opt/tests/lme/tests/test1 not ok mfn_tests-test3f_1 # Error code: 14 not ok mfn_tutorials-ex23f_1 # Error code: 14 # [sbuild:12937] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12937] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12937] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12937] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12937] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12937] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12937] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbc740000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12944] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12944] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok mfn_tests-test3f_1 # SKIP Command failed so no diff # [sbuild:12941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12941] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa69c8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12947] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12947] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tutorials-ex23f_1 # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/lme/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/lme/tutorials/ex32 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_1_bv_type-vecs.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_1_bv_type-contiguous.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_1_bv_type-svec.counts not ok sys_classes_bv_tests-test1_1_bv_type-vecs # Error code: 14 # [sbuild:13005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13005] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8c2c6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13008] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13008] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test1_1_bv_type-vecs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_1_bv_type-mat.counts not ok sys_classes_bv_tests-test1_1_bv_type-svec # Error code: 14 not ok sys_classes_bv_tests-test1_1_bv_type-contiguous # Error code: 14 # [sbuild:13026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13026] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9739d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13043] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13043] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_bv_tests-test1_1_bv_type-svec # SKIP Command failed so no diff # [sbuild:13025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13025] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8210b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13046] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13046] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test1_1_bv_type-contiguous # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_2_bv_type-vecs.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_2_bv_type-contiguous.counts not ok sys_classes_bv_tests-test1_1_bv_type-mat # Error code: 14 # [sbuild:13057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13057] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb8b8a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13060] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13060] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test1_1_bv_type-mat # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_2_bv_type-svec.counts not ok sys_classes_bv_tests-test1_2_bv_type-vecs # Error code: 14 # [sbuild:13111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13111] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa2e8b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13132] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13132] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test1_2_bv_type-vecs # SKIP Command failed so no diff not ok sys_classes_bv_tests-test1_2_bv_type-contiguous # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_2_bv_type-mat.counts # [sbuild:13120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13120] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f92fbc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13140] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13140] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test1_2_bv_type-contiguous # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test10_1.counts not ok sys_classes_bv_tests-test1_2_bv_type-svec # Error code: 14 # [sbuild:13147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13147] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f7f860000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13150] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13150] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test1_2_bv_type-svec # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_1.counts not ok sys_classes_bv_tests-test1_2_bv_type-mat # Error code: 14 # [sbuild:13199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13199] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f910a3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13220] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13220] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test1_2_bv_type-mat # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_4.counts not ok sys_classes_bv_tests-test10_1+bv_type-vecs # Error code: 14 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8910d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13211] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13232] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13230] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13232] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13230] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13211@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test10_1 # SKIP Command failed so no diff # retrying sys_classes_bv_tests-test11_1+bv_type-vecs_bv_orthog_block-gs not ok sys_classes_bv_tests-test10_1+bv_type-contiguous # Error code: 14 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fae46d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13297] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13304] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13303] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13304] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:13303] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13297@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test10_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test10_1+bv_type-svec # Error code: 14 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbf278000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13320] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13324] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13323] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13323] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13324] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13320@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test10_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_4+bv_type-vecs_bv_orthog_block-gs # Error code: 14 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3faa733000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13277] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13281] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13280] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13280] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:13281] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13277@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test10_1+bv_type-mat # Error code: 14 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb5139000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13340] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13343] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13344] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13343] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13344] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13340@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test10_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_6.counts not ok sys_classes_bv_tests-test11_4+bv_type-vecs_bv_orthog_block-chol # Error code: 14 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f87294000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13360] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13356] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13359] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13360] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13356@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_4+bv_type-vecs_bv_orthog_block-svqb # Error code: 14 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fbdce1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13412] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13409] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13413] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13412] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13409@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-vecs_bv_orthog_block-gs # Error code: 14 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f84416000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13393] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13400] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13401] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13401] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13400] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13393@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_4+bv_type-contiguous_bv_orthog_block-gs # Error code: 14 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9d9f5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13433] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13437] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13436] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13436] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13437] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13433@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-vecs_bv_orthog_block-chol # Error code: 14 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa0fcb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13468] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13459] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13469] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13468] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13459@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:13459] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171 # ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_4+bv_type-contiguous_bv_orthog_block-chol # Error code: 14 # [sbuild:13465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13465] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa50ed000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13472] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13472] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:13465] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 102 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13465@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-vecs_bv_orthog_block-tsqr # Error code: 14 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13496] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fab4a7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13508] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13509] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13509] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13508] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13496@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_4+bv_type-contiguous_bv_orthog_block-svqb # Error code: 14 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13505] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f94626000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13512] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13513] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13513] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13512] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13505@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-vecs_bv_orthog_block-tsqrchol # Error code: 14 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f874f9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13550] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13537] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13552] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13550] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13552] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13537@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_4+bv_type-svec_bv_orthog_block-gs # Error code: 14 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fba036000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13545] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13551] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13553] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13551] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13553] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13545@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-vecs_bv_orthog_block-svqb # Error code: 14 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9bea2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13591] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13577] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13590] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13591] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13590] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13577@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_4+bv_type-svec_bv_orthog_block-chol # Error code: 14 not ok sys_classes_bv_tests-test11_1+bv_type-vecs_bv_orthog_block-gs # Error code: 14 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fad0e9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13593] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13585] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13592] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13593] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13585@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fadc1e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13602] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13605] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13606] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13606] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13605] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13602@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff # retrying sys_classes_bv_tests-test11_1+bv_type-vecs_bv_orthog_block-chol not ok sys_classes_bv_tests-test11_6+bv_type-contiguous_bv_orthog_block-gs # Error code: 14 # [sbuild:13629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13629] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fae1a5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13658] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13658] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13629@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:13629] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 102 ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_4+bv_type-svec_bv_orthog_block-svqb # Error code: 14 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f83e1b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13654] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13644] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13656] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13654] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13656] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13644@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-contiguous_bv_orthog_block-chol # Error code: 14 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13685] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa8d33000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13701] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13700] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13701] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
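Since the same MPI_Init_thread failure repeats across many bv_type / bv_orthog_block variants, a quick tally of the TAP-style result lines is easier to read than the individual error blocks. The sketch below assumes the raw log keeps one TAP line per physical line and uses build.log as a placeholder filename.

    # Rough tally of TAP-style results; "build.log" is a placeholder for this log file.
    grep -c '^not ok '                         build.log   # failing test variants (Error code: 14)
    grep -c '# SKIP Command failed so no diff' build.log   # diff steps skipped after a failed command
    grep -c '^ok '                             build.log   # results reported ok (includes the SKIPs)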
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13700] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13685@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_4+bv_type-mat_bv_orthog_block-gs # Error code: 14 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb90f3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13697] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13705] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13704] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13705] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13704] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13697@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-contiguous_bv_orthog_block-tsqr # Error code: 14 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8a049000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13740] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13740] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13730] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13741] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13730@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_4+bv_type-mat_bv_orthog_block-chol # Error code: 14 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbc2d2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13745] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13737] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13744] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13745] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13737@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-contiguous_bv_orthog_block-tsqrchol # Error code: 14 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13765] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3faa032000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13780] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13781] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13781] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13780] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13765@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_4+bv_type-mat_bv_orthog_block-svqb # Error code: 14 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9f0b3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13784] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13777] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13785] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13784] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:13785] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13777@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_9.counts not ok sys_classes_bv_tests-test11_6+bv_type-contiguous_bv_orthog_block-svqb # Error code: 14 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa0e6b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13806] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13829] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13829] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:13831] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13806@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:13806] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171 # ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-svec_bv_orthog_block-gs # Error code: 14 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f95d25000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13858] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13861] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13862] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13861] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13862] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
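One pattern worth noting in the shmem2 messages: every Requested Address (0x2aaa...) lies far above 0x3fffffffff, while every Acquired Address sits just below it. A plausible reading, offered here only as an assumption, is that the riscv64 build host runs with sv39 paging, whose user address space ends at 2^38, so mmap can never honour the requested hint. The arithmetic can be checked in plain shell, using the address pair from the failure above:

    # Assumption: user virtual addresses on an sv39 riscv64 kernel end at 2^38 - 1.
    limit=$(( 1 << 38 ))                 # 0x4000000000
    echo $(( 0x2aaab8000000 >= limit ))  # prints 1: requested hint is outside that range
    echo $(( 0x3f95d25000  <  limit ))   # prints 1: acquired address is just inside it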
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13858@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-svec_bv_orthog_block-chol # Error code: 14 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8f2c1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13878] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13882] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13881] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13882] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13881] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13878@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-svec_bv_orthog_block-tsqr # Error code: 14 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13898] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa5de8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13901] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13902] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13902] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13901] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13898@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-svec_bv_orthog_block-tsqrchol # Error code: 14 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb2b0d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13918] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13921] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13922] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13921] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:13922] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13918@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-vecs_bv_orthog_block-chol # Error code: 14 not ok sys_classes_bv_tests-test11_6+bv_type-svec_bv_orthog_block-svqb # Error code: 14 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f997ad000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13943] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13938] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13942] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13943] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13942] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
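The recurring pair of lines, "not ok <test>+<variant> # Error code: 14" followed by "ok <test> # SKIP Command failed so no diff", reflects a harness that only diffs the test output against its reference when the command itself succeeded. The sketch below is purely illustrative of that pattern and is not SLEPc's actual test driver; the function name and paths are invented.

    # Illustrative only: not SLEPc's real harness, just the pattern visible above.
    run_variant() {            # $1 = test name, $2 = variant options (hypothetical helper)
      ./"$1" $2 > "$1.out" 2>&1
      rc=$?
      if [ "$rc" -ne 0 ]; then
        echo "not ok $1+$2 # Error code: $rc"
        echo "ok $1 # SKIP Command failed so no diff"
      else
        diff -q "$1.out" "ref/$1.out" >/dev/null && echo "ok $1" || echo "not ok $1 # diff failed"
      fi
    }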
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13938@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fae1e2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13946] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13939] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13947] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13946] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13939@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-mat_bv_orthog_block-gs # Error code: 14 not ok sys_classes_bv_tests-test11_1+bv_type-vecs_bv_orthog_block-tsqr # Error code: 14 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f81b1a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13977] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13985] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13984] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13985] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13984] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13977@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f87620000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13979] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13986] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13987] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13987] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13986] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13979@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-vecs_bv_orthog_block-tsqrchol # Error code: 14 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9226e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14024] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14025] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14025] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14024] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14019@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-vecs_bv_orthog_block-svqb # Error code: 14 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb9206000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14050] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14047] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14051] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14050] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
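Every failure above follows the same pattern: the PMIx gds/shmem2 component asks for a shared-memory mapping at a fixed base address (the "Requested Address"), the mapping comes back at a different address (the "Acquired Address"), PMIx_Init then fails with PMIX_ERROR, the MPI ranks abort before MPI_Init_thread completes, and prterun reports exit code 14. The diagnostic itself names a workaround: force the hash gds component instead of shmem2. A minimal sketch of how that suggestion could be tried when re-running one of the failing cases by hand; the test directory, binary name, process count and options below are illustrative guesses reconstructed from the test names, not commands taken from this log:

  export PMIX_MCA_gds=hash                                     # workaround quoted from the diagnostic above
  cd src/sys/classes/bv/tests                                  # assumed location of the BV tests in the SLEPc tree
  mpiexec -n 2 ./test11 -bv_type vecs -bv_orthog_block tsqr    # hypothetical re-run of one failing variant

If the PMIX ERROR lines disappear with the hash component, the failure would be confined to gds/shmem2's fixed-address attach rather than to SLEPc itself.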
# [sbuild:14051] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14047@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-mat_bv_orthog_block-chol # Error code: 14 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fabb97000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14026] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14018] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14026] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:14027] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-14018@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-contiguous_bv_orthog_block-gs # Error code: 14 not ok sys_classes_bv_tests-test11_6+bv_type-mat_bv_orthog_block-tsqr # Error code: 14 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f951fa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14083] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14067] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14081] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14083] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14067@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:14067] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171 # ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14082] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9ffbe000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14087] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14086] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** and MPI will try to terminate your MPI job as well) # [sbuild:14086] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14087] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14082@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-contiguous_bv_orthog_block-chol # Error code: 14 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8c64a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14117] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14124] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14127] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14124] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14127] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-14117@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-mat_bv_orthog_block-tsqrchol # Error code: 14 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f97b38000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14126] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14126] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14119] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14125] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14119@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-contiguous_bv_orthog_block-tsqr # Error code: 14 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8e6a7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14149] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14163] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14164] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14163] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14164] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-14149@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-mat_bv_orthog_block-svqb # Error code: 14 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa8763000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14167] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14159] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14166] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14167] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
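The remaining variants fail in the same way, so the individual transcripts differ mostly in process IDs and mapping addresses; the useful signal is which TAP cases ended in "not ok ... Error code: 14" and which distinct PMIx error codes were hit. A short sketch of how a log like this could be condensed after the fact, assuming one TAP result per line; the filename is a placeholder, not the name of any file produced by this build:

  LOG=slepc_3.24.2+dfsg1-1_riscv64.build                       # placeholder for the captured build log
  grep -c 'not ok ' "$LOG"                                     # number of failing test cases
  grep 'not ok ' "$LOG" | sort -u                              # which cases failed, de-duplicated
  grep -o 'PMIX ERROR: [A-Z_]*' "$LOG" | sort | uniq -c        # distinct PMIx error codes and their frequency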
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14159@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_11.counts not ok sys_classes_bv_tests-test11_11+bv_type-vecs # Error code: 14 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3faaf35000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14220] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14221] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14216] PMIX 
ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14219] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14221] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14220] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14216@1,2] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_11 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-contiguous_bv_orthog_block-tsqrchol # Error code: 14 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9f4c9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14213] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14189] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14212] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14213] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-14189@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-contiguous_bv_orthog_block-svqb # Error code: 14 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9e231000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14286] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14289] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14290] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14290] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14289] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14286@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_11+bv_type-contiguous # Error code: 14 # [sbuild:14251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14251] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9abc1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14256] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14256] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:14251] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 # [sbuild:14251] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 # [sbuild:14251] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 # [sbuild:14251] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 ok sys_classes_bv_tests-test11_11 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_11+bv_type-svec # Error code: 14 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9aa09000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14325] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14327] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file 
../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14329] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14328] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14320] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14330] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14329] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14325] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14327] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14330] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14320@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_11 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-svec_bv_orthog_block-gs # Error code: 14 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa4fd3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14321] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14306] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:14322] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14321] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14322] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-14306@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_11+bv_type-mat # Error code: 14 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fa94c7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14357] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14366] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14360] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14366] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
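The variant names encode the options each run adds, for example sys_classes_bv_tests-test11_11+bv_type-svec appears to be test11 from the BV class test directory run with -bv_type svec. Under that reading, a single failing variant could be re-run by hand once the workaround is in place; the sketch below assumes that name-to-option mapping, the binary location, and a two-rank launch, none of which are stated in this log.

    # Hypothetical manual re-run of one failing variant; binary path,
    # rank count and the option mapping are assumptions.
    PMIX_MCA_gds=hash mpirun -n 2 ./test11 -bv_type svec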
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14360] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14357@1,6] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_11 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_12.counts not ok sys_classes_bv_tests-test11_12+bv_type-vecs # Error code: 14 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f81c37000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14419] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14415] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14418] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14416] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at 
line 2460 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14410] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14421] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14419] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14415] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14418] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14416] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-14410@1,4] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_12 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-svec_bv_orthog_block-chol # Error code: 14 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fabc05000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14411] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14391] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14411] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # [sbuild:14412] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14391@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-svec_bv_orthog_block-tsqr # Error code: 14 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14482] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb3ec1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14486] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14485] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14485] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:14486] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14482@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_12+bv_type-contiguous # Error code: 14 # [sbuild:14447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14447] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8bdc2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14453] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14453] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:14447] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 # [sbuild:14447] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 ok sys_classes_bv_tests-test11_12 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_12+bv_type-svec # Error code: 14 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8cbb5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14522] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14523] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14522] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14514] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14523] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:14521] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-14514@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_12 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-svec_bv_orthog_block-tsqrchol # Error code: 14 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb1062000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14519] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14519] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14502] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-14502@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_12+bv_type-mat # Error code: 14 # [sbuild:14551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14551] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb5957000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14556] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14556] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
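One pattern worth noting: every requested base address starts with 0x2aaa, which is above 2^45, while every address mmap actually returns is below 0x4000000000, the top of user space under RISC-V Sv39 paging. If this builder's kernel only exposes Sv39, a fixed-address attach that high can never succeed, which would explain why the mismatch hits every test rather than being intermittent. The paging mode is easy to check; the field name below is what riscv64 kernels normally print in /proc/cpuinfo, though that is an assumption about this particular host.

    # Does the build host run Sv39, Sv48 or Sv57 paging? (field name assumed)
    grep -m1 mmu /proc/cpuinfo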
# [sbuild:14551] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 # [sbuild:14551] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 ok sys_classes_bv_tests-test11_12 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-svec_bv_orthog_block-svqb # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test12_1.counts # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14586] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f89c29000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14590] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14589] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14590] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14589] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14586@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-mat_bv_orthog_block-gs # Error code: 14 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14625] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb9ff5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14635] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14634] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14634] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
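The shmem2 and hash names refer to components of the gds (generalized data store) framework inside OpenPMIx itself, not to anything in Open MPI or SLEPc, so what matters for the workaround is the component set of the libpmix that prterun loads. Listing the gds components would confirm the hash fallback is actually available; pmix_info is the OpenPMIx counterpart of ompi_info, but whether the build environment installs it is an assumption.

    # List the gds components known to the installed PMIx; 'hash' should
    # appear alongside 'shmem2' (availability of pmix_info is assumed).
    pmix_info | grep -i gds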
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14635] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14625@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test12_1+bv_type-vecs # Error code: 14 # [sbuild:14631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14631] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9aca0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14638] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14638] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
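The PMIx_Init failure text recommends running under "mpirun", but these tests are already launched through prterun, the PRRTE launcher that backs mpirun in Open MPI 5; the missing piece is getting the gds setting to both the server and the clients, not the launcher itself. Besides the environment variable, the parameter can ride on the launcher command line; the --pmixmca spelling below is how PRRTE-based launchers pass PMIx-level parameters, but treat it, like the test invocation, as an assumption for the exact versions on this builder.

    # Same workaround, carried by the launcher instead of the environment
    # (--pmixmca spelling and the test invocation are assumptions).
    mpirun --pmixmca gds hash -n 2 ./test12 -bv_type vecs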
ok sys_classes_bv_tests-test12_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test12_1+bv_type-contiguous # Error code: 14 # [sbuild:14668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14668] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f7fb7f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14673] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14673] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test12_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test12_1+bv_type-svec # Error code: 14 # [sbuild:14693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14693] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb1a3e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14696] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14696] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test12_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-mat_bv_orthog_block-chol # Error code: 14 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14662] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8de52000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14675] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14674] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14675] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14674] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14662@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test12_1+bv_type-mat # Error code: 14 # [sbuild:14710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14710] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9b48f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14718] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14718] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test12_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test13_1.counts not ok sys_classes_bv_tests-test11_1+bv_type-mat_bv_orthog_block-tsqr # Error code: 14 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa6afa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14728] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14729] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14729] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14728] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-14725@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test13_1+bv_type-vecs # Error code: 14 # [sbuild:14765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14765] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fad30c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14775] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14775] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test13_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test13_1+bv_type-contiguous # Error code: 14 # [sbuild:14797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14797] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9a4bb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14800] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14800] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test13_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-mat_bv_orthog_block-tsqrchol # Error code: 14 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f85d53000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14778] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14772] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14779] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14778] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14779] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-14772@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test13_1+bv_type-svec # Error code: 14 # [sbuild:14814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14814] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb7d5a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14828] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14828] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test13_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_1+bv_type-mat_bv_orthog_block-svqb # Error code: 14 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f89fbc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14829] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14832] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14833] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14832] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14833] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14829@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test14_1.counts not ok sys_classes_bv_tests-test13_1+bv_type-mat # Error code: 14 # [sbuild:14851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14851] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9a58a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14871] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14871] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test13_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test14_2.counts not ok sys_classes_bv_tests-test14_2 # Error code: 14 # [sbuild:14914] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14914] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14914] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14914] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14914] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14914] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14914] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f89451000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14917] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14917] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_bv_tests-test14_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test15_1.counts not ok sys_classes_bv_tests-test14_1+bv_type-vecs # Error code: 14 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14879] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f96692000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14883] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14882] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14883] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14882] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14879@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test14_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test14_1+bv_type-contiguous # Error code: 14 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f88b30000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14960] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14963] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14964] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14964] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14963] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14960@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test14_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test15_1+bv_type-vecs # Error code: 14 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3faedbe000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14944] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14950] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14951] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14950] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14951] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-14944@1,0] # Exit code: 14 # -------------------------------------------------------------------------- not ok sys_classes_bv_tests-test14_1+bv_type-svec # Error code: 14 ok sys_classes_bv_tests-test15_1 # SKIP Command failed so no diff # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fad6ca000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14984] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14987] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14988] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14987] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14988] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14984@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test14_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test15_1+bv_type-contiguous # Error code: 14 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15012] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fafdbb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15020] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15019] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15019] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15020] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15012@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test15_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test14_1+bv_type-mat # Error code: 14 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f95822000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15016] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15023] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15024] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15023] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15024] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-15016@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test14_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test15_2.counts not ok sys_classes_bv_tests-test15_1+bv_type-svec # Error code: 14 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15047] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbaadd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15070] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15069] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15070] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:15069] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15047@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test15_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test15_1+bv_type-mat # Error code: 14 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8d5ab000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15097] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15100] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15101] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15101] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15100] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15097@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test15_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test16_1.counts not ok sys_classes_bv_tests-test15_2+bv_type-vecs # Error code: 14 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbc391000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15073] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15076] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15077] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15077] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15076] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-15073@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test15_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test15_2+bv_type-contiguous # Error code: 14 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f95803000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15146] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15149] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15150] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15149] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15150] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
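The repeated gds/shmem2 warning above already names a workaround: disable the shared-memory datastore by putting PMIX_MCA_gds=hash into the environment. A minimal C sketch of the idea follows; it is illustrative only and not part of the SLEPc test suite, and in practice the variable is exported in the environment that launches prterun/mpirun rather than set from inside the program, since a setenv() call here only reaches the client side of PMIx.

    #include <stdlib.h>   /* setenv */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        /* Workaround quoted in the log: ask PMIx to use the hash datastore
         * instead of gds/shmem2.  Setting it in-process is only a sketch; the
         * usual route is to export PMIX_MCA_gds=hash before the launcher runs. */
        setenv("PMIX_MCA_gds", "hash", 1);

        int provided = 0;
        MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
        printf("MPI initialized, provided thread level %d\n", provided);
        MPI_Finalize();
        return 0;
    }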
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15146@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test15_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test16_1+bv_type-vecs # Error code: 14 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9d1a9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15135] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15130] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15135] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:15136] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15130@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test16_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test16_1+bv_type-contiguous # Error code: 14 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8b0bf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15190] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15194] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15193] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15193] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:15194] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15190@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test16_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test15_2+bv_type-svec # Error code: 14 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f884da000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15170] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15174] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15173] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15173] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:15174] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-15170@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test15_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test16_1+bv_type-svec # Error code: 14 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb00aa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15210] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15213] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15214] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15213] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:15214] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
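Each failing test ends in the same "PMIx_Init failed" banner: the processes started by prterun cannot reach a local PMIx server, so Open MPI aborts before MPI_INIT completes. A small standalone probe along these lines can separate that condition from the SLEPc tests themselves; the program is a sketch, assumes the PMIx headers and library are installed, and uses only the basic client calls (PMIx_Init, PMIx_Error_string, PMIx_Finalize).

    #include <stdio.h>
    #include <pmix.h>

    /* Try to connect to the local PMIx server, mirroring what Open MPI does
     * inside MPI_Init_thread.  Run directly it should fail as in the log;
     * run under mpirun/prterun it should succeed. */
    int main(void)
    {
        pmix_proc_t myproc;
        pmix_status_t rc = PMIx_Init(&myproc, NULL, 0);
        if (rc != PMIX_SUCCESS) {
            fprintf(stderr, "PMIx_Init failed: %s\n", PMIx_Error_string(rc));
            return 1;
        }
        printf("connected as %s rank %u\n", myproc.nspace, (unsigned)myproc.rank);
        PMIx_Finalize(NULL, 0);
        return 0;
    }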
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15210@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test16_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test15_2+bv_type-mat # Error code: 14 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15230] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f98ffc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15234] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15233] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15233] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15234] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated.
The first process to do so was: # # Process name: [prterun-sbuild-15230@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test15_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test17_1.counts not ok sys_classes_bv_tests-test16_1+bv_type-mat # Error code: 14 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15250] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3faa228000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15261] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15263] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15263] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15261] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15250@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test16_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test17_2.counts not ok sys_classes_bv_tests-test17_1+bv_type-vecs # Error code: 14 # [sbuild:15279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15279] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f91411000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15282] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15282] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_bv_tests-test17_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test17_2+bv_type-vecs # Error code: 14 not ok sys_classes_bv_tests-test17_1+bv_type-contiguous # Error code: 14 # [sbuild:15313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15313] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb14ac000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15324] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15324] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test17_2 # SKIP Command failed so no diff # [sbuild:15328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15328] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f97432000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15331] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15331] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test17_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test17_2+bv_type-contiguous # Error code: 14 not ok sys_classes_bv_tests-test17_1+bv_type-svec # Error code: 14 # [sbuild:15358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15358] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9a369000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15365] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15365] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test17_2 # SKIP Command failed so no diff # [sbuild:15359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15359] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fae360000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15364] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15364] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test17_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test17_2+bv_type-svec # Error code: 14 not ok sys_classes_bv_tests-test17_1+bv_type-mat # Error code: 14 # [sbuild:15392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15392] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f88602000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15398] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15398] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:15393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15393] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15393] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f84243000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15399] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15399] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test17_2 # SKIP Command failed so no diff ok sys_classes_bv_tests-test17_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test18_1.counts not ok sys_classes_bv_tests-test17_2+bv_type-mat # Error code: 14 # [sbuild:15426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15426] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f816c5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15443] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15443] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test17_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test18_1+bv_type-vecs_nsize-1 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test19_1.counts # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa0c0e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15446] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15446] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test19_1+nc-0_bv_type-svec_nsize-1 # Error code: 14 # [sbuild:15487] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15487] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15487] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15487] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15487] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15487] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15487] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3faf042000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15494] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15494] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test19_1+nc-0_bv_type-svec_nsize-2 # Error code: 14 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f86f97000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15512] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15516] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15515] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15515] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:15516] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-15512@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test18_1+bv_type-vecs_nsize-2 # Error code: 14 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15481] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb9135000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15490] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15491] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15491] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15490] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
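The "*** MPI_ERRORS_ARE_FATAL" lines repeated throughout the run reflect Open MPI's default error handler: any error on the initial communicator aborts the process. The sketch below is not taken from the SLEPc sources; it shows the standard way to switch MPI_COMM_WORLD to MPI_ERRORS_RETURN after a successful init so that later errors come back as return codes, and it cannot intercept the aborts seen here, which happen before MPI_INIT completes.

    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        /* Default is MPI_ERRORS_ARE_FATAL, as quoted in the log.  After init,
         * errors on MPI_COMM_WORLD can be turned into return codes instead. */
        MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

        int rc = MPI_Barrier(MPI_COMM_WORLD);
        if (rc != MPI_SUCCESS) {
            char msg[MPI_MAX_ERROR_STRING];
            int len = 0;
            MPI_Error_string(rc, msg, &len);
            fprintf(stderr, "MPI_Barrier failed: %s\n", msg);
        }

        MPI_Finalize();
        return 0;
    }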
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15481@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:15481] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171 # ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test19_1+nc-0_bv_type-mat_nsize-1 # Error code: 14 # [sbuild:15532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15532] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fabba1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15547] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15547] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test18_1+bv_type-contiguous_nsize-1 # Error code: 14 # [sbuild:15544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15544] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f8fc61000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15550] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15550] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test19_1+nc-0_bv_type-mat_nsize-2 # Error code: 14 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15568] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb522b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15583] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15585] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15585] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15583] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15568@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test19_1+nc-2_bv_type-svec_nsize-1 # Error code: 14 # [sbuild:15606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15606] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb868c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15609] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15609] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test18_1+bv_type-contiguous_nsize-2 # Error code: 14 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8d27b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15578] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15584] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15586] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15584] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15586] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15578@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test18_1+bv_type-svec_nsize-1 # Error code: 14 # [sbuild:15639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15639] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8ab30000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15642] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15642] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
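The PMIx_Init failure that follows each of these cases is a direct consequence: once gds/shmem2 has given up, the client process cannot reach the PMIx server that prterun started, so Open MPI aborts before MPI_INIT. The suggestion to use "mpirun" applies when a binary is started outside any launcher; here the tests already run under prterun, so the hint mostly restates that a reachable PMIx server is required. A hedged example of reproducing one two-rank case by hand; the binary path and the -bv_type option are inferred from the test name and are not shown anywhere in this log:

  # Hypothetical manual reproduction of test18_1+bv_type-vecs_nsize-2;
  # binary path and option spelling are assumptions, not taken from this log.
  mpirun -n 2 ./test18 -bv_type vecs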
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test18_1+bv_type-svec_nsize-2 # Error code: 14 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9921d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15660] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15664] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15663] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15664] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15663] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15660@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test19_1+nc-2_bv_type-svec_nsize-2 # Error code: 14 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15623] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8379a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15637] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15638] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15638] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15637] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15623@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test19_1+nc-2_bv_type-mat_nsize-1 # Error code: 14 not ok sys_classes_bv_tests-test18_1+bv_type-mat_nsize-1 # Error code: 14 # [sbuild:15682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15682] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f90610000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15697] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15697] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:15692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15692] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb5a83000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15698] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15698] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test19_1+nc-2_bv_type-mat_nsize-2 # Error code: 14 not ok sys_classes_bv_tests-test18_1+bv_type-mat_nsize-2 # Error code: 14 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbc196000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15726] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15732] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15733] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15732] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15733] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15726@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb85b9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15725] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15734] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15731] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15731] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15734] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-15725@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test2_1.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test2_2.counts not ok sys_classes_bv_tests-test2_2+bv_type-vecs # Error code: 14 not ok sys_classes_bv_tests-test2_1+bv_type-vecs # Error code: 14 # [sbuild:15792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15792] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb86f1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15797] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15797] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:15791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15791] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9d45f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15798] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15798] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test2_2 # SKIP Command failed so no diff ok sys_classes_bv_tests-test2_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test2_1+bv_type-contiguous # Error code: 14 # [sbuild:15826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15826] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbbcf6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15831] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15831] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test2_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test2_2+bv_type-contiguous # Error code: 14 # [sbuild:15825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15825] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbde75000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15832] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15832] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
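The harness reports results in TAP form: a "not ok <variant> # Error code: 14" line for each launcher failure, followed by an "ok <test> # SKIP Command failed so no diff" line that only records that no output comparison was possible. To gauge how widespread the failures are, the result lines can be tallied from a saved copy of this log; the file name below is a placeholder, and the pattern assumes the TAP result lines start at column 0 as they do in the raw output:

  # slepc-build.log is a placeholder for a saved copy of this log.
  grep -c '^not ok ' slepc-build.log                        # failing test invocations
  grep '^not ok ' slepc-build.log | awk '{print $3}' | sort -u   # distinct failing variants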
ok sys_classes_bv_tests-test2_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test2_1+bv_type-svec # Error code: 14 not ok sys_classes_bv_tests-test2_2+bv_type-svec # Error code: 14 # [sbuild:15854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15854] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb8399000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15866] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15866] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test2_1 # SKIP Command failed so no diff # [sbuild:15860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15860] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbba50000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15865] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15865] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test2_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test2_1+bv_type-mat # Error code: 14 not ok sys_classes_bv_tests-test2_2+bv_type-mat # Error code: 14 # [sbuild:15892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15892] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa2764000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15899] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15899] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test2_1 # SKIP Command failed so no diff # [sbuild:15894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15894] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbe805000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15900] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15900] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test2_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test2_3.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test3_1.counts not ok sys_classes_bv_tests-test2_3+bv_type-vecs # Error code: 14 not ok sys_classes_bv_tests-test3_1+bv_type-vecs # Error code: 14 # [sbuild:15953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15953] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa68f5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15959] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15959] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:15954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15954] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f86363000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15960] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15960] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test2_3 # SKIP Command failed so no diff ok sys_classes_bv_tests-test3_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_1+bv_type-contiguous # Error code: 14 not ok sys_classes_bv_tests-test2_3+bv_type-contiguous # Error code: 14 # [sbuild:15987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15987] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa94c0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15994] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15994] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:15988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15988] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa5ecf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15993] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15993] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test2_3 # SKIP Command failed so no diff ok sys_classes_bv_tests-test3_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test2_3+bv_type-svec # Error code: 14 not ok sys_classes_bv_tests-test3_1+bv_type-svec # Error code: 14 # [sbuild:16021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16021] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8535f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16028] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16028] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:16022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16022] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8862b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16027] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16027] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test2_3 # SKIP Command failed so no diff ok sys_classes_bv_tests-test3_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test2_3+bv_type-mat # Error code: 14 not ok sys_classes_bv_tests-test3_1+bv_type-mat # Error code: 14 # [sbuild:16055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16055] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa2bcf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16062] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16062] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test2_3 # SKIP Command failed so no diff # [sbuild:16056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16056] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f812de000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16061] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16061] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test3_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test3_1_svec_vecs.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test3_2.counts not ok sys_classes_bv_tests-test3_1_svec_vecs # Error code: 14 # [sbuild:16115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16115] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8a00e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16122] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16122] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
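[note added for context, not part of the captured log] The help text that keeps repeating above suggests a workaround: disable the gds/shmem2 component by setting PMIX_MCA_gds=hash in the environment. A minimal sketch of re-running one of the failing cases with that workaround applied; the test binary path and process count below are illustrative assumptions, only the environment variable and the -bv_type option come from this log:

  export PMIX_MCA_gds=hash                                        # use the hash gds component instead of shmem2, as the message suggests
  mpiexec -n 2 ./tests/sys/classes/bv/tests/test3 -bv_type vecs   # hypothetical path to the failing test binary; adjust to the build tree

Whether this actually avoids the riscv64 address-mismatch failure seen here is not shown by this log; the sketch only spells out the suggestion printed in the help text.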
ok sys_classes_bv_tests-test3_1_svec_vecs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test3_3.counts not ok sys_classes_bv_tests-test3_2+bv_type-vecs # Error code: 14 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8b38e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16123] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16116] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16121] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16123] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-16116@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test3_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_3+bv_type-vecs # Error code: 14 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16154] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f886e3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16158] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16157] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16158] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:16157] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-16154@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test3_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_2+bv_type-contiguous # Error code: 14 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8cb1a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16174] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16178] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16177] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16178] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:16177] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-16174@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test3_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_3+bv_type-contiguous # Error code: 14 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbf022000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16194] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16197] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16198] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16197] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:16198] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-16194@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test3_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_3+bv_type-svec # Error code: 14 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa617d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16234] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16237] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16238] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16238] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:16237] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-16234@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test3_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_2+bv_type-svec # Error code: 14 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa2b4f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16214] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16218] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16217] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16217] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:16218] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-16214@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test3_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_3+bv_type-mat # Error code: 14 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8d619000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16254] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16258] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16257] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16257] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:16258] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-16254@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test3_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test4_1.counts not ok sys_classes_bv_tests-test4_1+bv_type-vecs # Error code: 14 # [sbuild:16307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16307] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f7f80f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16310] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16310] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_2+bv_type-mat # Error code: 14 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16274] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9a5bf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16277] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16278] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16277] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16278] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-16274@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test3_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test4_1_vecs_vmip.counts not ok sys_classes_bv_tests-test4_1+bv_type-contiguous # Error code: 14 # [sbuild:16324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16324] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f83cf6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16327] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16327] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
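[note added for context, not part of the captured log] Besides the shared-memory message, each failure also prints a PMIx_Init error together with Open MPI's own suggestion to launch through "mpirun" so that a local PMIx server is available to the job. A sketch combining that suggestion with the PMIX_MCA_gds=hash workaround (the test binary path is again a hypothetical placeholder):

  PMIX_MCA_gds=hash mpirun -n 2 ./tests/sys/classes/bv/tests/test4 -bv_type vecs   # mpirun starts the PMIx server the message asks for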
ok sys_classes_bv_tests-test4_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test4_1_vecs_vmip # Error code: 14 # [sbuild:16352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16352] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa835a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16355] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16355] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_1_vecs_vmip # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test4_2.counts not ok sys_classes_bv_tests-test4_1+bv_type-svec # Error code: 14 # [sbuild:16369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16369] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f962e3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16372] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16372] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test4_2+bv_type-vecs # Error code: 14 # [sbuild:16400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16400] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fad382000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16406] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16406] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test4_1+bv_type-mat # Error code: 14 # [sbuild:16416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16416] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fad9c6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16419] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16419] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test5_1.counts not ok sys_classes_bv_tests-test4_2+bv_type-contiguous # Error code: 14 # [sbuild:16434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16434] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3facc78000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16440] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16440] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test5_1+bv_type-vecs # Error code: 14 # [sbuild:16463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16463] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa5134000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16466] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16466] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test5_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test4_2+bv_type-svec # Error code: 14 # [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9d914000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16489] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16489] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test5_1+bv_type-contiguous # Error code: 14 # [sbuild:16497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16497] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbca78000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16500] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16500] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_bv_tests-test5_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test4_2+bv_type-mat # Error code: 14 # [sbuild:16516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16516] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb9dd9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16529] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16529] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test5_2.counts not ok sys_classes_bv_tests-test5_1+bv_type-svec # Error code: 14 # [sbuild:16531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16531] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f95c69000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16534] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16534] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test5_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test5_1+bv_type-mat # Error code: 14 not ok sys_classes_bv_tests-test5_2+bv_type-vecs # Error code: 14 # [sbuild:16573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16573] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3faff55000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16580] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16580] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test5_1 # SKIP Command failed so no diff # [sbuild:16575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16575] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f92f27000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16581] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16581] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test5_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test6_1.counts not ok sys_classes_bv_tests-test5_2+bv_type-contiguous # Error code: 14 # [sbuild:16610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16610] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9d8b3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16625] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16625] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test5_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test6_1+bv_type-vecs # Error code: 14 # [sbuild:16624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16624] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9a270000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16628] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16628] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test6_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test5_2+bv_type-svec # Error code: 14 # [sbuild:16643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16643] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f80def000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16653] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16653] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test5_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test6_1+bv_type-contiguous # Error code: 14 # [sbuild:16659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16659] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb4501000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16662] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16662] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test6_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test6_1+bv_type-svec # Error code: 14 not ok sys_classes_bv_tests-test5_2+bv_type-mat # Error code: 14 # [sbuild:16678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16678] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa59e1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16694] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16694] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:16690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16690] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb3fe6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16696] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16696] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test6_1 # SKIP Command failed so no diff ok sys_classes_bv_tests-test5_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test6_2.counts not ok sys_classes_bv_tests-test6_1+bv_type-mat # Error code: 14 # [sbuild:16723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16723] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3faee25000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16740] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16740] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test6_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test6_2+bv_type-vecs # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test6_3.counts # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb2abf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16737] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16744] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16743] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16744] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16743] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-16737@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test6_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test6_3+bv_type-vecs # Error code: 14 # [sbuild:16787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16787] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb1f55000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16794] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16794] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test6_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test6_3+bv_type-contiguous # Error code: 14 # [sbuild:16812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16812] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f97c7a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16815] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16815] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test6_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test6_3+bv_type-svec # Error code: 14 # [sbuild:16829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16829] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb4db8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16832] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16832] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test6_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test6_2+bv_type-contiguous # Error code: 14 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16780] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9f5f4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16790] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16791] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16790] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16791] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-16780@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test6_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test6_3+bv_type-mat # Error code: 14 # [sbuild:16846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16846] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa6c09000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16849] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16849] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test6_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test7_1.counts not ok sys_classes_bv_tests-test7_1+bv_type-vecs # Error code: 14 # [sbuild:16896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16896] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3faeb89000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16899] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16899] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test6_2+bv_type-svec # Error code: 14 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f943d5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16864] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16861] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16865] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16864] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16865] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-16861@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test6_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_1+bv_type-contiguous # Error code: 14 # [sbuild:16913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16913] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fa0784000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16916] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16916] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test6_2+bv_type-mat # Error code: 14 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16928] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb44b1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16931] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16932] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16932] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
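The "Requested Address" / "Acquired Address" pairs describe the mechanism of the failure: gds/shmem2 wants its shared-memory segment mapped at the same fixed base address in every process, but the kernel treats that base only as a hint and places the mapping somewhere else, after which the component gives up. Notably, every requested base in this log (0x2aaa........) needs roughly 46 bits of virtual address, while every acquired base fits below 0x4000000000, the user-space ceiling of a riscv64 kernel using Sv39 paging. That this builder runs an Sv39 kernel is an assumption, but it would explain why the hint is never honored. A quick shell check with values copied from the log:

  # Hedged check: compare addresses from the log against the assumed Sv39
  # user-space limit of 0x4000000000 (256 GiB).
  [ $((0x2aaabc000000)) -lt $((0x4000000000)) ] || echo "requested base lies above the Sv39 user VA limit"
  [ $((0x3fa6c09000)) -lt $((0x4000000000)) ] && echo "acquired base fits within the Sv39 user VA limit"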
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16931] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-16928@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test6_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test7_1_mat.counts not ok sys_classes_bv_tests-test7_1+bv_type-svec # Error code: 14 # [sbuild:16946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16946] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa1827000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16967] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16967] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_1_mat+bv_type-vecs # Error code: 14 # [sbuild:16978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16978] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa7bf2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16981] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16981] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_1_mat # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_1+bv_type-mat # Error code: 14 # [sbuild:16995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16995] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9e388000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17008] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17008] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test7_2.counts not ok sys_classes_bv_tests-test7_1_mat+bv_type-contiguous # Error code: 14 # [sbuild:17012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17012] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb312b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17015] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17015] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test7_1_mat # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_1_mat+bv_type-svec # Error code: 14 # [sbuild:17056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17056] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fba220000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17063] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17063] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_1_mat # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_1_mat+bv_type-mat # Error code: 14 # [sbuild:17081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17081] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa2959000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17084] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17084] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_1_mat # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test7_2_mat.counts not ok sys_classes_bv_tests-test7_2+bv_type-vecs # Error code: 14 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17047] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fae313000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17060] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17059] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17059] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17060] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17047@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_2_mat+bv_type-vecs # Error code: 14 not ok sys_classes_bv_tests-test7_2+bv_type-contiguous # Error code: 14 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17113] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f84305000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17127] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17126] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17126] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17127] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17113@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_2_mat # SKIP Command failed so no diff # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb5525000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17123] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17131] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17130] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17131] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17130] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-17123@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_2_mat+bv_type-contiguous # Error code: 14 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fad3e5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17161] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17167] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17166] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17167] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17166] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17161@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_2_mat # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_2+bv_type-svec # Error code: 14 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8fe9c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17171] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17163] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17170] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17170] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17171] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-17163@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_2_mat+bv_type-svec # Error code: 14 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f8ca30000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17195] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17208] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17210] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17208] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
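Since every MPI-parallel case dies the same way before MPI_Init completes, the useful signal in this stretch of the log is simply which test variants were attempted: each failure is reported as "not ok <test>+<variant> # Error code: 14" (14 matching the prterun "Exit code: 14" lines), and the later "ok <test> # SKIP Command failed so no diff" only means the reference-output comparison was skipped, not that the test passed. Below is a small sketch for summarizing the failing variants from a saved copy of this log; "build.log" is a placeholder file name, not a file produced by this rebuild.

  # Hedged sketch: count distinct failing test variants in a saved build log.
  grep -oE 'not ok [A-Za-z0-9_+-]+' build.log | sort | uniq -c | sort -rn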
# -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17195@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:17195] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171 # ok sys_classes_bv_tests-test7_2_mat # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_2+bv_type-mat # Error code: 14 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f95a2f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17211] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17203] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17209] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17211] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17203@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:17203] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171 # ok sys_classes_bv_tests-test7_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test7_3.counts not ok sys_classes_bv_tests-test7_2_mat+bv_type-mat # Error code: 14 not ok sys_classes_bv_tests-test7_3+bv_type-vecs_bv_matmult-vecs # Error code: 14 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8e11a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17264] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17256] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17263] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17264] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17263] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-17256@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17239] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fae26f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17259] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17260] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17259] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17260] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17239@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_2_mat # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test8_1.counts not ok sys_classes_bv_tests-test8_1+bv_type-vecs # Error code: 14 # [sbuild:17313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17313] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17313] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fafff8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17316] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17316] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_bv_tests-test8_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test8_1+bv_type-contiguous # Error code: 14 # [sbuild:17334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17334] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb995f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17337] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17337] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test8_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_3+bv_type-vecs_bv_matmult-mat # Error code: 14 # [sbuild:17295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17295] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb7bd5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17310] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17310] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17295@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:17295] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 102 # ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test8_1+bv_type-svec # Error code: 14 # [sbuild:17351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17351] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f84483000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17366] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17366] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test8_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_3+bv_type-contiguous_bv_matmult-vecs # Error code: 14 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17363] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8cded000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17370] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17369] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17369] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17370] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17363@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test8_1+bv_type-mat # Error code: 14 not ok sys_classes_bv_tests-test7_3+bv_type-contiguous_bv_matmult-mat # Error code: 14 # [sbuild:17388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17388] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9d5b1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17404] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17404] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa49d9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17406] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17407] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17407] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:17406] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17400@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test8_1 # SKIP Command failed so no diff ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test8_2.counts not ok sys_classes_bv_tests-test8_2+bv_type-vecs # Error code: 14 # [sbuild:17450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17450] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f81218000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17457] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17457] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test8_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test8_2+bv_type-contiguous # Error code: 14 # [sbuild:17475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17475] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8e476000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17478] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17478] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test8_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_3+bv_type-svec_bv_matmult-vecs # Error code: 14 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17437] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8988b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17454] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17453] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17454] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17453] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17437@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test8_2+bv_type-svec # Error code: 14 # [sbuild:17492] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17492] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17492] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17492] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17492] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17492] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17492] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbad27000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17507] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17507] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test8_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test8_2+bv_type-mat # Error code: 14 # [sbuild:17529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17529] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f7fdb8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17532] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17532] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test8_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test8_3.counts not ok sys_classes_bv_tests-test7_3+bv_type-svec_bv_matmult-mat # Error code: 14 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3faf2f1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17510] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17506] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17511] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17510] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-17506@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test8_3+bv_type-vecs # Error code: 14 # [sbuild:17559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17559] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fadb65000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17562] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17562] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test8_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test8_3+bv_type-contiguous # Error code: 14 # [sbuild:17594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17594] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8ef86000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17599] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17599] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test8_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_3+bv_type-mat_bv_matmult-vecs # Error code: 14 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb3df0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17577] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17574] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17578] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17577] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17578] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-17574@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test8_3+bv_type-svec # Error code: 14 # [sbuild:17613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17613] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f82943000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17616] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17616] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test8_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test8_3+bv_type-mat # Error code: 14 not ok sys_classes_bv_tests-test7_3+bv_type-mat_bv_matmult-mat # Error code: 14 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17630] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f90778000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17642] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17641] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17642] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17641] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and no-------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # t able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17630@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff # [sbuild:17646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17646] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb336c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17649] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17649] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test8_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test9_1.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test9_1_svec_vecs.counts not ok sys_classes_bv_tests-test9_1+bv_type-vecs # Error code: 14 not ok sys_classes_bv_tests-test9_1_svec_vecs # Error code: 14 # [sbuild:17704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17704] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fad731000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17711] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17711] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:17705] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17705] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17705] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17705] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17705] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17705] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17705] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f86e7d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17710] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17710] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test9_1 # SKIP Command failed so no diff ok sys_classes_bv_tests-test9_1_svec_vecs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test9_2.counts not ok sys_classes_bv_tests-test9_1+bv_type-contiguous # Error code: 14 # [sbuild:17738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17738] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f87509000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17742] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17742] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_bv_tests-test9_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test9_2+bv_type-vecs # Error code: 14 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17757] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9dd64000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17773] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17772] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17772] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17773] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17757@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test9_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test9_1+bv_type-svec # Error code: 14 # [sbuild:17769] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17769] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17769] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17769] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17769] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17769] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17769] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa023b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17776] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17776] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_bv_tests-test9_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test9_1+bv_type-mat # Error code: 14 # [sbuild:17810] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17810] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17810] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17810] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17810] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17810] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17810] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8e8c2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17813] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17813] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test9_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test9_2_svec_vecs.counts not ok sys_classes_bv_tests-test9_2_svec_vecs # Error code: 14 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fba5b9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17844] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17847] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17848] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17848] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17847] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-17844@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test9_2_svec_vecs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test1_1.counts not ok sys_classes_bv_tests-test9_2+bv_type-contiguous # Error code: 14 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3faee59000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17792] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17797] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17798] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17797] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17798] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17792@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # not ok sys_classes_ds_tests-test1_1 # Error code: 14 ok sys_classes_bv_tests-test9_2 # SKIP Command failed so no diff # [sbuild:17877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17877] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa979a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17880] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17880] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_ds_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test1_2.counts not ok sys_classes_ds_tests-test1_2 # Error code: 14 # [sbuild:17919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17919] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8dfd5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17926] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17926] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_ds_tests-test1_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test12_1.counts not ok sys_classes_bv_tests-test9_2+bv_type-svec # Error code: 14 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3faf498000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17904] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17923] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17922] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17923] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17922] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17904@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test9_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test12_1 # Error code: 14 # [sbuild:17957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17957] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fac83b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17960] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17960] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test12_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test13_1.counts not ok sys_classes_bv_tests-test9_2+bv_type-mat # Error code: 14 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9ebae000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17982] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:18003] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:18002] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18002] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:18003] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17982@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test9_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test13_2.counts not ok sys_classes_ds_tests-test13_1 # Error code: 14 # [sbuild:18000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18000] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbea63000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18006] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18006] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_ds_tests-test13_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test15_1.counts not ok sys_classes_ds_tests-test13_2 # Error code: 14 # [sbuild:18054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18054] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8dc8c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18065] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18065] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_ds_tests-test13_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test16_1.counts not ok sys_classes_ds_tests-test15_1 # Error code: 14 # [sbuild:18063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18063] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb1420000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18068] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18068] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_ds_tests-test15_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test17_1.counts not ok sys_classes_ds_tests-test16_1 # Error code: 14 # [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb5d7e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18125] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18125] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test16_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test17_1 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test18_1.counts # [sbuild:18122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18122] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f88c53000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18128] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18128] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test17_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test18_2.counts not ok sys_classes_ds_tests-test18_1+nsize-1 # Error code: 14 # [sbuild:18176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18176] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9180d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18185] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18185] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test18_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test18_2+nsize-1 # Error code: 14 # [sbuild:18182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18182] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8cc5c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18188] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18188] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_ds_tests-test18_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test18_1+nsize-2 # Error code: 14 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18205] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa7218000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18220] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:18219] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18219] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18220] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-18205@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test18_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test18_1+nsize-3 # Error code: 14 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18244] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbf190000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18249] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:18247] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:18248] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18247] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18248] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18249] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [warn] Epoll MOD(4) on fd 20 failed. Old events were 6; read change was 2 (del); write change was 0 (none); close change was 0 (none): Bad file descriptor # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-18244@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test18_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test19_1.counts not ok sys_classes_ds_tests-test18_2+nsize-2 # Error code: 14 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f82f8d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18223] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18216] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:18224] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18224] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18223] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-18216@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_ds_tests-test18_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test19_1 # Error code: 14 # [sbuild:18280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18280] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f82967000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18295] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18295] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
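Note on the Requested/Acquired address pairs above: every requested base is of the form 0x2aaaXXXXXXXX, while every acquired mapping sits just below 0x4000000000. Assuming this riscv64 worker runs the kernel's default Sv39 layout, where user space ends at 0x3fffffffff (an assumption about the builder, not something recorded in this log), the requested hint lies far outside the usable address range, so the kernel cannot honor it and returns a different mapping, which is exactly the mismatch the gds/shmem2 component reports. A minimal POSIX-shell sketch of that comparison, using one pair taken from the log:

  # Hypothetical check, not part of the build: compare a requested base
  # from the advisory above against the assumed Sv39 user-space limit.
  echo $((0x2aaac8000000))   # requested base, copied from the log
  echo $((0x3fffffffff))     # assumed Sv39 user VA limit (256 GiB - 1)
  echo $((0x3fbf190000))     # acquired address, which fits below that limit

The same relation holds for every Requested/Acquired pair printed in this run.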
ok sys_classes_ds_tests-test19_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test2_1.counts not ok sys_classes_ds_tests-test18_2+nsize-3 # Error code: 14 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f97612000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:18298] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:18299] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18292] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:18300] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI 
job as well) # [sbuild:18299] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18298] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18300] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-18292@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_ds_tests-test18_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test2_2.counts not ok sys_classes_ds_tests-test2_1+ds_method-0 # Error code: 14 # [sbuild:18351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18351] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3faeab6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18361] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18361] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test2_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test2_2+ds_method-0 # Error code: 14 # [sbuild:18358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18358] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa1693000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18364] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18364] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test2_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test2_1+ds_method-1 # Error code: 14 # [sbuild:18380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18380] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fac215000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18395] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18395] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test2_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test2_2+ds_method-1 # Error code: 14 # [sbuild:18392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18392] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fae68f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18398] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18398] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_ds_tests-test2_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test2_1+ds_method-2 # Error code: 14 # [sbuild:18414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18414] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9849b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18432] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18432] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test2_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test2_2+ds_method-2 # Error code: 14 # [sbuild:18426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18426] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8070a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18431] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18431] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_ds_tests-test2_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test20_1.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test21_1.counts not ok sys_classes_ds_tests-test20_1 # Error code: 14 # [sbuild:18480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18480] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb2d5d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18489] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18489] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test20_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test21_1+nsize-1 # Error code: 14 # [sbuild:18486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18486] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb86b5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18492] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18492] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
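The advisory repeated throughout these failures already names the suggested workaround: setting PMIX_MCA_gds=hash in the environment, which should make PMIx select its hash gds component instead of shmem2. A minimal sketch of re-running one failing check with that variable set; only the variable name comes from the log, while the make invocation, the SLEPC_DIR variable and the search pattern are illustrative assumptions:

  # Hypothetical re-run outside this rebuild, with gds/shmem2 disabled.
  export PMIX_MCA_gds=hash
  make -C "$SLEPC_DIR" test search='sys_classes_ds_tests-test21*'

Whether this actually avoids the MPI_Init_thread aborts seen here would still need to be verified on the riscv64 worker itself.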
ok sys_classes_ds_tests-test21_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test21_2.counts not ok sys_classes_ds_tests-test21_1+nsize-2 # Error code: 14 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9f511000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18525] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:18536] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:18537] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18537] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18536] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-18525@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test21_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test21_2+nsize-1 # Error code: 14 # [sbuild:18534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18534] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb7984000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18540] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18540] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_ds_tests-test21_2 # SKIP Command failed so no diff # retrying sys_classes_ds_tests-test21_2+nsize-2 not ok sys_classes_ds_tests-test21_1+nsize-3 # Error code: 14 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fba671000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18576] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18558] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:18578] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:18579] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18576] Local 
abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18578] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18579] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-18558@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_ds_tests-test21_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test22_1.counts not ok sys_classes_ds_tests-test22_1 # Error code: 14 # [sbuild:18617] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18617] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18617] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18617] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18617] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18617] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18617] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8045c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18620] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18620] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test22_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test22_2.counts not ok sys_classes_ds_tests-test22_2 # Error code: 14 # [sbuild:18647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18647] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fa5ced000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18650] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18650] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test22_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test22_2_extrarow.counts not ok sys_classes_ds_tests-test22_2_extrarow # Error code: 14 # [sbuild:18677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18677] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb96f8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18680] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18680] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test22_2_extrarow # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test23_1.counts not ok sys_classes_ds_tests-test23_1 # Error code: 14 # [sbuild:18707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18707] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f93130000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18710] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18710] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test23_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test24_1.counts not ok sys_classes_ds_tests-test24_1 # Error code: 14 # [sbuild:18737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18737] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8c053000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18740] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18740] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test24_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test24_1_extrarow.counts not ok sys_classes_ds_tests-test24_1_extrarow # Error code: 14 # [sbuild:18767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18767] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8f616000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18770] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18770] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test24_1_extrarow # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test25_1.counts not ok sys_classes_ds_tests-test25_1 # Error code: 14 # [sbuild:18797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18797] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa1a4d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18800] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18800] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test25_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test26_1.counts not ok sys_classes_ds_tests-test26_1 # Error code: 14 # [sbuild:18827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18827] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f99c5d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18830] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18830] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test26_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test26_2.counts not ok sys_classes_ds_tests-test26_2 # Error code: 14 # [sbuild:18857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18857] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb6a0c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18860] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18860] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test26_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test26_3.counts not ok sys_classes_ds_tests-test26_3+reorthog-0 # Error code: 14 # [sbuild:18887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18887] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8e0ce000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18890] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18890] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test26_3 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test26_3+reorthog-1 # Error code: 14 # [sbuild:18904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18904] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f88bd2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18907] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18907] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test26_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test27_1.counts not ok sys_classes_ds_tests-test27_1 # Error code: 14 # [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f92ed3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18937] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18937] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
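Every failure above reports the same PMIx gds/shmem2 base-address mismatch and carries the same hint: disable the gds/shmem2 component by setting PMIX_MCA_gds=hash in the environment. A minimal sketch of re-running the test harness by hand with that workaround follows; only the PMIX_MCA_gds=hash setting comes from the log itself, while the make invocation and the installed-arch-linux2-c-opt architecture value are assumptions based on the TEST paths above.

    # Hedged sketch, not part of the original build: fall back to the PMIx "hash"
    # datastore component, as suggested by the help text in the log, before
    # re-running the SLEPc tests.
    export PMIX_MCA_gds=hash
    # Assumed invocation of the test harness in the build tree; the actual target
    # and PETSC_ARCH value used by the Debian package build may differ.
    make PETSC_ARCH=installed-arch-linux2-c-opt test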
ok sys_classes_ds_tests-test27_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test3_1.counts not ok sys_classes_ds_tests-test3_1+ds_method-0 # Error code: 14 # [sbuild:18964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18964] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8c2d4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18967] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18967] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_ds_tests-test3_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test3_1+ds_method-1 # Error code: 14 # [sbuild:18981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18981] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa00b6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18984] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18984] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_ds_tests-test3_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test3_1+ds_method-2 # Error code: 14 # [sbuild:19003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19003] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8bbc7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19010] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19010] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_ds_tests-test3_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test3_2.counts not ok sys_classes_ds_tests-test3_2+ds_method-0 # Error code: 14 # [sbuild:19037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19037] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbb6de000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19040] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19040] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test3_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test21_2+nsize-2 # Error code: 14 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:18985] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9a287000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:18988] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:18989] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18988] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:18989] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-18985@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test21_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test3_2+ds_method-1 # Error code: 14 # [sbuild:19054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19054] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3faab87000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19057] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19057] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test3_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test21_2+nsize-3 # Error code: 14 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f93a71000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19086] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19071] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:19088] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:19087] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as 
well) # [sbuild:19086] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19088] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-19071@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_ds_tests-test21_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test3_2+ds_method-2 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test4_1.counts # [sbuild:19084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19084] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fab270000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19091] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19091] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
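In the parallel variants (nsize-2 and nsize-3) each client rank hits the same mismatch and prterun then summarises the job as terminated with exit code 14. The requested addresses all lie around 0x2aaa00000000 while the acquired ones sit just below 0x4000000000, which suggests this riscv64 builder is running with sv39 paging, where user space ends at 0x4000000000, so the fixed hint used by gds/shmem2 can likely never be honoured here. A hypothetical check of the MMU mode is sketched below; it is an illustration only and does not appear anywhere in the log.

    # Hedged diagnostic sketch (not from the log): riscv64 kernels report the
    # paging mode in /proc/cpuinfo; "sv39" would mean user mappings cannot reach
    # the 0x2aaa... addresses requested by the gds/shmem2 component.
    grep -m1 mmu /proc/cpuinfo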
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test3_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test5_1.counts not ok sys_classes_ds_tests-test4_1 # Error code: 14 # [sbuild:19142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19142] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbf5a7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19152] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19152] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test4_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test5_2.counts not ok sys_classes_ds_tests-test5_1+ds_method-0 # Error code: 14 # [sbuild:19149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19149] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa8b65000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19155] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19155] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test5_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test5_1+ds_method-1 # Error code: 14 # [sbuild:19191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19191] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8920d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19199] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19199] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test5_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test5_2+ds_method-0 # Error code: 14 # [sbuild:19196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19196] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbf47b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19202] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19202] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_ds_tests-test5_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test5_1+ds_method-2 # Error code: 14 not ok sys_classes_ds_tests-test5_2+ds_method-1 # Error code: 14 # [sbuild:19223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19223] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9a9c8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19236] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19236] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:19230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19230] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbea6f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19234] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19234] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test5_1 # SKIP Command failed so no diff ok sys_classes_ds_tests-test5_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test6_1.counts not ok sys_classes_ds_tests-test6_1+ds_method-0 # Error code: 14 not ok sys_classes_ds_tests-test5_2+ds_method-2 # Error code: 14 # [sbuild:19277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19277] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19277] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb9ac3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19283] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19283] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test6_1 # SKIP Command failed so no diff # [sbuild:19264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19264] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8ece1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19280] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19280] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test5_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test6_2.counts not ok sys_classes_ds_tests-test6_1+ds_method-1 # Error code: 14 # [sbuild:19310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19310] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f82218000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19325] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19325] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test6_1 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test6_2+ds_method-0 # Error code: 14 # [sbuild:19327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19327] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f91ed2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19330] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19330] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test6_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test6_1+ds_method-2 # Error code: 14 # [sbuild:19346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19346] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3faa57c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19361] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19361] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test6_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test7_1.counts not ok sys_classes_ds_tests-test6_2+ds_method-1 # Error code: 14 # [sbuild:19358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19358] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8e933000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19364] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19364] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_ds_tests-test6_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test6_2+ds_method-2 # Error code: 14 not ok sys_classes_ds_tests-test7_1+ds_method-0 # Error code: 14 # [sbuild:19401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19401] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f8ebf4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19408] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19408] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test6_2 # SKIP Command failed so no diff # [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa6a76000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19411] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19411] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test7_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test7_2.counts not ok sys_classes_ds_tests-test7_1+ds_method-1 # Error code: 14 # [sbuild:19440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19440] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbf65c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19454] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19454] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test7_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test8_1.counts not ok sys_classes_ds_tests-test7_2+ds_method-0 # Error code: 14 # [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbc784000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19458] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19458] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test7_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test7_2+ds_method-1 # Error code: 14 not ok sys_classes_ds_tests-test8_1+ds_method-0 # Error code: 14 # [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb36e1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19505] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19505] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:19496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19496] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb7e19000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19503] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19503] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_ds_tests-test8_1 # SKIP Command failed so no diff ok sys_classes_ds_tests-test7_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test8_2.counts not ok sys_classes_ds_tests-test8_1+ds_method-1 # Error code: 14 # [sbuild:19532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19532] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa67c9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19547] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19547] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test8_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test8_3.counts not ok sys_classes_ds_tests-test8_2+ds_method-0 # Error code: 14 # [sbuild:19549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19549] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f85037000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19552] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19552] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test8_2 # SKIP Command failed so no diff not ok sys_classes_ds_tests-test8_2+ds_method-1 # Error code: 14 not ok sys_classes_ds_tests-test8_3+ds_method-0 # Error code: 14 # [sbuild:19591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19591] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb4ad7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19599] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19599] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:19593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19593] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb29bc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19598] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19598] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_ds_tests-test8_3 # SKIP Command failed so no diff ok sys_classes_ds_tests-test8_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test9_1.counts not ok sys_classes_ds_tests-test8_3+ds_method-1 # Error code: 14 # [sbuild:19626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19626] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f95ce5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19641] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19641] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_ds_tests-test8_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test1_1.counts not ok sys_classes_ds_tests-test9_1 # Error code: 14 # [sbuild:19643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19643] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9cc29000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19646] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19646] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_ds_tests-test9_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test10_1.counts not ok sys_classes_fn_tests-test1_1 # Error code: 14 # [sbuild:19689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19689] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f966d3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19703] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19703] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test11_1.counts not ok sys_classes_fn_tests-test10_1 # Error code: 14 # [sbuild:19702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19702] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f93fb7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19706] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19706] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_fn_tests-test10_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test11_2.counts not ok sys_classes_fn_tests-test11_1 # Error code: 14 # [sbuild:19752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19752] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb29a2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19763] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19763] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test11_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test11_2 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test12_1.counts # [sbuild:19761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19761] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8ec0a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19766] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19766] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test11_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test12_1_rational.counts not ok sys_classes_fn_tests-test12_1+fn_type-exp # Error code: 14 # [sbuild:19813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19813] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa8fb1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19823] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19823] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test12_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test12_1_rational # Error code: 14 # [sbuild:19820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19820] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f994da000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19826] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19826] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test12_1_rational # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test13_1.counts not ok sys_classes_fn_tests-test12_1+fn_type-sqrt # Error code: 14 # [sbuild:19842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19842] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9a062000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19863] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19863] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test12_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test13_1_triang.counts not ok sys_classes_fn_tests-test13_1 # Error code: 14 # [sbuild:19870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19870] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f97d92000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19873] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19873] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_fn_tests-test13_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test2_1.counts not ok sys_classes_fn_tests-test13_1_triang # Error code: 14 # [sbuild:19903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19903] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f960b5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19923] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19923] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_fn_tests-test13_1_triang # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_1.counts not ok sys_classes_fn_tests-test2_1 # Error code: 14 # [sbuild:19930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19930] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa107b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19933] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19933] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test2_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_1_subdiagonalpade.counts not ok sys_classes_fn_tests-test3_1+fn_method-0 # Error code: 14 # [sbuild:19967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19967] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fbcbf8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19984] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19984] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test3_1_subdiagonalpade+fn_method-2 # Error code: 14 # [sbuild:19990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:19990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:19990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:19990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:19990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:19990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:19990] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f95781000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:19993] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:19993] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_fn_tests-test3_1_subdiagonalpade # SKIP Command failed so no diff not ok sys_classes_fn_tests-test3_1+fn_method-1 # Error code: 14 # [sbuild:20008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20008] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb6c04000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20013] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20013] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test3_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_2.counts not ok sys_classes_fn_tests-test3_1_subdiagonalpade+fn_method-3 # Error code: 14 # [sbuild:20024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20024] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3facc51000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20027] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20027] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test3_1_subdiagonalpade # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_2_subdiagonalpade.counts not ok sys_classes_fn_tests-test3_2+fn_method-0 # Error code: 14 # [sbuild:20060] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20060] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20060] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20060] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20060] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20060] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20060] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa7c4b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20079] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20079] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_2 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test3_2_subdiagonalpade+fn_method-2 # Error code: 14 # [sbuild:20084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20084] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8e9e5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20087] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20087] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test3_2_subdiagonalpade # SKIP Command failed so no diff not ok sys_classes_fn_tests-test3_2+fn_method-1 # Error code: 14 # [sbuild:20103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20103] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9186b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20109] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20109] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_3.counts not ok sys_classes_fn_tests-test3_2_subdiagonalpade+fn_method-3 # Error code: 14 # [sbuild:20118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20118] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9d50f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20121] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20121] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_fn_tests-test3_2_subdiagonalpade # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_3_subdiagonalpade.counts not ok sys_classes_fn_tests-test3_3+fn_method-0 # Error code: 14 # [sbuild:20155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20155] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9349c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20173] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20173] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_3 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test3_3_subdiagonalpade+fn_method-2 # Error code: 14 # [sbuild:20178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20178] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9e891000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20181] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20181] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test3_3_subdiagonalpade # SKIP Command failed so no diff not ok sys_classes_fn_tests-test3_3+fn_method-1 # Error code: 14 # [sbuild:20197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20197] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb5170000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20203] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20203] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_4.counts not ok sys_classes_fn_tests-test3_3_subdiagonalpade+fn_method-3 # Error code: 14 # [sbuild:20212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20212] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f85d39000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20215] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20215] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_fn_tests-test3_3_subdiagonalpade # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_4_subdiagonalpade.counts not ok sys_classes_fn_tests-test3_4+fn_method-0 # Error code: 14 # [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa7062000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20270] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20270] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_4 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test3_4_subdiagonalpade+fn_method-2 # Error code: 14 # [sbuild:20272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20272] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9d292000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20275] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20275] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_4_subdiagonalpade # SKIP Command failed so no diff not ok sys_classes_fn_tests-test3_4+fn_method-1 # Error code: 14 # [sbuild:20291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20291] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fa6189000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20306] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20306] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_5.counts not ok sys_classes_fn_tests-test3_4_subdiagonalpade+fn_method-3 # Error code: 14 # [sbuild:20305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20305] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa1e45000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20309] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20309] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_fn_tests-test3_4_subdiagonalpade # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_6.counts not ok sys_classes_fn_tests-test3_5+fn_method-2 # Error code: 14 # [sbuild:20355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20355] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9d482000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20366] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20366] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_5 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test3_6+fn_method-2 # Error code: 14 # [sbuild:20364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20364] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f82269000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20369] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20369] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_6 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test3_5+fn_method-3 # Error code: 14 # [sbuild:20385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20385] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9908d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20400] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20400] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_5 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test3_6+fn_method-3 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test4_1.counts # [sbuild:20397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20397] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8ae60000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20403] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20403] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test3_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test5_1.counts not ok sys_classes_fn_tests-test4_1 # Error code: 14 # [sbuild:20450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20450] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f93e1b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20460] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20460] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test4_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test5_1 # Error code: 14 # [sbuild:20457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20457] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8f3fd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20463] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20463] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test5_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test5_2.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test6_1.counts not ok sys_classes_fn_tests-test5_2 # Error code: 14 not ok sys_classes_fn_tests-test6_1 # Error code: 14 # [sbuild:20514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20514] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f815ce000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20521] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20521] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:20517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20517] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa25cb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20523] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20523] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test6_1 # SKIP Command failed so no diff ok sys_classes_fn_tests-test5_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test6_2.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_1.counts not ok sys_classes_fn_tests-test6_2 # Error code: 14 not ok sys_classes_fn_tests-test7_1+fn_method-0 # Error code: 14 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb9949000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20583] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20583] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f84258000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20582] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20582] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7_1 # SKIP Command failed so no diff ok sys_classes_fn_tests-test6_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_1_sadeghi.counts not ok sys_classes_fn_tests-test7_1+fn_method-1 # Error code: 14 # [sbuild:20610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20610] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f884d2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20625] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20625] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test7_1_sadeghi # Error code: 14 # [sbuild:20627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20627] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb8be9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20630] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20630] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_fn_tests-test7_1_sadeghi # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_2.counts not ok sys_classes_fn_tests-test7_1+fn_method-2 # Error code: 14 # [sbuild:20646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20646] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa21c2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20672] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20672] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_2_sadeghi.counts not ok sys_classes_fn_tests-test7_2+fn_method-0 # Error code: 14 # [sbuild:20674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20674] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f958d3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20677] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20677] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7_2 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test7_2_sadeghi # Error code: 14 not ok sys_classes_fn_tests-test7_2+fn_method-1 # Error code: 14 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa329e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20723] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20723] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test7_2_sadeghi # SKIP Command failed so no diff # [sbuild:20717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20717] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fac409000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20724] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20724] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test7_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_3.counts not ok sys_classes_fn_tests-test7_2+fn_method-2 # Error code: 14 # [sbuild:20752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20752] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3faddb7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20768] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20768] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_fn_tests-test7_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_3_inplace.counts not ok sys_classes_fn_tests-test7_3 # Error code: 14 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f96114000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20765] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:20771] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:20772] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:20773] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to 
execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20772] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20771] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20773] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-20765@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_fn_tests-test7_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test8_1.counts not ok sys_classes_fn_tests-test7_3_inplace # Error code: 14 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9802e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20834] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20824] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:20835] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20834] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20835] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-20824@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_fn_tests-test7_3_inplace # SKIP Command failed so no diff not ok sys_classes_fn_tests-test8_1+fn_method-0 # Error code: 14 # [sbuild:20831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20831] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbad0f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20839] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20839] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test8_2.counts not ok sys_classes_fn_tests-test8_1+fn_method-1 # Error code: 14 # [sbuild:20874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20874] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9f76c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20885] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20885] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test8_2+fn_method-0 # Error code: 14 # [sbuild:20883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20883] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbb314000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20888] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20888] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test8_2 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test8_1+fn_method-2 # Error code: 14 not ok sys_classes_fn_tests-test8_2+fn_method-1 # Error code: 14 # [sbuild:20916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20916] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbeef9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20922] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20922] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:20904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20904] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa25b5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20921] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20921] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_1 # SKIP Command failed so no diff ok sys_classes_fn_tests-test8_2 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test8_2+fn_method-2 # Error code: 14 not ok sys_classes_fn_tests-test8_1+fn_method-3 # Error code: 14 # [sbuild:20950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20950] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa51e5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20956] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20956] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_1 # SKIP Command failed so no diff # [sbuild:20949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20949] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb4561000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20955] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20955] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test9_1.counts not ok sys_classes_fn_tests-test8_2+fn_method-3 # Error code: 14 # [sbuild:20985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20985] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9856d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21000] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21000] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test1_1.counts not ok sys_classes_fn_tests-test9_1 # Error code: 14 # [sbuild:20997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20997] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb60af000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21003] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21003] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test9_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test1_2.counts not ok sys_classes_rg_tests-test1_1 # Error code: 14 # [sbuild:21050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21050] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9d983000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21060] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21060] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
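One pattern worth noting in the failures above: every "Requested Address" falls in the 0x2aaaac000000-0x2aaacc000000 range, while every "Acquired Address" lies just below 0x4000000000 (256 GiB). A plausible but unconfirmed reading is that this riscv64 builder runs with Sv39 paging, whose user address space ends at 0x3fffffffff, so the fixed base addresses requested by gds/shmem2 simply cannot be mapped and the kernel hands back a lower one. A quick check of the MMU mode on such a host, assuming the riscv64 /proc/cpuinfo exposes its usual "mmu" field, could be:

    # Sketch: inspect the paging mode reported by the kernel (sv39, sv48 or sv57).
    grep -m1 mmu /proc/cpuinfo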
ok sys_classes_rg_tests-test1_1 # SKIP Command failed so no diff not ok sys_classes_rg_tests-test1_2 # Error code: 14 # [sbuild:21057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21057] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fba8fa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21063] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21063] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test1_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test2_1.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test2_2.counts not ok sys_classes_rg_tests-test2_1 # Error code: 14 # [sbuild:21111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21111] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb59c1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21120] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21120] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test2_1 # SKIP Command failed so no diff not ok sys_classes_rg_tests-test2_2 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test2_3.counts # [sbuild:21117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21117] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21117] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f8be36000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21123] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21123] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_rg_tests-test2_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test2_4.counts not ok sys_classes_rg_tests-test2_3 # Error code: 14 # [sbuild:21170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21170] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fa0ca0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21180] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21180] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_rg_tests-test2_3 # SKIP Command failed so no diff not ok sys_classes_rg_tests-test2_4 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test2_5.counts # [sbuild:21177] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21177] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21177] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21177] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21177] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21177] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21177] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f84d07000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21183] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21183] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_rg_tests-test2_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_1.counts not ok sys_classes_rg_tests-test2_5 # Error code: 14 # [sbuild:21230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21230] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f93ffc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21240] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21240] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test2_5 # SKIP Command failed so no diff not ok sys_classes_rg_tests-test3_1 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_1_ellipse.counts # [sbuild:21237] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21237] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21237] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21237] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21237] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21237] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21237] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9fb39000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21243] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21243] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test3_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_1_interval.counts not ok sys_classes_rg_tests-test3_1_ellipse # Error code: 14 # [sbuild:21290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21290] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9fc11000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21300] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21300] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test3_1_ellipse # SKIP Command failed so no diff not ok sys_classes_rg_tests-test3_1_interval # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_1_ring.counts # [sbuild:21297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21297] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21297] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9bf85000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21303] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21303] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
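The failure pattern above repeats for every test in this group: the PMIx gds/shmem2 component cannot attach its shared-memory segment at the requested base address, PMIx_Init then fails, and the test binary aborts before MPI_Init_thread completes. The error text itself names a possible workaround, switching PMIx to the hash datastore. A minimal sketch of how that could be tried, assuming the variable is exported in the environment from which the test suite is launched (only the variable and value come from the log; re-running the tests afterwards is implied, not shown):

# Workaround suggested by the PMIx error text above: bypass gds/shmem2
# and use the hash datastore instead. Exporting it in the shell that
# launches the SLEPc test suite is an assumption about how the test
# processes inherit their environment.
export PMIX_MCA_gds=hash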
ok sys_classes_rg_tests-test3_1_interval # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_2.counts not ok sys_classes_rg_tests-test3_1_ring # Error code: 14 # [sbuild:21350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21350] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f8371b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21360] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21360] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test3_1_ring # SKIP Command failed so no diff not ok sys_classes_rg_tests-test3_2 # Error code: 14 # [sbuild:21357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21357] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21357] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa64dd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21363] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21363] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_rg_tests-test3_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_2_interval.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_4_ellipse.counts not ok sys_classes_rg_tests-test3_2_interval # Error code: 14 # [sbuild:21411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21411] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8cedf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21420] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21420] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test3_2_interval # SKIP Command failed so no diff not ok sys_classes_rg_tests-test3_4_ellipse # Error code: 14 # [sbuild:21417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21417] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f89364000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21423] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21423] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test3_4_ellipse # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_4_interval.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_4_ring.counts not ok sys_classes_rg_tests-test3_4_interval # Error code: 14 # [sbuild:21472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21472] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb544d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21480] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21480] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test3_4_interval # SKIP Command failed so no diff not ok sys_classes_rg_tests-test3_4_ring # Error code: 14 # [sbuild:21477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21477] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa5a78000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21483] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21483] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_rg_tests-test3_4_ring # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test1_1.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test2_1.counts not ok sys_classes_st_tests-test1_1+st_matmode-inplace # Error code: 14 not ok sys_classes_st_tests-test2_1+st_matmode-copy # Error code: 14 # [sbuild:21534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21534] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3faae74000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21541] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21541] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_st_tests-test1_1 # SKIP Command failed so no diff # [sbuild:21537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21537] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb5c8f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21543] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21543] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_st_tests-test2_1 # SKIP Command failed so no diff not ok sys_classes_st_tests-test1_1+st_matmode-shell # Error code: 14 not ok sys_classes_st_tests-test2_1+st_matmode-inplace # Error code: 14 # [sbuild:21571] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21571] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21571] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21571] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21571] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21571] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21571] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f96afb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21577] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21577] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_st_tests-test2_1 # SKIP Command failed so no diff # [sbuild:21569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21569] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f96166000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21576] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21576] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_st_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test3_1.counts not ok sys_classes_st_tests-test2_1+st_matmode-shell # Error code: 14 # [sbuild:21605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21605] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f82b5d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21621] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21621] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_st_tests-test2_1 # SKIP Command failed so no diff not ok sys_classes_st_tests-test3_1+st_matmode-copy # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test4_1.counts # [sbuild:21618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21618] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb9540000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21624] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21624] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_st_tests-test3_1 # SKIP Command failed so no diff not ok sys_classes_st_tests-test3_1+st_matmode-inplace # Error code: 14 # [sbuild:21659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21659] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8fb89000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21668] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21668] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_st_tests-test3_1 # SKIP Command failed so no diff not ok sys_classes_st_tests-test4_1+st_matmode-copy # Error code: 14 # [sbuild:21665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21665] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3faee2b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21671] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21671] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_st_tests-test4_1 # SKIP Command failed so no diff not ok sys_classes_st_tests-test4_1+st_matmode-shell # Error code: 14 # [sbuild:21699] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21699] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21699] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21699] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21699] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21699] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21699] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f96da5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21704] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21704] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! not ok sys_classes_st_tests-test3_1+st_matmode-shell # Error code: 14 ok sys_classes_st_tests-test4_1 # SKIP Command failed so no diff # [sbuild:21695] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21695] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21695] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21695] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21695] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21695] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21695] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f92c41000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21705] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21705] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_st_tests-test3_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test4_2.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test5_1.counts not ok sys_classes_st_tests-test4_2+st_matmode-copy # Error code: 14 not ok sys_classes_st_tests-test5_1+st_matmode-copy # Error code: 14 # [sbuild:21759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21759] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f93476000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21765] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21765] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:21757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21757] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f859cb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21764] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21764] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_st_tests-test5_1 # SKIP Command failed so no diff ok sys_classes_st_tests-test4_2 # SKIP Command failed so no diff not ok sys_classes_st_tests-test4_2+st_matmode-shell # Error code: 14 not ok sys_classes_st_tests-test5_1+st_matmode-inplace # Error code: 14 # [sbuild:21792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21792] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8e2b4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21799] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21799] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:21793] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21793] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21793] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21793] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21793] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21793] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21793] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa7fdd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21798] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21798] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_st_tests-test4_2 # SKIP Command failed so no diff ok sys_classes_st_tests-test5_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test5_1_shell.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test6_1_st_matmode-copy.counts not ok sys_classes_st_tests-test6_1_st_matmode-copy # Error code: 14 not ok sys_classes_st_tests-test5_1_shell # Error code: 14 # [sbuild:21853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21853] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8a402000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21858] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21858] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_st_tests-test6_1_st_matmode-copy # SKIP Command failed so no diff # [sbuild:21852] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21852] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21852] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21852] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21852] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21852] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21852] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f95b12000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21859] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21859] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_st_tests-test5_1_shell # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test6_1_st_matmode-inplace.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test6_1_st_matmode-shell.counts not ok sys_classes_st_tests-test6_1_st_matmode-inplace # Error code: 14 not ok sys_classes_st_tests-test6_1_st_matmode-shell # Error code: 14 # [sbuild:21912] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21912] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21912] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21912] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21912] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21912] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21912] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f87ccf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21918] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21918] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:21913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21913] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21913] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb3e96000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21919] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21919] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
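The failure mode repeated above is always the same: PMIx's gds/shmem2 component requests its shared-memory segment at a fixed base address (for example 0x2aaab0000000), the kernel maps it somewhere else, and MPI_Init_thread then aborts before initialization completes, so every affected test exits with code 14 and is reported as SKIP. The error text itself names a workaround, disabling gds/shmem2 so PMIx falls back to its hash-based store. A minimal sketch of that workaround in shell, assuming the tests are simply re-invoked afterwards (the 'make check' target is an assumption for illustration; the actual command driving these tests is not shown at this point in the log):

    # Disable the gds/shmem2 component, as the PMIX error text above suggests,
    # so PMIx uses its hash-based key/value store instead of shared memory.
    export PMIX_MCA_gds=hash
    # Hypothetical re-run of the failing tests; replace with the real test command.
    make check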
ok sys_classes_st_tests-test6_1_st_matmode-shell # SKIP Command failed so no diff ok sys_classes_st_tests-test6_1_st_matmode-inplace # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test7_1_st_type-cayley.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test7_1_st_type-shift.counts not ok sys_classes_st_tests-test7_1_st_type-shift # Error code: 14 not ok sys_classes_st_tests-test7_1_st_type-cayley # Error code: 14 # [sbuild:21971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21971] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f94b21000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21978] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21978] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_st_tests-test7_1_st_type-cayley # SKIP Command failed so no diff # [sbuild:21973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21973] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbf00a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21979] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21979] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_st_tests-test7_1_st_type-shift # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test7_1_st_type-sinvert.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test8_1_st_type-cayley.counts not ok sys_classes_st_tests-test7_1_st_type-sinvert # Error code: 14 not ok sys_classes_st_tests-test8_1_st_type-cayley # Error code: 14 # [sbuild:22031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22031] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa2eea000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22038] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22038] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:22033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22033] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa7db8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22039] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22039] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_st_tests-test7_1_st_type-sinvert # SKIP Command failed so no diff ok sys_classes_st_tests-test8_1_st_type-cayley # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test8_1_st_type-shift.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test8_1_st_type-sinvert.counts not ok sys_classes_st_tests-test8_1_st_type-sinvert # Error code: 14 not ok sys_classes_st_tests-test8_1_st_type-shift # Error code: 14 # [sbuild:22093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22093] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fadb35000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22099] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22099] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_st_tests-test8_1_st_type-sinvert # SKIP Command failed so no diff # [sbuild:22092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22092] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3faa43e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22098] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22098] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_st_tests-test8_1_st_type-shift # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test9_1_st_type-shift.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test9_1_st_type-sinvert.counts not ok sys_classes_st_tests-test9_1_st_type-sinvert # Error code: 14 not ok sys_classes_st_tests-test9_1_st_type-shift # Error code: 14 # [sbuild:22152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22152] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f921ad000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22158] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22158] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:22153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22153] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb9de4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22159] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22159] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_st_tests-test9_1_st_type-sinvert # SKIP Command failed so no diff ok sys_classes_st_tests-test9_1_st_type-shift # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_mat_tests-test1_1.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_mat_tests-test1_2.counts not ok sys_mat_tests-test1_1 # Error code: 14 # [sbuild:22213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22213] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8dd0d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22218] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22218] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_mat_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_tests-test1_1.counts not ok sys_tests-test1_1 # Error code: 14 # [sbuild:22251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22251] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22251] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa7a94000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22254] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22254] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_tests-test3_arpack.counts not ok sys_mat_tests-test1_2 # Error code: 14 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbed08000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22212] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:22219] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:22220] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22220] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22219] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-22212@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_mat_tests-test1_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_tests-test3_no-primme.counts not ok sys_tests-test3_arpack # Error code: 14 # [sbuild:22296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22296] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa659c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22309] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22309] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_tests-test3_arpack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_tests-test4_1.counts not ok sys_tests-test3_no-primme # Error code: 14 # [sbuild:22308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22308] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9dc70000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22312] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22312] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_tests-test3_no-primme # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_tests-test4_2.counts not ok sys_tests-test4_1 # Error code: 14 # [sbuild:22358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22358] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f98abb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22369] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22369] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_tests-test4_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_tutorials-ex33_1.counts not ok sys_tests-test4_2 # Error code: 14 # [sbuild:22367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22367] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbe5b5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22372] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22372] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
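A second remedy appears in the same banners: "Open MPI requires access to a local PMIx server to execute. Please ensure that either you are operating in a PMIx-enabled environment, or use 'mpirun' to execute the job." A minimal sketch of that alternative, with './test4' and the rank count as placeholders (the real executable paths are not printed in this part of the log): launching through mpirun lets Open MPI's own runtime supply the PMIx server instead of expecting one from the environment.

    # Launch the MPI test under mpirun so the runtime itself provides a PMIx server;
    # './test4' and '-n 2' are placeholders for the actual binary and process count.
    mpirun -n 2 ./test4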
ok sys_tests-test4_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_vec_tests-test1_1.counts not ok sys_tutorials-ex33_1 # Error code: 14 # [sbuild:22419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22419] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f84b28000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22429] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22429] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_tutorials-ex33_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_vec_tests-test1_2.counts not ok sys_vec_tests-test1_1 # Error code: 14 # [sbuild:22426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22426] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f928c4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22432] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22432] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_vec_tests-test1_1 # SKIP Command failed so no diff RM test-rm-sys.F90 TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1.counts not ok sys_vec_tests-test1_2 # Error code: 14 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22479] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb36f6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22490] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:22491] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22491] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22490] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-22479@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_vec_tests-test1_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_subspace.counts not ok eps_tests-test1_1+eps_type-krylovschur # Error code: 14 # [sbuild:22489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22489] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8e5fb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22494] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22494] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test1_1 # SKIP Command failed so no diff not ok eps_tests-test1_1+eps_type-arnoldi # Error code: 14 not ok eps_tests-test1_1_subspace # Error code: 14 # [sbuild:22533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22533] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f84b51000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22540] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22540] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_1 # SKIP Command failed so no diff # [sbuild:22537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22537] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22537] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa4341000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22543] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22543] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test1_1_subspace # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ks_nopurify.counts not ok eps_tests-test1_1+eps_type-gd # Error code: 14 # [sbuild:22569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22569] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb0515000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22587] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22587] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test1_1 # SKIP Command failed so no diff not ok eps_tests-test1_1_ks_nopurify # Error code: 14 # [sbuild:22584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22584] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa069c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22590] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22590] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test1_1_ks_nopurify # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ks_trueres.counts not ok eps_tests-test1_1+eps_type-jd # Error code: 14 # [sbuild:22606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22606] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8cc13000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22627] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22627] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_1 # SKIP Command failed so no diff not ok eps_tests-test1_1_ks_trueres # Error code: 14 # [sbuild:22634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22634] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9d092000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22637] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22637] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_1_ks_trueres # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ks_sinvert.counts not ok eps_tests-test1_1+eps_type-lapack # Error code: 14 # [sbuild:22651] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22651] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22651] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22651] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22651] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22651] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22651] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbd12e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22661] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22661] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ks_cayley.counts not ok eps_tests-test1_1_ks_sinvert # Error code: 14 # [sbuild:22681] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22681] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22681] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22681] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22681] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22681] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22681] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8c88c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22684] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22684] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test1_1_ks_sinvert # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_lanczos.counts not ok eps_tests-test1_1_ks_cayley # Error code: 14 # [sbuild:22713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22713] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9dc4d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22724] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22724] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_1_ks_cayley # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_gd2.counts not ok eps_tests-test1_1_lanczos # Error code: 14 # [sbuild:22741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22741] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9c589000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22744] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22744] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_1_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_gd_borth.counts not ok eps_tests-test1_1_gd2 # Error code: 14 # [sbuild:22773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22773] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbf4d6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22782] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22782] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_1_gd2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_jd_borth.counts not ok eps_tests-test1_1_gd_borth # Error code: 14 # [sbuild:22801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22801] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb7846000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22804] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22804] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_1_gd_borth # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_lobpcg.counts not ok eps_tests-test1_1_jd_borth # Error code: 14 # [sbuild:22833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22833] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f81f53000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22842] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22842] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_1_jd_borth # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_cholesky.counts not ok eps_tests-test1_1_lobpcg # Error code: 14 # [sbuild:22861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22861] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8f19b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22864] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22864] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test1_1_lobpcg # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_scalapack.counts not ok eps_tests-test1_1_cholesky # Error code: 14 # [sbuild:22893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22893] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f99126000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22901] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22901] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_1_cholesky # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ciss.counts not ok eps_tests-test1_1_scalapack+nsize-1 # Error code: 14 # [sbuild:22921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22921] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa8797000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22924] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22924] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_1_scalapack # SKIP Command failed so no diff not ok eps_tests-test1_1_ciss+eps_ciss_extraction-ritz # Error code: 14 # [sbuild:22953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22953] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f95199000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22960] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22960] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_1_ciss # SKIP Command failed so no diff not ok eps_tests-test1_1_ciss+eps_ciss_extraction-hankel # Error code: 14 # [sbuild:22990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22990] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9ced0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22993] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22993] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_1_ciss # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ciss_ksps.counts not ok eps_tests-test1_1_scalapack+nsize-2 # Error code: 14 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa240b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22968] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:22972] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:22971] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22972] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22971] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-22968@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test1_1_scalapack # SKIP Command failed so no diff not ok eps_tests-test1_1_ciss_ksps # Error code: 14 # [sbuild:23020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23020] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fabff5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23023] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23023] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_1_ciss_ksps # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ciss_gnhep.counts not ok eps_tests-test1_1_ciss_gnhep # Error code: 14 # [sbuild:23073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23073] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8666a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23076] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23076] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_1_ciss_gnhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ciss_trapezoidal.counts not ok eps_tests-test1_1_scalapack+nsize-3 # Error code: 14 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa23d3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23039] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23038] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23035] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23040] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in 
this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23039] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23038] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23040] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-23035@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test1_1_scalapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_2.counts not ok eps_tests-test1_1_ciss_trapezoidal # Error code: 14 # [sbuild:23106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23106] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8a8d9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23124] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23124] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_1_ciss_trapezoidal # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_2_open.counts not ok eps_tests-test1_2 # Error code: 14 # [sbuild:23131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23131] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9064b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23134] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23134] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_2_parallel.counts not ok eps_tests-test1_2_open # Error code: 14 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb4395000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23184] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23184] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test1_2_open # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_3.counts not ok eps_tests-test1_2_parallel # Error code: 14 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f81957000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23191] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23194] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23195] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23196] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23196] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23194] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23195] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-23191@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tests-test1_2_parallel # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_4.counts not ok eps_tests-test1_3 # Error code: 14 # [sbuild:23231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23231] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f7fb64000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23250] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23250] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
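The diagnostic repeated above already names a workaround: disabling the PMIx gds/shmem2 component through the environment. A minimal sketch of what that could look like when re-running one of these MPI tests by hand inside the build chroot; the process count, test binary and option are placeholders for illustration, not taken from this log:

  export PMIX_MCA_gds=hash                      # use the hash datastore instead of gds/shmem2, as the message suggests
  mpiexec -n 2 ./test1 -eps_type krylovschur    # placeholder executable and options for a single SLEPc test run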
ok eps_tests-test1_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_5_rqcg.counts not ok eps_tests-test1_4+eps_power_shift_type-constant # Error code: 14 # [sbuild:23257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23257] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9c908000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23260] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23260] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_4 # SKIP Command failed so no diff not ok eps_tests-test1_5_rqcg # Error code: 14 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa66da000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23304] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23304] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_5_rqcg # SKIP Command failed so no diff not ok eps_tests-test1_4+eps_power_shift_type-rayleigh # Error code: 14 # [sbuild:23301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23301] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f7ff62000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23307] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23307] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
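Since the harness emits TAP, the scale of these PMIx-related failures can be tallied from a saved copy of this log. A rough sketch, assuming the log was saved as build.log and each TAP result line sits on its own line in that file:

  grep -c '^not ok ' build.log                          # tests that exited with a non-zero code
  grep -c '# SKIP Command failed so no diff' build.log  # diff steps skipped because the preceding run already failed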
ok eps_tests-test1_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_5_lobpcg.counts not ok eps_tests-test1_4+eps_power_shift_type-wilkinson # Error code: 14 not ok eps_tests-test1_5_lobpcg # Error code: 14 # [sbuild:23341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23341] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb68e3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23353] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23353] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:23348] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23348] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23348] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23348] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23348] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23348] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23348] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f93f6d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23354] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23354] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_4 # SKIP Command failed so no diff ok eps_tests-test1_5_lobpcg # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_6.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_6_lanczos.counts not ok eps_tests-test1_6_lanczos # Error code: 14 not ok eps_tests-test1_6+eps_type-krylovschur # Error code: 14 # [sbuild:23406] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23406] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23406] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23406] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23406] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23406] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23406] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbdac9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23414] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23414] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test1_6 # SKIP Command failed so no diff # [sbuild:23408] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23408] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23408] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23408] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23408] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23408] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23408] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9d1c1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23413] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23413] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_6_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_6_subspace.counts not ok eps_tests-test1_6+eps_type-arnoldi # Error code: 14 # [sbuild:23441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23441] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f88b73000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23456] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23456] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_6 # SKIP Command failed so no diff not ok eps_tests-test1_6_subspace # Error code: 14 # [sbuild:23458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23458] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa0554000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23461] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23461] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_6_subspace # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_9_ks_ghep.counts not ok eps_tests-test1_6+eps_type-gd # Error code: 14 # [sbuild:23477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23477] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23477] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f844e4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23486] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23486] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_9_ks_gnhep.counts not ok eps_tests-test1_9_ks_gnhep # Error code: 14 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f80caa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23540] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23543] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23544] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23543] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:23544] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-23540@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tests-test1_9_ks_gnhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_9_ks_ghiep.counts not ok eps_tests-test1_9_ks_ghep # Error code: 14 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa3a63000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23508] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23505] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23509] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23508] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23509] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-23505@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test1_9_ks_ghep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_9_lobpcg_ghep.counts not ok eps_tests-test1_9_ks_ghiep # Error code: 14 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f99fa0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23576] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23573] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23577] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23576] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-23573@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test1_9_ks_ghiep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_9_jd_gnhep.counts not ok eps_tests-test1_9_lobpcg_ghep # Error code: 14 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbf0a5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23602] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23608] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23607] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23607] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23608] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-23602@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tests-test1_9_lobpcg_ghep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test10_1.counts not ok eps_tests-test10_1+eps_type-krylovschur # Error code: 14 # [sbuild:23668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23668] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f90842000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23673] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23673] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
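Every mismatch above follows the same pattern: a base address around 0x2aaa... is requested but an address below 0x4000000000 is acquired, which is what one would expect if this riscv64 builder runs with Sv39 (39-bit) virtual addressing, where the requested hint is simply not mappable. One way to check the translation mode on such a host; the command is illustrative and not part of this build:

  grep -m1 '^mmu' /proc/cpuinfo   # typically reports "mmu : sv39" on hosts limited to a 39-bit user address space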
ok eps_tests-test10_1 # SKIP Command failed so no diff not ok eps_tests-test1_9_jd_gnhep # Error code: 14 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa3f86000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23642] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23643] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23642] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23643] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-23639@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test1_9_jd_gnhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test10_1_lobpcg.counts not ok eps_tests-test10_1+eps_type-arnoldi # Error code: 14 # [sbuild:23689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23689] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9de29000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23692] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23692] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test10_1 # SKIP Command failed so no diff not ok eps_tests-test10_1_lobpcg # Error code: 14 not ok eps_tests-test10_1+eps_type-gd # Error code: 14 # [sbuild:23725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23725] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f95a4d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23735] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23735] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test10_1_lobpcg # SKIP Command failed so no diff # [sbuild:23731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23731] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbbda4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23737] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23737] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test10_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test10_1_lanczos.counts not ok eps_tests-test10_1+eps_type-jd # Error code: 14 # [sbuild:23766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23766] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8afde000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23781] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23781] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test10_1 # SKIP Command failed so no diff not ok eps_tests-test10_1_lanczos # Error code: 14 # [sbuild:23778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23778] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f96caa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23784] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23784] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test10_1_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test10_1_gd2.counts not ok eps_tests-test10_1+eps_type-rqcg # Error code: 14 # [sbuild:23800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23800] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fac59e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23825] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23825] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test10_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test11_1.counts not ok eps_tests-test10_1_gd2 # Error code: 14 # [sbuild:23828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23828] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbcefa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23831] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23831] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test10_1_gd2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test11_1_ks_cayley.counts not ok eps_tests-test11_1+eps_type-krylovschur # Error code: 14 # [sbuild:23868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23868] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb733b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23885] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23885] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test11_1 # SKIP Command failed so no diff not ok eps_tests-test11_1_ks_cayley # Error code: 14 # [sbuild:23888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23888] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9c1c9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23891] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23891] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test11_1_ks_cayley # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test11_2.counts not ok eps_tests-test11_1+eps_type-arnoldi # Error code: 14 # [sbuild:23907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23907] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fabc9a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23921] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23921] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test11_1 # SKIP Command failed so no diff not ok eps_tests-test11_2 # Error code: 14 # [sbuild:23935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23935] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3faa90e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23938] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23938] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test11_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test12_1.counts not ok eps_tests-test11_1+eps_type-lapack # Error code: 14 # [sbuild:23952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23952] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8e886000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23957] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23957] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test11_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test12_1_gd.counts not ok eps_tests-test12_1+eps_type-krylovschur # Error code: 14 # [sbuild:23982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23982] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa0bfd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23986] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23986] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test12_1 # SKIP Command failed so no diff not ok eps_tests-test12_1_gd+eps_type-gd # Error code: 14 # [sbuild:24012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24012] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8b8e9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24017] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24017] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test12_1_gd # SKIP Command failed so no diff not ok eps_tests-test12_1+eps_type-subspace # Error code: 14 # [sbuild:24029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24029] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3faa3bc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24032] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24032] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test12_1 # SKIP Command failed so no diff not ok eps_tests-test12_1_gd+eps_type-jd # Error code: 14 # [sbuild:24046] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24046] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24046] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24046] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24046] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24046] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24046] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f91643000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24051] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24051] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test12_1_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test12_1_gd2.counts not ok eps_tests-test12_1+eps_type-arnoldi # Error code: 14 # [sbuild:24063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24063] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbbc43000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24066] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24066] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test12_1 # SKIP Command failed so no diff not ok eps_tests-test12_1_gd2 # Error code: 14 # [sbuild:24095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24095] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f95cc8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24104] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24104] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test12_1_gd2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test13_1.counts not ok eps_tests-test12_1+eps_type-power # Error code: 14 # [sbuild:24110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24110] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbad66000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24113] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24113] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test12_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test13_1_jd.counts not ok eps_tests-test13_1+eps_type-krylovschur # Error code: 14 # [sbuild:24152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24152] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f98cff000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24167] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24167] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test13_1 # SKIP Command failed so no diff not ok eps_tests-test13_1_jd # Error code: 14 # [sbuild:24170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24170] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb7a9f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24173] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24173] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test13_1_jd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test13_1_gd2.counts not ok eps_tests-test13_1+eps_type-gd # Error code: 14 # [sbuild:24189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24189] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f86403000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24204] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24204] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test13_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test13_2.counts not ok eps_tests-test13_1_gd2 # Error code: 14 # [sbuild:24219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24219] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fade48000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24240] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24240] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test13_1_gd2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test13_2_gd2.counts not ok eps_tests-test13_2+eps_type-krylovschur # Error code: 14 # [sbuild:24247] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24247] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24247] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24247] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24247] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24247] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24247] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb8fa9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24250] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24250] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test13_2 # SKIP Command failed so no diff not ok eps_tests-test13_2_gd2 # Error code: 14 # [sbuild:24280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24280] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f93171000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24294] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24294] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test13_2_gd2 # SKIP Command failed so no diff not ok eps_tests-test13_2+eps_type-gd # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test14_1.counts # [sbuild:24291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24291] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa06fc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24297] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24297] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test13_2 # SKIP Command failed so no diff not ok eps_tests-test14_1 # Error code: 14 not ok eps_tests-test13_2+eps_type-jd # Error code: 14 # [sbuild:24338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24338] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f92921000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24343] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24343] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:24331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24331] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f93e5b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24344] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24344] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test13_2 # SKIP Command failed so no diff ok eps_tests-test14_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test16_1.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test17_1.counts not ok eps_tests-test16_1 # Error code: 14 # [sbuild:24397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24397] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa4bcb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24403] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24403] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test16_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test17_2.counts not ok eps_tests-test17_2 # Error code: 14 # [sbuild:24436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24436] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f80be7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24439] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24439] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test17_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test18_1_ks.counts not ok eps_tests-test18_1_ks # Error code: 14 # [sbuild:24466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24466] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9048e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24469] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24469] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test18_1_ks # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test18_1_ks_gnhep.counts not ok eps_tests-test17_1 # Error code: 14 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24398] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9e2f8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24405] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:24404] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24404] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24405] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-24398@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test17_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test18_2_gd.counts not ok eps_tests-test18_1_ks_gnhep # Error code: 14 # [sbuild:24496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24496] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8ee3f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24517] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24517] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test18_1_ks_gnhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test18_2_jd.counts not ok eps_tests-test18_2_gd # Error code: 14 # [sbuild:24524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24524] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa8a38000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24527] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24527] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test18_2_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test19_1.counts not ok eps_tests-test18_2_jd # Error code: 14 # [sbuild:24558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24558] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3faa19c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24577] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:24577] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
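Note on the advisory repeated in the failures above: PMIx itself points at a concrete workaround, disabling the gds/shmem2 component through the environment. A minimal sketch of applying it before re-running the tests follows; the PMIX_MCA_gds=hash setting is quoted verbatim from the advisory, while re-driving the SLEPc tests via "make check" in the build tree is an assumption, not something recorded in this log.

    # Workaround sketch (assumption: tests are re-driven with "make check" in the SLEPc build tree)
    export PMIX_MCA_gds=hash     # value quoted from the PMIx advisory in the log above
    make check                   # hypothetical re-run of the SLEPc test harness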
ok eps_tests-test18_2_jd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_1.counts not ok eps_tests-test19_1 # Error code: 14 # [sbuild:24584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24584] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24584] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fa8327000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24587] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24587] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test19_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_1_gd2.counts not ok eps_tests-test2_1+eps_type-arnoldi # Error code: 14 # [sbuild:24618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24618] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa17de000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24637] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24637] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test2_1 # SKIP Command failed so no diff not ok eps_tests-test2_1_gd2 # Error code: 14 # [sbuild:24644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24644] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fae168000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24647] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24647] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test2_1_gd2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_1_krylovschur.counts not ok eps_tests-test2_1+eps_type-gd # Error code: 14 # [sbuild:24662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24662] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb4d75000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24667] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24667] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test2_1 # SKIP Command failed so no diff not ok eps_tests-test2_1_krylovschur+eps_krylovschur_locking-0 # Error code: 14 # [sbuild:24691] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24691] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24691] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24691] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24691] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24691] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24691] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbaf24000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24694] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24694] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test2_1_krylovschur # SKIP Command failed so no diff not ok eps_tests-test2_1+eps_type-jd # Error code: 14 # [sbuild:24708] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24708] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24708] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24708] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24708] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24708] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24708] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8f9ed000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24721] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24721] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test2_1 # SKIP Command failed so no diff not ok eps_tests-test2_1_krylovschur+eps_krylovschur_locking-1 # Error code: 14 # [sbuild:24725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24725] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb9bc6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24728] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24728] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test2_1_krylovschur # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_1_scalapack.counts not ok eps_tests-test2_1+eps_type-lapack # Error code: 14 # [sbuild:24744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24744] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbdace000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24765] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24765] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test2_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_2.counts not ok eps_tests-test2_1_scalapack # Error code: 14 # [sbuild:24772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24772] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8cc8f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24775] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24775] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test2_1_scalapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_2_selective.counts not ok eps_tests-test2_2+eps_lanczos_reorthog-local # Error code: 14 # [sbuild:24804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24804] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb6f95000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24825] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24825] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test2_2 # SKIP Command failed so no diff not ok eps_tests-test2_2_selective # Error code: 14 # [sbuild:24832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24832] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f8d95d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24835] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24835] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
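A likely reading of the repeated Requested/Acquired address pairs: every requested base (0x2aaab0000000, 0x2aaac8000000, ...) lies above 0x4000000000, the 256 GiB user address-space ceiling a riscv64 kernel exposes when running with Sv39 page tables, so the kernel cannot honour the mmap hint and gds/shmem2 is handed a different address (the 0x3f...000 values just below that ceiling). Whether this buildd actually runs Sv39 is not recorded here; a small shell check of the arithmetic, under that assumption:

    # Assumption: the riscv64 buildd uses Sv39 paging (user VA ends at 0x4000000000)
    requested=0x2aaac8000000   # "Requested Address" from the eps_tests-test16_1 failure above
    sv39_limit=0x4000000000    # assumed Sv39 user-space ceiling (256 GiB)
    if [ $((requested)) -ge $((sv39_limit)) ]; then
        echo "requested shmem2 base lies outside the Sv39 user address space"
    fi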
ok eps_tests-test2_2_selective # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_3.counts not ok eps_tests-test2_2+eps_lanczos_reorthog-full # Error code: 14 # [sbuild:24850] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24850] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24850] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24850] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24850] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24850] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24850] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa1b9f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24854] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24854] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test2_2 # SKIP Command failed so no diff not ok eps_tests-test2_3+eps_type-krylovschur # Error code: 14 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f899be000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24879] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:24883] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:24882] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24882] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:24883] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-24879@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test2_3 # SKIP Command failed so no diff not ok eps_tests-test2_2+eps_lanczos_reorthog-periodic # Error code: 14 # [sbuild:24897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24897] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f98711000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24900] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24900] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test2_2 # SKIP Command failed so no diff not ok eps_tests-test2_3+eps_type-lapack # Error code: 14 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f91c8d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24916] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:24920] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:24919] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24920] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24919] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-24916@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test2_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_3_gd.counts not ok eps_tests-test2_2+eps_lanczos_reorthog-partial # Error code: 14 # [sbuild:24934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24934] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fabeef000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24941] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24941] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test2_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_3_jd.counts not ok eps_tests-test2_3_jd # Error code: 14 # [sbuild:24997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24997] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f83bb9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25004] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25004] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test2_3_jd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test20_1.counts not ok eps_tests-test2_3_gd # Error code: 14 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:24966] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa9e31000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24972] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:24971] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24971] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24972] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-24966@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:24966] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171 # ok eps_tests-test2_3_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test20_1_lanczos.counts not ok eps_tests-test20_1+eps_type-krylovschur # Error code: 14 # [sbuild:25031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25031] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25031] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f98573000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25034] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25034] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test20_1 # SKIP Command failed so no diff not ok eps_tests-test20_1_lanczos # Error code: 14 # [sbuild:25059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25059] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3faf3ae000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25064] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25064] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test20_1_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test21_1.counts not ok eps_tests-test20_1+eps_type-arnoldi # Error code: 14 # [sbuild:25076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25076] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9cac0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25079] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25079] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test20_1 # SKIP Command failed so no diff not ok eps_tests-test21_1 # Error code: 14 # [sbuild:25108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25108] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f82b02000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25123] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25123] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test21_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test22_1.counts not ok eps_tests-test20_1+eps_type-gd # Error code: 14 # [sbuild:25122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25122] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbb7ab000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25126] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25126] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test20_1 # SKIP Command failed so no diff not ok eps_tests-test20_1+eps_type-jd # Error code: 14 not ok eps_tests-test22_1+eps_true_residual-0 # Error code: 14 # [sbuild:25164] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25164] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25164] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25164] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25164] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25164] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25164] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb3946000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25172] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25172] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:25167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25167] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fab196000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25173] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25173] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test20_1 # SKIP Command failed so no diff ok eps_tests-test22_1 # SKIP Command failed so no diff not ok eps_tests-test22_1+eps_true_residual-1 # Error code: 14 # [sbuild:25201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25201] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbc249000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25204] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25204] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test22_1 # SKIP Command failed so no diff not ok eps_tests-test20_1+eps_type-rqcg # Error code: 14 # [sbuild:25200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25200] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8e956000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25207] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25207] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test20_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test22_2.counts not ok eps_tests-test20_1+eps_type-lobpcg # Error code: 14 # [sbuild:25241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25241] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f81d61000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25251] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25251] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test20_1 # SKIP Command failed so no diff not ok eps_tests-test22_2 # Error code: 14 # [sbuild:25248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25248] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fba9e6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25254] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25254] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test22_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test22_3.counts not ok eps_tests-test20_1+eps_type-lapack # Error code: 14 # [sbuild:25273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25273] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa7e7f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25295] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25295] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test20_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test23_1.counts not ok eps_tests-test22_3+bv_orthog_block-gs # Error code: 14 # [sbuild:25298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25298] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb6374000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25301] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25301] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
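Note on the failures above: every "not ok ... # Error code: 14" in this run has the same cause, PMIx_Init() failing inside the sbuild environment before MPI_Init_thread completes, after which the harness marks the case as "# SKIP Command failed so no diff". The PMIX error text itself names two possible workarounds; the sketch below shows how they could be applied when re-running the tests by hand. It is only a sketch: the export line repeats the log's own advice, while the mpirun invocation, process count, binary name and option are illustrative assumptions, not commands taken from this build.

    # Fall back from the gds/shmem2 component (which fails to attach at the
    # requested base address on this riscv64 builder) to the hash component:
    export PMIX_MCA_gds=hash

    # Alternatively, launch the test executable through mpirun so a local
    # PMIx server is available to the processes (hypothetical invocation):
    mpirun -n 2 ./test2 -eps_type jd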
ok eps_tests-test22_3 # SKIP Command failed so no diff not ok eps_tests-test23_1 # Error code: 14 not ok eps_tests-test22_3+bv_orthog_block-tsqr # Error code: 14 # [sbuild:25342] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25342] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25342] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25342] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25342] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25342] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25342] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fa659c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25348] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25348] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:25341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25341] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f89e37000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25347] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25347] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test23_1 # SKIP Command failed so no diff ok eps_tests-test22_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test23_2.counts not ok eps_tests-test22_3+bv_orthog_block-chol # Error code: 14 # [sbuild:25376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25376] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9a96f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25392] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25392] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test22_3 # SKIP Command failed so no diff not ok eps_tests-test23_2 # Error code: 14 # [sbuild:25389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25389] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8f714000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25395] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25395] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test23_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test23_3.counts not ok eps_tests-test22_3+bv_orthog_block-tsqrchol # Error code: 14 # [sbuild:25412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25412] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa7fba000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25436] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25436] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test22_3 # SKIP Command failed so no diff not ok eps_tests-test23_3 # Error code: 14 # [sbuild:25439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25439] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f94bee000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25442] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25442] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test23_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test24_1.counts not ok eps_tests-test22_3+bv_orthog_block-svqb # Error code: 14 # [sbuild:25458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25458] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb8961000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25475] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25475] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test22_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test25_1.counts not ok eps_tests-test24_1 # Error code: 14 # [sbuild:25486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25486] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25486] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9f490000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25489] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25489] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test24_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test26_1.counts not ok eps_tests-test25_1 # Error code: 14 # [sbuild:25518] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25518] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25518] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25518] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25518] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25518] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25518] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbea89000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25534] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25534] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test25_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test26_1_arpack.counts not ok eps_tests-test26_1+eps_true_residual-0_eps_two_sided-0 # Error code: 14 # [sbuild:25546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25546] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f84a37000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25549] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25549] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test26_1 # SKIP Command failed so no diff not ok eps_tests-test26_1_arpack # Error code: 14 # [sbuild:25578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25578] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fba167000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25593] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25593] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test26_1_arpack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test27_1.counts not ok eps_tests-test26_1+eps_true_residual-0_eps_two_sided-1 # Error code: 14 # [sbuild:25592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25592] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f83090000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25596] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25596] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test26_1 # SKIP Command failed so no diff not ok eps_tests-test26_1+eps_true_residual-1_eps_two_sided-0 # Error code: 14 # [sbuild:25631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25631] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9e999000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25640] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25640] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test26_1 # SKIP Command failed so no diff not ok eps_tests-test27_1+eps_type-gd # Error code: 14 # [sbuild:25637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25637] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb20ad000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25643] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25643] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test27_1 # SKIP Command failed so no diff not ok eps_tests-test27_1+eps_type-jd # Error code: 14 not ok eps_tests-test26_1+eps_true_residual-1_eps_two_sided-1 # Error code: 14 # [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb57c5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25677] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25677] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
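[Sketch, not part of the captured build output.] The "Requested Address" / "Acquired Address" pairs in these blocks show the kernel declining a shared-memory mapping hint: the requested bases (around 0x2aaab0000000 and above) lie far beyond the user address range actually available on this riscv64 builder, while the acquired addresses all fall below 0x4000000000, so mmap places the segment elsewhere and gds/shmem2 gives up. A hypothetical C sketch of that mismatch, using an address taken from the log, under the assumption that the hint simply cannot be honoured on this machine:

    /* Hypothetical illustration of a mapping hint the kernel cannot honour. */
    #include <stdio.h>
    #include <sys/mman.h>

    int main(void)
    {
        void *requested = (void *)0x2aaac4000000UL;   /* base seen in the log */
        void *acquired  = mmap(requested, 1 << 20, PROT_READ | PROT_WRITE,
                               MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        /* Without MAP_FIXED the hint is only advisory, so the kernel may return
         * a different address, which is the condition gds/shmem2 reports above. */
        if (acquired != MAP_FAILED && acquired != requested)
            printf("requested %p, acquired %p\n", requested, acquired);
        if (acquired != MAP_FAILED)
            munmap(acquired, 1 << 20);
        return 0;
    }

This is only meant to make the repeated diagnostic easier to read; the exact reason the hint is unavailable on this host (address-space layout, paging mode, or an existing mapping) is not established by the log itself.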
# -------------------------------------------------------------------------- # # [sbuild:25663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25663] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb88e1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25676] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25676] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test27_1 # SKIP Command failed so no diff ok eps_tests-test26_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test27_2.counts not ok eps_tests-test27_1+eps_type-rqcg # Error code: 14 # [sbuild:25704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25704] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25704] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f97d2e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25721] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25721] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test27_1 # SKIP Command failed so no diff not ok eps_tests-test27_2+st_filter_type-filtlan # Error code: 14 # [sbuild:25718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25718] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa48f9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25724] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25724] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test27_2 # SKIP Command failed so no diff not ok eps_tests-test27_2+st_filter_type-chebyshev # Error code: 14 not ok eps_tests-test27_1+eps_type-lobpcg # Error code: 14 # [sbuild:25752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25752] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa8f24000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25758] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25758] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test27_2 # SKIP Command failed so no diff # [sbuild:25740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25740] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb6d5f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25756] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25756] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test27_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test28_1.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test28_1_jd.counts not ok eps_tests-test28_1_jd # Error code: 14 not ok eps_tests-test28_1+eps_type-krylovschur # Error code: 14 # [sbuild:25812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25812] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8459f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25817] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25817] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:25811] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25811] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25811] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25811] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25811] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25811] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25811] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f955b7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25818] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25818] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test28_1 # SKIP Command failed so no diff ok eps_tests-test28_1_jd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test28_1_lanczos.counts not ok eps_tests-test28_1+eps_type-arnoldi # Error code: 14 # [sbuild:25845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25845] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa3d4c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25860] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25860] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test28_1 # SKIP Command failed so no diff not ok eps_tests-test28_1_lanczos # Error code: 14 # [sbuild:25862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25862] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3facdfa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25865] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25865] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test28_1_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test28_2.counts not ok eps_tests-test28_1+eps_type-gd # Error code: 14 # [sbuild:25881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25881] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9643a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25905] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25905] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test28_1 # SKIP Command failed so no diff not ok eps_tests-test28_2+eps_type-power # Error code: 14 # [sbuild:25909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25909] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fad61d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25912] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25912] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test28_2 # SKIP Command failed so no diff not ok eps_tests-test28_1+eps_type-rqcg # Error code: 14 # [sbuild:25928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25928] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f93408000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25943] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25943] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test28_1 # SKIP Command failed so no diff not ok eps_tests-test28_2+eps_type-subspace # Error code: 14 # [sbuild:25942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25942] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f90e32000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25946] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25946] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test28_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test28_3.counts not ok eps_tests-test28_1+eps_type-lobpcg # Error code: 14 # [sbuild:25962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25962] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbd54f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25983] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25983] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test28_1 # SKIP Command failed so no diff not ok eps_tests-test28_3 # Error code: 14 # [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9bd3f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:25993] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:25993] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test28_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test29_1.counts not ok eps_tests-test28_1+eps_type-lapack # Error code: 14 # [sbuild:26009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26009] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f915e5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26020] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26020] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test28_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test29_1_cmplxvals.counts not ok eps_tests-test29_1 # Error code: 14 # [sbuild:26037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26037] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb9060000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26040] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26040] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test29_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test29_1_rqi.counts not ok eps_tests-test29_1_cmplxvals # Error code: 14 # [sbuild:26069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26069] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f958f9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26089] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26089] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test29_1_cmplxvals # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test29_1_rqi_singular.counts not ok eps_tests-test29_1_rqi # Error code: 14 # [sbuild:26097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26097] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f95855000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26100] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26100] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test29_1_rqi # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test29_3.counts not ok eps_tests-test29_1_rqi_singular # Error code: 14 # [sbuild:26129] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26129] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26129] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26129] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26129] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26129] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26129] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f98a79000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26143] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26143] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test29_1_rqi_singular # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1.counts not ok eps_tests-test29_3 # Error code: 14 # [sbuild:26157] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26157] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26157] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26157] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26157] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26157] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26157] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb6fec000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26160] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26160] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test29_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_lanczos.counts not ok eps_tests-test3_1+eps_type-krylovschur # Error code: 14 # [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb9027000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26207] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26207] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test3_1 # SKIP Command failed so no diff not ok eps_tests-test3_1_lanczos # Error code: 14 # [sbuild:26217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26217] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8b900000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26220] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26220] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test3_1_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_power.counts not ok eps_tests-test3_1+eps_type-subspace # Error code: 14 # [sbuild:26234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26234] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f90a5b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26239] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26239] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test3_1 # SKIP Command failed so no diff not ok eps_tests-test3_1_power # Error code: 14 # [sbuild:26264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26264] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9a369000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26267] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26267] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test3_1_power # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_jd.counts not ok eps_tests-test3_1+eps_type-arnoldi # Error code: 14 # [sbuild:26281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26281] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f806b3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26284] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26284] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test3_1 # SKIP Command failed so no diff not ok eps_tests-test3_1_jd # Error code: 14 # [sbuild:26312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26312] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9afca000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26316] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26316] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test3_1_jd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_gd.counts not ok eps_tests-test3_1+eps_type-lapack # Error code: 14 # [sbuild:26328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26328] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9ab29000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26331] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26331] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test3_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_gd2.counts not ok eps_tests-test3_1_gd # Error code: 14 # [sbuild:26363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26363] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb0e44000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26381] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26381] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test3_1_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_arpack.counts not ok eps_tests-test3_1_gd2 # Error code: 14 # [sbuild:26388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26388] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb3f79000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26391] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26391] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test3_1_gd2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_scalapack.counts not ok eps_tests-test3_1_arpack # Error code: 14 # [sbuild:26423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26423] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9159a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26441] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26441] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test3_1_arpack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_2_rqcg.counts not ok eps_tests-test3_1_scalapack # Error code: 14 # [sbuild:26448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26448] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8b75b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26451] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26451] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
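
The test cases above all fail with the same pattern: the PMIx gds/shmem2 component cannot attach its shared-memory segment at the requested base address, PMIx_Init then fails, and the client aborts before MPI_Init completes, so TAP records "not ok ... # Error code: 14" followed by "ok ... # SKIP Command failed so no diff". A minimal sketch of the workaround that the error text itself proposes, assuming one of the failing binaries were re-run by hand in the build tree (the binary name and option below are guessed from the test name, not taken from this log):

    export PMIX_MCA_gds=hash        # disable the gds/shmem2 component, as the PMIx message suggests
    ./test3 -eps_type krylovschur   # hypothetical re-run of the eps_tests-test3_1 case
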
ok eps_tests-test3_1_scalapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_2_lobpcg.counts not ok eps_tests-test3_2_rqcg # Error code: 14 # [sbuild:26482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26482] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa7e51000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26501] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26501] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test3_2_rqcg # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_2_lanczos.counts not ok eps_tests-test3_2_lobpcg # Error code: 14 # [sbuild:26508] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26508] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26508] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26508] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26508] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26508] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26508] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbdb12000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26511] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26511] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test3_2_lobpcg # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_2_lanczos_delayed.counts not ok eps_tests-test3_2_lanczos # Error code: 14 # [sbuild:26541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26541] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f82b7a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26561] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26561] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test3_2_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test30_1.counts not ok eps_tests-test3_2_lanczos_delayed # Error code: 14 # [sbuild:26568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26568] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f83653000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26571] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26571] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test3_2_lanczos_delayed # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test31_1.counts not ok eps_tests-test30_1 # Error code: 14 # [sbuild:26603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26603] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9a148000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26621] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26621] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test30_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test31_2.counts not ok eps_tests-test31_1 # Error code: 14 # [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8e5c1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26631] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26631] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test31_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test31_3.counts not ok eps_tests-test31_2 # Error code: 14 # [sbuild:26665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26665] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbb4e3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26682] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26682] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test31_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test31_4.counts not ok eps_tests-test31_3 # Error code: 14 # [sbuild:26688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26688] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8a6f9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26691] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26691] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
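
One observation on the addresses above (an inference, not something stated in this log): every "Requested Address" lies in the 0x2aaaXXXXXXXX range, tens of terabytes into the virtual address space, while every "Acquired Address" sits just below 0x4000000000, the 256 GiB ceiling of a riscv64 sv39 user address space. If this builder's kernel maps user space with sv39 paging, the mmap hint can never be honoured, which would explain why the mismatch reproduces for every test. A quick arithmetic check of that reading, using the requested/acquired pair from the eps_tests-test31_3 failure above (the sv39 ceiling is the only assumed value):

    echo $(( 0x2aaab4000000 > 0x4000000000 ))   # prints 1: the requested hint lies above the sv39 user range
    echo $(( 0x3f8a6f9000  < 0x4000000000 ))    # prints 1: the acquired address sits just below that ceiling
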
ok eps_tests-test31_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test31_5.counts not ok eps_tests-test31_4+st_filter_damping-none # Error code: 14 # [sbuild:26724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26724] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa3831000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26743] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26743] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test31_4 # SKIP Command failed so no diff not ok eps_tests-test31_5+st_filter_damping-lanczos # Error code: 14 # [sbuild:26748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26748] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f932f7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26751] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26751] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test31_5 # SKIP Command failed so no diff not ok eps_tests-test31_4+st_filter_damping-jackson # Error code: 14 # [sbuild:26767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26767] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8e307000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26774] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26774] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test31_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_1.counts not ok eps_tests-test31_5+st_filter_damping-fejer # Error code: 14 # [sbuild:26782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26782] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8cb0e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26785] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26785] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test31_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_2.counts not ok eps_tests-test32_1 # Error code: 14 # [sbuild:26824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26824] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f83f8e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26839] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26839] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test32_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_3.counts not ok eps_tests-test32_2 # Error code: 14 # [sbuild:26842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26842] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa4a65000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26845] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26845] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test32_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_3_gnhep.counts not ok eps_tests-test32_3+nsize-1 # Error code: 14 # [sbuild:26880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26880] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f86cf4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26898] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26898] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test32_3 # SKIP Command failed so no diff not ok eps_tests-test32_3_gnhep+nsize-1 # Error code: 14 # [sbuild:26902] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26902] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26902] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26902] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26902] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26902] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26902] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbb01f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26905] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26905] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test32_3_gnhep # SKIP Command failed so no diff not ok eps_tests-test32_3+nsize-4 # Error code: 14 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8c764000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26932] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26932] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26937] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26921] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26936] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26936] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26937] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test32_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_4.counts not ok eps_tests-test32_4+nsize-1 # Error code: 14 # [sbuild:26986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26986] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9c50f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26989] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26989] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test32_4 # SKIP Command failed so no diff not ok eps_tests-test32_3_gnhep+nsize-4 # Error code: 14 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbeabe000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26942] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26939] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26944] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26945] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26943] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26942] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26945] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26944] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26943] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-26939@1,3] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test32_3_gnhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_4_gnhep.counts not ok eps_tests-test32_4+nsize-4 # Error code: 14 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f8effd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27008] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27006] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27009] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27008] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27009] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27003] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27006] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:27007] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 ok eps_tests-test32_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_5_redundant.counts not ok eps_tests-test32_4_gnhep+nsize-1 # Error code: 14 # [sbuild:27042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27042] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb2042000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27058] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27058] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test32_4_gnhep # SKIP Command failed so no diff not ok eps_tests-test32_5_redundant # Error code: 14 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9cf4e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27073] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27074] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27070] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27075] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27073] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27075] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27074] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-27070@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test32_5_redundant # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_5_mumps.counts not ok eps_tests-test32_4_gnhep+nsize-4 # Error code: 14 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f848f4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27090] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27099] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27098] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27100] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27101] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27099] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27098] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27100] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27101] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-27090@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test32_4_gnhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_5_superlu.counts not ok eps_tests-test32_5_superlu # Error code: 14 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9375c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27170] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27173] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27175] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27174] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27175] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27173] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27174] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-27170@1,2] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test32_5_superlu # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test37_1.counts not ok eps_tests-test32_5_mumps # Error code: 14 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa0068000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27137] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27134] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27138] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27139] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** 
on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27137] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27139] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27138] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-27134@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test32_5_mumps # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test38_1.counts not ok eps_tests-test37_1 # Error code: 14 # [sbuild:27206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27206] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f84f49000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27209] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27209] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test37_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test39_1.counts not ok eps_tests-test38_1 # Error code: 14 # [sbuild:27234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27234] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fafc6f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27237] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27237] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test38_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test39_1_lanczos.counts not ok eps_tests-test39_1+eps_type-krylovschur # Error code: 14 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f80196000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27265] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27270] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27269] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27270] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27269] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-27265@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test39_1 # SKIP Command failed so no diff not ok eps_tests-test39_1+eps_type-arnoldi # Error code: 14 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9a08d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27315] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27318] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27319] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27319] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27318] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-27315@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tests-test39_1 # SKIP Command failed so no diff not ok eps_tests-test39_1_lanczos # Error code: 14 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f91f80000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27299] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27298] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27299] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27298] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-27295@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test39_1_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test4_1.counts not ok eps_tests-test4_1+type-krylovschur # Error code: 14 # [sbuild:27372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27372] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa805f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27375] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27375] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test39_1+eps_type-lobpcg # Error code: 14 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f877f6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27339] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27343] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27342] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27343] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27342] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:27339] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-27339@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tests-test39_1 # SKIP Command failed so no diff not ok eps_tests-test4_1+type-subspace # Error code: 14 # [sbuild:27389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27389] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9c035000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27392] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27392] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test4_1+type-arnoldi # Error code: 14 # [sbuild:27422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27422] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9c972000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27429] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27429] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test39_1+eps_type-lapack # Error code: 14 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa9723000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27407] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27404] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27408] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** and MPI will try to terminate your MPI job as well) # [sbuild:27407] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-27404@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test39_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test4_1_arpack.counts not ok eps_tests-test4_1+type-lanczos # Error code: 14 # [sbuild:27443] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27443] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27443] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27443] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27443] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27443] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27443] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8ad3e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27446] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27446] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test4_1_arpack # Error code: 14 # [sbuild:27475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27475] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27475] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f970c6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27488] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27488] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test4_1_arpack # SKIP Command failed so no diff not ok eps_tests-test4_1+type-gd # Error code: 14 # [sbuild:27485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27485] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb7f68000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27491] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27491] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test4_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test4_2.counts not ok eps_tests-test4_1+type-jd # Error code: 14 not ok eps_tests-test4_2+type-rqcg # Error code: 14 # [sbuild:27532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27532] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fac929000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27538] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27538] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:27525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27525] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fad0f4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27535] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27535] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test4_2 # SKIP Command failed so no diff ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test4_2+type-lobpcg # Error code: 14 # [sbuild:27565] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27565] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27565] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27565] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27565] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27565] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27565] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f94554000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27569] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27569] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test4_2 # SKIP Command failed so no diff not ok eps_tests-test4_1+type-gd2 # Error code: 14 # [sbuild:27566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27566] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f92826000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27572] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27572] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test4_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test40_1.counts not ok eps_tests-test4_1+type-lapack # Error code: 14 # [sbuild:27605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27605] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f88593000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27616] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27616] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test4_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test44_1_real.counts not ok eps_tests-test40_1 # Error code: 14 # [sbuild:27613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27613] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f81788000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27619] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27619] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test40_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_1.counts ok eps_tests-test5_1 # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_1_power.counts ok eps_tests-test5_1_power # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_1_jd.counts ok eps_tests-test5_1_jd # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_1_gd.counts ok eps_tests-test5_1_gd # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_1_gd2.counts ok eps_tests-test5_1_gd2 # SKIP Requires DATAFILESPATH not ok eps_tests-test44_1_real+eps_krylovschur_bse_type-shao # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_2_arpack.counts # [sbuild:27666] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27666] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27666] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27666] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27666] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27666] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27666] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8fff1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27681] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27681] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test44_1_real # SKIP Command failed so no diff ok eps_tests-test5_2_arpack # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test6_1.counts not ok eps_tests-test44_1_real+eps_krylovschur_bse_type-gruning # Error code: 14 # [sbuild:27749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27749] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f897f0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27764] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27764] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test44_1_real # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test6_1_power.counts not ok eps_tests-test6_1+eps_type-krylovschur # Error code: 14 # [sbuild:27771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27771] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbcc50000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27774] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27774] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test6_1 # SKIP Command failed so no diff not ok eps_tests-test6_1_power # Error code: 14 # [sbuild:27806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27806] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8b630000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27818] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27818] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test6_1_power # SKIP Command failed so no diff not ok eps_tests-test6_1+eps_type-subspace # Error code: 14 # [sbuild:27815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27815] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f95716000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27821] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27821] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test6_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test6_1_gd2.counts not ok eps_tests-test6_1+eps_type-arnoldi # Error code: 14 not ok eps_tests-test6_1_gd2 # Error code: 14 # [sbuild:27855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27855] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f88a65000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27865] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27865] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test6_1 # SKIP Command failed so no diff # [sbuild:27862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27862] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27862] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9be38000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27868] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27868] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test6_1_gd2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test6_1_arpack.counts not ok eps_tests-test6_1+eps_type-gd # Error code: 14 # [sbuild:27895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27895] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9b5be000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27912] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27912] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test6_1 # SKIP Command failed so no diff not ok eps_tests-test6_1_arpack # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1.counts # [sbuild:27909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27909] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa0941000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27915] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27915] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test6_1_arpack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_lanczos.counts not ok eps_tests-test8_1+eps_type-power # Error code: 14 # [sbuild:27962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27962] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fa1f6e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27972] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27972] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test8_1 # SKIP Command failed so no diff not ok eps_tests-test8_1_lanczos # Error code: 14 # [sbuild:27969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27969] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbd903000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27975] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27975] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test8_1_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_lapack.counts not ok eps_tests-test8_1+eps_type-subspace # Error code: 14 # [sbuild:27995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27995] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8412d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28012] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28012] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test8_1 # SKIP Command failed so no diff not ok eps_tests-test8_1_lapack # Error code: 14 # [sbuild:28019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28019] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f990c0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28022] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28022] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test8_1_lapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_krylovschur_vecs.counts not ok eps_tests-test8_1+eps_type-arnoldi # Error code: 14 # [sbuild:28037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28037] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fad775000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28064] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28064] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test8_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_jd.counts not ok eps_tests-test8_1_krylovschur_vecs # Error code: 14 # [sbuild:28066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28066] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa894f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28069] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28069] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test8_1_krylovschur_vecs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_gd.counts not ok eps_tests-test8_1_jd # Error code: 14 # [sbuild:28110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28110] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb6917000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28125] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28125] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test8_1_jd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_gd2.counts not ok eps_tests-test8_1_gd # Error code: 14 # [sbuild:28126] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28126] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28126] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28126] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28126] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28126] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28126] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb6d1d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28129] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28129] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test8_1_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_2.counts not ok eps_tests-test8_1_gd2 # Error code: 14 # [sbuild:28171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28171] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f823b4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28186] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28186] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test8_1_gd2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_2_lanczos.counts not ok eps_tests-test8_2+eps_type-rqcg # Error code: 14 # [sbuild:28185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28185] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f92a26000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28189] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28189] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test8_2 # SKIP Command failed so no diff not ok eps_tests-test8_2+eps_type-lobpcg # Error code: 14 # [sbuild:28225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28225] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb1acd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28233] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28233] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test8_2 # SKIP Command failed so no diff not ok eps_tests-test8_2_lanczos # Error code: 14 # [sbuild:28230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28230] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28230] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa6706000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28236] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28236] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test8_2_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_2_arpack.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_3_rqcg.counts not ok eps_tests-test8_2_arpack # Error code: 14 # [sbuild:28285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28285] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8631c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28293] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28293] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test8_2_arpack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_3_lanczos.counts not ok eps_tests-test8_3_rqcg # Error code: 14 # [sbuild:28290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28290] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa68be000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28296] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28296] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test8_3_rqcg # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_3_lobpcg.counts not ok eps_tests-test8_3_lanczos # Error code: 14 # [sbuild:28343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28343] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb512e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28353] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28353] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test8_3_lanczos # SKIP Command failed so no diff not ok eps_tests-test8_3_lobpcg # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_1.counts # [sbuild:28350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28350] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3faba4c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28356] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28356] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test8_3_lobpcg # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_1_gd.counts not ok eps_tests-test9_1+eps_type-krylovschur # Error code: 14 # [sbuild:28403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28403] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fba790000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28413] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28413] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_1 # SKIP Command failed so no diff not ok eps_tests-test9_1_gd # Error code: 14 # [sbuild:28410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28410] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9217e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28416] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28416] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_1_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_1_gd2.counts not ok eps_tests-test9_1+eps_type-arnoldi # Error code: 14 # [sbuild:28434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28434] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9f459000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28453] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28453] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_1 # SKIP Command failed so no diff not ok eps_tests-test9_1_gd2 # Error code: 14 # [sbuild:28460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28460] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb43e4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28463] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28463] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_1_gd2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_2.counts not ok eps_tests-test9_1+eps_type-lapack # Error code: 14 # [sbuild:28478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28478] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9c418000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28495] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28495] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_3.counts not ok eps_tests-test9_2+eps_balance-none_eps_krylovschur_locking-0 # Error code: 14 # [sbuild:28507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28507] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3faeda0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28510] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28510] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_2 # SKIP Command failed so no diff not ok eps_tests-test9_2+eps_balance-none_eps_krylovschur_locking-1 # Error code: 14 # [sbuild:28553] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28553] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28553] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28553] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28553] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28553] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28553] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fadfdc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28558] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28558] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
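The same blocks also carry Open MPI's other hint: the job needs a reachable local PMIx server, otherwise the executable should be started through mpirun so the launcher provides one. A rough illustration only; the binary name, rank count and option below are assumptions made for the example, not taken from this log:

    # Sketch: let mpirun supply the PMIx server for a single rank
    # (test binary and the -eps_type value are hypothetical here).
    mpirun -n 1 ./test8 -eps_type lanczos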
ok eps_tests-test9_2 # SKIP Command failed so no diff not ok eps_tests-test9_2+eps_balance-oneside_eps_krylovschur_locking-0 # Error code: 14 # [sbuild:28576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28576] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8c7c4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28579] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28579] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_2 # SKIP Command failed so no diff not ok eps_tests-test9_3+bv_orthog_refine-never # Error code: 14 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb9790000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28539] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:28554] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:28555] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28554] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28555] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-28539@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test9_3 # SKIP Command failed so no diff not ok eps_tests-test9_2+eps_balance-oneside_eps_krylovschur_locking-1 # Error code: 14 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f913be000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28612] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28612] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test9_2 # SKIP Command failed so no diff not ok eps_tests-test9_2+eps_balance-twoside_eps_krylovschur_locking-0 # Error code: 14 # [sbuild:28630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28630] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f88616000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28633] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28633] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_2 # SKIP Command failed so no diff not ok eps_tests-test9_3+bv_orthog_refine-ifneeded # Error code: 14 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28605] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f85517000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28610] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:28611] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28610] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28611] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-28605@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tests-test9_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_4.counts not ok eps_tests-test9_2+eps_balance-twoside_eps_krylovschur_locking-1 # Error code: 14 # [sbuild:28647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28647] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8fb63000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28652] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28652] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_5.counts not ok eps_tests-test9_4 # Error code: 14 # [sbuild:28675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28675] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9ba42000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28678] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28678] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_5_arpack.counts not ok eps_tests-test9_5 # Error code: 14 # [sbuild:28706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28706] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb90ec000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28711] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28711] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6.counts not ok eps_tests-test9_5_arpack # Error code: 14 # [sbuild:28735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28735] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f841f2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28738] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28738] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test9_5_arpack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_hankel.counts not ok eps_tests-test9_6 # Error code: 14 # [sbuild:28766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28766] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28766] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9eb65000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28770] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28770] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_cheby.counts not ok eps_tests-test9_6_hankel # Error code: 14 # [sbuild:28795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28795] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8d82c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28798] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28798] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_6_hankel # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_hankel_cheby.counts not ok eps_tests-test9_6_cheby # Error code: 14 # [sbuild:28826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28826] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f84fcc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28832] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28832] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_6_cheby # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_refine.counts not ok eps_tests-test9_6_hankel_cheby # Error code: 14 # [sbuild:28855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28855] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbe3a1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28858] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28858] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_6_hankel_cheby # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_bcgs.counts not ok eps_tests-test9_6_refine # Error code: 14 # [sbuild:28886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28886] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fad6ee000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28892] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28892] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
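(The workaround named in the PMIX messages above is an environment variable; a minimal sketch of applying it when re-running one of these failing tests by hand, assuming a shell inside the build environment. Only PMIX_MCA_gds=hash is taken from the message itself; the binary name and process count are illustrative.)
    export PMIX_MCA_gds=hash    # disable the gds/shmem2 component, as the error text suggests
    mpirun -n 2 ./test9         # hypothetical re-run of one of the failing eps_tests binaries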
ok eps_tests-test9_6_refine # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_cheby_interval.counts not ok eps_tests-test9_6_bcgs # Error code: 14 # [sbuild:28915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28915] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f82e5c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28918] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28918] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_6_bcgs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_7_real.counts not ok eps_tests-test9_6_cheby_interval # Error code: 14 # [sbuild:28946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28946] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28946] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f96161000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28952] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28952] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test9_6_cheby_interval # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_8.counts not ok eps_tests-test9_7_real # Error code: 14 # [sbuild:28975] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28975] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28975] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28975] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28975] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28975] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28975] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9f7c3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28978] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28978] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_7_real # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10_1_sinvert.counts not ok eps_tests-test9_8 # Error code: 14 # [sbuild:29006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29006] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9817c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29012] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29012] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_8 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10_1_sinvert_twoside.counts not ok eps_tutorials-ex10_1_sinvert # Error code: 14 # [sbuild:29035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29035] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9775e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29038] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29038] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex10_1_sinvert # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10_1_shell.counts not ok eps_tutorials-ex10_1_sinvert_twoside+set_ht-0 # Error code: 14 # [sbuild:29066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29066] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29066] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f86159000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29071] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29071] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex10_1_sinvert_twoside # SKIP Command failed so no diff not ok eps_tutorials-ex10_1_shell # Error code: 14 # [sbuild:29095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29095] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb2903000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29098] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29098] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
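(The repeated PMIx_Init failure above states that Open MPI needs a local PMIx server or an "mpirun" launch; a minimal sketch of the latter, assuming the test program would otherwise be started as a bare singleton. The program name is illustrative; only PMIX_MCA_gds=hash and the use of mpirun come from the log messages.)
    PMIX_MCA_gds=hash mpirun -n 1 ./ex10    # hypothetical: launching under mpirun provides the local PMIx server the message asks for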
ok eps_tutorials-ex10_1_shell # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10_1_shell_twoside.counts not ok eps_tutorials-ex10_1_sinvert_twoside+set_ht-1 # Error code: 14 # [sbuild:29112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29112] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9ecc4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29115] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29115] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex10_1_sinvert_twoside # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex11_1.counts not ok eps_tutorials-ex10_1_shell_twoside+set_ht-0 # Error code: 14 # [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9b1f1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29147] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29147] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex10_1_shell_twoside # SKIP Command failed so no diff not ok eps_tutorials-ex11_1 # Error code: 14 # [sbuild:29172] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29172] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29172] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29172] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29172] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29172] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29172] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fab516000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29175] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29175] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex11_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex11_2.counts not ok eps_tutorials-ex10_1_shell_twoside+set_ht-1 # Error code: 14 # [sbuild:29189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29189] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8d781000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29192] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29192] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex10_1_shell_twoside # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex12_1.counts not ok eps_tutorials-ex11_2 # Error code: 14 # [sbuild:29219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29219] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29219] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9f414000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29224] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29224] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex11_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex13_1.counts not ok eps_tutorials-ex12_1 # Error code: 14 # [sbuild:29249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29249] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbd85d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29252] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29252] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex12_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex13_2.counts not ok eps_tutorials-ex13_1 # Error code: 14 # [sbuild:29279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29279] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb27fb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29284] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29284] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex13_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex13_3.counts not ok eps_tutorials-ex13_2+st_matstructure-different # Error code: 14 # [sbuild:29309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29309] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f83029000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29312] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29312] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex13_2 # SKIP Command failed so no diff not ok eps_tutorials-ex13_3 # Error code: 14 # [sbuild:29340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29340] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb09e2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29344] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29344] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex13_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex13_4.counts not ok eps_tutorials-ex13_2+st_matstructure-subset # Error code: 14 # [sbuild:29356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29356] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb7bea000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29359] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29359] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex13_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex13_6.counts not ok eps_tutorials-ex13_4+eps_type-gd # Error code: 14 # [sbuild:29392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29392] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29392] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8f930000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29410] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29410] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex13_4 # SKIP Command failed so no diff not ok eps_tutorials-ex13_4+eps_type-lobpcg # Error code: 14 # [sbuild:29438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29438] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9adfd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29441] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29441] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex13_4 # SKIP Command failed so no diff not ok eps_tutorials-ex13_4+eps_type-rqcg # Error code: 14 # [sbuild:29455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29455] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb7acd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29458] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29458] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex13_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex18_1.counts not ok eps_tutorials-ex13_6 # Error code: 14 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fadb79000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29416] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:29420] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:29419] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29419] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29420] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-29416@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tutorials-ex13_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex19_1_krylovschur.counts not ok eps_tutorials-ex18_1 # Error code: 14 # [sbuild:29485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29485] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f90eea000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29488] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29488] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex18_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex19_1_lobpcg.counts not ok eps_tutorials-ex19_1_krylovschur # Error code: 14 # [sbuild:29515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29515] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9881a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29535] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29535] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex19_1_krylovschur # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex19_2.counts not ok eps_tutorials-ex19_1_lobpcg # Error code: 14 # [sbuild:29543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29543] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9f413000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29546] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29546] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex19_1_lobpcg # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_1.counts not ok eps_tutorials-ex19_2 # Error code: 14 # [sbuild:29575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29575] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29575] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9a15c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29595] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29595] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex19_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_2.counts not ok eps_tutorials-ex2_1 # Error code: 14 # [sbuild:29603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29603] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb874e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29606] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29606] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex2_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_ciss_2.counts not ok eps_tutorials-ex2_2 # Error code: 14 # [sbuild:29636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29636] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f96331000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29656] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29656] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex2_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_3.counts not ok eps_tutorials-ex2_3 # Error code: 14 # [sbuild:29698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29698] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fadd7e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29701] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29701] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex2_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_4.counts not ok eps_tutorials-ex2_ciss_2 # Error code: 14 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f94788000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29666] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29666] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29663] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:29667] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-29663@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex2_ciss_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_4_filter.counts not ok eps_tutorials-ex2_4 # Error code: 14 # [sbuild:29728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29728] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbd114000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29731] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29731] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex2_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_5.counts not ok eps_tutorials-ex2_4_filter+eps_type-krylovschur_st_filter_type-filtlan # Error code: 14 # [sbuild:29756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29756] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9c085000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29760] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29760] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex2_4_filter # SKIP Command failed so no diff not ok eps_tutorials-ex2_5 # Error code: 14 # [sbuild:29786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29786] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f98ebc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29791] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29791] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex2_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_6.counts not ok eps_tutorials-ex2_4_filter+eps_type-krylovschur_st_filter_type-chebyshev # Error code: 14 # [sbuild:29803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29803] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8a309000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29806] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29806] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex2_4_filter # SKIP Command failed so no diff not ok eps_tutorials-ex2_6 # Error code: 14 # [sbuild:29847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29847] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f96e40000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29853] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29853] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex2_6 # SKIP Command failed so no diff not ok eps_tutorials-ex2_4_filter+eps_type-subspace_st_filter_type-filtlan # Error code: 14 # [sbuild:29842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29842] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb3c27000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29850] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29850] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex2_4_filter # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_6_rel_large.counts not ok eps_tutorials-ex2_6_rel_large # Error code: 14 # [sbuild:29897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29897] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29897] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb0c27000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29900] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29900] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex2_6_rel_large # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_6_rel_small.counts not ok eps_tutorials-ex2_4_filter+eps_type-subspace_st_filter_type-chebyshev # Error code: 14 # [sbuild:29881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29881] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f93ed0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29891] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29891] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex2_4_filter # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex24_1.counts not ok eps_tutorials-ex2_6_rel_small # Error code: 14 # [sbuild:29947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29947] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9766f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29957] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29957] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
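A pattern worth noting in the failures above: every "Requested Address" is of the 0x2aaa... form, while every "Acquired Address" the kernel actually returns lies just below 0x4000000000, so the fixed-address attach apparently cannot succeed on this riscv64 worker. A quick, hedged way to compare the two ranges from a shell, using one requested/acquired pair copied from the messages above (any other pair in this log shows the same gap):

    # Convert one requested/acquired address pair from the log to decimal.
    printf 'requested: %d\n' 0x2aaab0000000
    printf 'acquired:  %d\n' 0x3fb3c27000
    # The requested base lies far above every address the kernel hands back,
    # which is why gds/shmem2 reports the mismatch and PMIx_Init then fails.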
ok eps_tutorials-ex2_6_rel_small # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex24_1_lobpcg.counts not ok eps_tutorials-ex24_1 # Error code: 14 # [sbuild:29954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29954] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f86d18000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29960] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29960] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex24_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex24_1_gd.counts not ok eps_tutorials-ex24_1_lobpcg # Error code: 14 # [sbuild:30007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30007] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb994e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30017] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30017] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex24_1_lobpcg # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex25_1_mumps.counts not ok eps_tutorials-ex24_1_gd # Error code: 14 # [sbuild:30014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30014] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbb51e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30020] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30020] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex24_1_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex29_1.counts not ok eps_tutorials-ex25_1_mumps # Error code: 14 # [sbuild:30067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30067] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fab124000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30077] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30077] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex25_1_mumps # SKIP Command failed so no diff not ok eps_tutorials-ex29_1 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex3_1.counts # [sbuild:30074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30074] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9bd53000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30080] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30080] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex29_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex3_2.counts not ok eps_tutorials-ex3_1 # Error code: 14 # [sbuild:30127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30127] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f81ffe000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30137] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30137] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex3_1 # SKIP Command failed so no diff not ok eps_tutorials-ex3_2 # Error code: 14 # [sbuild:30134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30134] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30134] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa0594000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30140] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30140] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
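Each of these blocks ends with the same hint: disable the gds/shmem2 component by setting PMIX_MCA_gds=hash. A minimal sketch of how that hint might be applied when re-running an affected tutorial by hand; only the environment variable and its value come from the log itself, while the binary name and option below are illustrative placeholders, not the harness's real invocation (which this log does not show):

    # Work around the shmem2 attach failure as the PMIx message suggests.
    export PMIX_MCA_gds=hash
    # Hypothetical re-run of one affected example (path and options are placeholders):
    mpiexec -n 1 ./ex2 -eps_nev 4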
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex30_1.counts ok eps_tutorials-ex3_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex31_1.counts not ok eps_tutorials-ex30_1 # Error code: 14 # [sbuild:30188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30188] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3faaac2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30203] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30203] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex30_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_1.counts not ok eps_tutorials-ex31_1 # Error code: 14 # [sbuild:30202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30202] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbe551000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30206] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30206] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex31_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_2.counts not ok eps_tutorials-ex34_1 # Error code: 14 # [sbuild:30250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30250] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fad71c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30263] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30263] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex34_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_3.counts not ok eps_tutorials-ex34_2+form_function_ab-0 # Error code: 14 # [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f83664000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30266] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30266] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex34_2 # SKIP Command failed so no diff not ok eps_tutorials-ex34_2+form_function_ab-1 # Error code: 14 # [sbuild:30302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30302] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbb6aa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30310] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30310] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex34_2 # SKIP Command failed so no diff not ok eps_tutorials-ex34_3 # Error code: 14 # [sbuild:30307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30307] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f933d6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30313] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30313] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex34_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_4.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_5.counts not ok eps_tutorials-ex34_5+form_function_ab-0 # Error code: 14 not ok eps_tutorials-ex34_4+form_function_ab-0 # Error code: 14 # [sbuild:30364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30364] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb2566000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30370] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30370] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:30367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30367] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbf656000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30373] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30373] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex34_4 # SKIP Command failed so no diff ok eps_tutorials-ex34_5 # SKIP Command failed so no diff not ok eps_tutorials-ex34_4+form_function_ab-1 # Error code: 14 not ok eps_tutorials-ex34_5+form_function_ab-1 # Error code: 14 # [sbuild:30401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30401] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbd149000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30406] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30406] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:30400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30400] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbe8ed000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30407] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30407] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex34_4 # SKIP Command failed so no diff ok eps_tutorials-ex34_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_6.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_7.counts not ok eps_tutorials-ex34_6+form_function_ab-0 # Error code: 14 not ok eps_tutorials-ex34_7 # Error code: 14 # [sbuild:30460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30460] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb04e9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30466] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30466] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:30461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30461] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9554f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30467] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30467] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex34_7 # SKIP Command failed so no diff ok eps_tutorials-ex34_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_8.counts not ok eps_tutorials-ex34_6+form_function_ab-1 # Error code: 14 # [sbuild:30494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30494] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb1369000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30509] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30509] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex34_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_9.counts not ok eps_tutorials-ex34_8+form_function_ab-0 # Error code: 14 # [sbuild:30511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30511] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8b574000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30514] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30514] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex34_8 # SKIP Command failed so no diff not ok eps_tutorials-ex34_9+use_custom_norm-0 # Error code: 14 not ok eps_tutorials-ex34_8+form_function_ab-1 # Error code: 14 # [sbuild:30554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30554] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f91b47000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30561] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30561] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:30555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30555] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbdd48000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30560] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30560] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex34_9 # SKIP Command failed so no diff ok eps_tutorials-ex34_8 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_10.counts not ok eps_tutorials-ex34_9+use_custom_norm-1 # Error code: 14 # [sbuild:30588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30588] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8dfb7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30603] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30603] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex34_9 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex35_1.counts not ok eps_tutorials-ex34_10+use_custom_norm-0_form_function_ab-0 # Error code: 14 # [sbuild:30605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30605] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa85f4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30608] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30608] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex34_10 # SKIP Command failed so no diff not ok eps_tutorials-ex35_1 # Error code: 14 not ok eps_tutorials-ex34_10+use_custom_norm-0_form_function_ab-1 # Error code: 14 # [sbuild:30649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30649] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb8496000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30655] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30655] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex35_1 # SKIP Command failed so no diff # [sbuild:30648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30648] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9a235000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30654] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30654] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
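
The diagnostics repeated above all describe the same failure chain: inside this unshare-based sbuild environment the PMIx gds/shmem2 component requests its shared-memory segment at a fixed base address (the 0x2aaa... values), the riscv64 kernel maps it somewhere else (the 0x3f... values), the client connection then fails with PMIX_ERR_OUT_OF_RESOURCE, and MPI_Init_thread aborts with error code 14, so the harness records "not ok ... # Error code: 14" and skips the diff. The error text quotes its own workaround; a minimal sketch of applying it by hand before re-running the affected tests (nothing below beyond the variable itself is taken from this log):

    # Workaround quoted from the PMIx error text above: replace the
    # shared-memory gds component with the hash component, which keeps
    # job data in per-process tables instead of a fixed-address mapping.
    export PMIX_MCA_gds=hash

With the variable exported for the whole test run, the fixed-address attach is never attempted, so the subsequent MPI_Init_thread aborts should disappear.
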
ok eps_tutorials-ex34_10 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex36_1.counts not ok eps_tutorials-ex34_10+use_custom_norm-1_form_function_ab-0 # Error code: 14 # [sbuild:30684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30684] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbab41000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30699] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30699] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex34_10 # SKIP Command failed so no diff not ok eps_tutorials-ex36_1 # Error code: 14 # [sbuild:30696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30696] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f97f18000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30702] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30702] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex36_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex36_2.counts not ok eps_tutorials-ex34_10+use_custom_norm-1_form_function_ab-1 # Error code: 14 # [sbuild:30720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30720] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fae098000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30744] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30744] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex34_10 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex36_3.counts not ok eps_tutorials-ex36_2+eps_power_shift_type-constant_eps_two_sided-0 # Error code: 14 # [sbuild:30746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30746] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb3a14000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30749] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30749] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex36_2 # SKIP Command failed so no diff not ok eps_tutorials-ex36_2+eps_power_shift_type-constant_eps_two_sided-1 # Error code: 14 not ok eps_tutorials-ex36_3 # Error code: 14 # [sbuild:30788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30788] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb55fd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30795] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30795] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex36_2 # SKIP Command failed so no diff # [sbuild:30790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30790] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8f0c8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30796] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30796] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex36_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex4_1.counts not ok eps_tutorials-ex36_2+eps_power_shift_type-rayleigh_eps_two_sided-0 # Error code: 14 # [sbuild:30822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30822] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9d8d5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30840] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30840] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex36_2 # SKIP Command failed so no diff not ok eps_tutorials-ex4_1 # Error code: 14 # [sbuild:30837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30837] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f98c8b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30843] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30843] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex4_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex41_1.counts not ok eps_tutorials-ex36_2+eps_power_shift_type-rayleigh_eps_two_sided-1 # Error code: 14 # [sbuild:30860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30860] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa35c9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30883] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30883] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex36_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex41_1_balance.counts not ok eps_tutorials-ex41_1+eps_type-power # Error code: 14 # [sbuild:30887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30887] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbd8e0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30890] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30890] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex41_1 # SKIP Command failed so no diff not ok eps_tutorials-ex41_1_balance+eps_balance-oneside # Error code: 14 # [sbuild:30927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30927] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbabaa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30936] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:30936] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! not ok eps_tutorials-ex41_1+eps_type-krylovschur # Error code: 14 ok eps_tutorials-ex41_1_balance # SKIP Command failed so no diff # [sbuild:30931] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30931] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30931] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30931] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30931] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30931] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30931] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9ae2a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30937] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30937] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex41_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex43_1.counts not ok eps_tutorials-ex41_1_balance+eps_balance-twoside # Error code: 14 # [sbuild:30961] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30961] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30961] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30961] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30961] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30961] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30961] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa317d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30979] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30979] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex41_1_balance # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex43_2.counts not ok eps_tutorials-ex43_1 # Error code: 14 # [sbuild:30981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30981] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb7f5a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30984] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30984] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex43_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex44_1.counts not ok eps_tutorials-ex44_1+eps_type-krylovschur # Error code: 14 # [sbuild:31042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31042] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f82485000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31045] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31045] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex44_1 # SKIP Command failed so no diff not ok eps_tutorials-ex44_1+eps_type-lyapii # Error code: 14 # [sbuild:31063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31063] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9109c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31066] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31066] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex44_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex44_2.counts not ok eps_tutorials-ex44_2+eps_type-krylovschur # Error code: 14 # [sbuild:31093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31093] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8eaa3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31096] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31096] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
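
The second recurring notice, "Open MPI requires access to a local PMIx server to execute ... or use 'mpirun' to execute the job", together with the prterun exit summary further down ("Exit code: 14", process name "prterun-sbuild-..."), suggests the MPI_Init_thread abort is a downstream consequence of the gds failure rather than a genuinely missing server, since the jobs are already launched by prterun. For a manual reproduction outside the harness, the two hints from the log can be combined; the tutorial binary name and process count below are illustrative, not taken from this log:

    # Hedged sketch only: launch one failing case under mpirun (which hosts
    # the local PMIx server) with the gds/shmem2 component disabled, as the
    # error text itself suggests.
    PMIX_MCA_gds=hash mpirun -n 2 ./ex44
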
ok eps_tutorials-ex44_2 # SKIP Command failed so no diff not ok eps_tutorials-ex43_2 # Error code: 14 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31019] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f86c76000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31038] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31037] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31038] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31037] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31019@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tutorials-ex43_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex44_3.counts not ok eps_tutorials-ex44_2+eps_type-lyapii # Error code: 14 # [sbuild:31110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31110] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fac1e7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31131] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31131] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex44_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex46_1.counts not ok eps_tutorials-ex44_3+eps_type-krylovschur # Error code: 14 # [sbuild:31138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31138] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb8b44000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31141] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31141] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex44_3 # SKIP Command failed so no diff not ok eps_tutorials-ex46_1 # Error code: 14 # [sbuild:31170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31170] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa6fc3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31185] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31185] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex46_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex46_2.counts not ok eps_tutorials-ex44_3+eps_type-lyapii # Error code: 14 # [sbuild:31183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31183] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9f922000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31188] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31188] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex44_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex47_1.counts not ok eps_tutorials-ex46_2 # Error code: 14 # [sbuild:31235] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31235] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31235] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31235] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31235] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31235] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31235] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8e6bf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31245] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31245] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex46_2 # SKIP Command failed so no diff not ok eps_tutorials-ex47_1 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_1.counts # [sbuild:31242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31242] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9effd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31248] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31248] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex47_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_1_jd.counts not ok eps_tutorials-ex49_1 # Error code: 14 # [sbuild:31295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31295] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa5af7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31305] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31305] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex49_1 # SKIP Command failed so no diff not ok eps_tutorials-ex49_1_jd # Error code: 14 # [sbuild:31302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31302] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb8400000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31308] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31308] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_1_lobpcg.counts ok eps_tutorials-ex49_1_jd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_2.counts not ok eps_tutorials-ex49_1_lobpcg # Error code: 14 # [sbuild:31356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31356] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa6ebe000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31365] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31365] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex49_1_lobpcg # SKIP Command failed so no diff not ok eps_tutorials-ex49_2 # Error code: 14 # [sbuild:31362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31362] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa7e32000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31368] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31368] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_2_nost.counts ok eps_tutorials-ex49_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_2_par.counts not ok eps_tutorials-ex49_2_nost # Error code: 14 # [sbuild:31416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31416] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa712b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31425] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31425] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
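Note: every failing test above ends with the same PMIx hint: disable the gds/shmem2 component by setting PMIX_MCA_gds=hash in the environment. A minimal sketch of that workaround for a manual re-run of one of the failing examples follows; the binary name and mpirun options are illustrative assumptions, not commands taken from this log.

    # disable the gds/shmem2 component, as the PMIx message above suggests
    export PMIX_MCA_gds=hash
    # illustrative re-run of a failing tutorial example under mpirun
    mpirun -n 2 ./ex5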
ok eps_tutorials-ex49_2_nost # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex5_1.counts not ok eps_tutorials-ex49_2_par # Error code: 14 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8b4db000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31422] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31429] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31428] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31429] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-31422@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex49_2_par # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex5_2.counts not ok eps_tutorials-ex5_2 # Error code: 14 # [sbuild:31493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31493] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb676a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31496] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31496] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex5_2 # SKIP Command failed so no diff not ok eps_tutorials-ex5_1+eps_two_sided-0_eps_krylovschur_locking-0 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex55_1_real.counts # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fafdb1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31464] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31460] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31463] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31463] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31464] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31460@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex5_1 # SKIP Command failed so no diff not ok eps_tutorials-ex5_1+eps_two_sided-0_eps_krylovschur_locking-1 # Error code: 14 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31530] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9fc18000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31540] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31538] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31538] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31540] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31530@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex5_1 # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-shao_nest-0_nsize-1 # Error code: 14 # [sbuild:31535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31535] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9d342000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31542] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31542] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-shao_nest-0_nsize-2 # Error code: 14 # [sbuild:31572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31572] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f82613000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31579] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31579] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-31572@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:31572] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 102 ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex5_1+eps_two_sided-1_eps_krylovschur_locking-0 # Error code: 14 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb9a77000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31578] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31566] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31575] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31578] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-31566@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex5_1 # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-shao_nest-1_nsize-1 # Error code: 14 # [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa47ef000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31617] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31617] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-shao_nest-1_nsize-2 # Error code: 14 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb8959000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31637] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31641] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31640] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31641] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31640] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-31637@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex5_1+eps_two_sided-1_eps_krylovschur_locking-1 # Error code: 14 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8f18d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31618] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31612] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31619] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31618] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31619] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31612@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex5_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex55_1_real_sinvert.counts not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-gruning_nest-0_nsize-1 # Error code: 14 # [sbuild:31657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31657] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa9230000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31671] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31671] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real_sinvert+eps_krylovschur_bse_type-shao # Error code: 14 # [sbuild:31685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31685] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31685] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa76e7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31688] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31688] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex55_1_real_sinvert # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real_sinvert+eps_krylovschur_bse_type-gruning # Error code: 14 # [sbuild:31720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31720] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb0a1b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31723] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31723] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex55_1_real_sinvert # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real_sinvert+eps_krylovschur_bse_type-projectedbse # Error code: 14 # [sbuild:31741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31741] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb5d87000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31744] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31744] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex55_1_real_sinvert # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex55_1_real_sinvert_scalapack.counts not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-gruning_nest-0_nsize-2 # Error code: 14 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31702] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8f7cb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31712] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31713] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31712] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31713] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31702@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-gruning_nest-1_nsize-1 # Error code: 14 # [sbuild:31797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31797] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f94520000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31800] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31800] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real_sinvert_scalapack+eps_krylovschur_bse_type-shao # Error code: 14 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb22ac000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31771] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31777] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31775] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31774] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31776] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31775] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31776] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31774] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31777] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31771@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex55_1_real_sinvert_scalapack # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-gruning_nest-1_nsize-2 # Error code: 14 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f93046000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31832] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31819] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31836] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31832] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31836] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-31819@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real_sinvert_scalapack+eps_krylovschur_bse_type-gruning # Error code: 14 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f7fdbe000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31831] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31833] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31835] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file 
../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31826] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31834] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31835] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31833] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31831] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31834] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-31826@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex55_1_real_sinvert_scalapack # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-projectedbse_nest-0_nsize-1 # Error code: 14 # [sbuild:31860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31860] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8f2da000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31875] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31875] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real_sinvert_scalapack+eps_krylovschur_bse_type-projectedbse # Error code: 14 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8e5a2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31879] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31880] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31879] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31873] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31878] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31880] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31873@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex55_1_real_sinvert_scalapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex55_2_real.counts not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-projectedbse_nest-0_nsize-2 # Error code: 14 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb8fb0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31903] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31926] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31924] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31924] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31926] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated.
The first process to do so was: # # Process name: [prterun-sbuild-31903@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_2_real+eps_ncv-10_eps_krylovschur_bse_type-shao # Error code: 14 # [sbuild:31930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31930] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa0cd4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31933] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31933] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-projectedbse_nest-1_nsize-1 # Error code: 14 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9718c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31966] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31966] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_2_real+eps_ncv-10_eps_krylovschur_bse_type-gruning # Error code: 14 # [sbuild:31965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31965] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f831e3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31969] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31969] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_2_real+eps_ncv-10_eps_krylovschur_bse_type-projectedbse # Error code: 14 # [sbuild:31997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31997] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31997] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f952f2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32004] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32004] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_2_real+eps_ncv-24_eps_krylovschur_bse_type-shao # Error code: 14 # [sbuild:32022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32022] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb6a98000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32025] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32025] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-projectedbse_nest-1_nsize-2 # Error code: 14 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbbe7d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32000] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31985] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:32001] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32000] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32001] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31985@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex55_3.counts not ok eps_tutorials-ex55_2_real+eps_ncv-24_eps_krylovschur_bse_type-gruning # Error code: 14 # [sbuild:32039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32039] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32039] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fadcf6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32062] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32062] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_3 # Error code: 14 # [sbuild:32067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32067] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8c912000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32070] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32070] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex55_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex56_1.counts not ok eps_tutorials-ex55_2_real+eps_ncv-24_eps_krylovschur_bse_type-projectedbse # Error code: 14 # [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f988d7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32100] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32100] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex56_1_nhep.counts not ok eps_tutorials-ex56_1+nsize-1 # Error code: 14 # [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbb00e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32117] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32117] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex56_1 # SKIP Command failed so no diff not ok eps_tutorials-ex56_1_nhep+nsize-1 # Error code: 14 # [sbuild:32146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32146] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb5217000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32160] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32160] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex56_1_nhep # SKIP Command failed so no diff not ok eps_tutorials-ex56_1+nsize-2 # Error code: 14 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f83817000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32161] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:32165] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:32164] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32164] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32165] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-32161@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:32161] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171 # ok eps_tutorials-ex56_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex57_1.counts not ok eps_tutorials-ex56_1_nhep+nsize-2 # Error code: 14 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f89b78000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32183] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:32186] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:32187] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32187] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32186] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-32183@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tutorials-ex56_1_nhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex57_1_nhep.counts not ok eps_tutorials-ex57_1+nsize-1 # Error code: 14 # [sbuild:32216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32216] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb04a1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32219] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32219] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex57_1 # SKIP Command failed so no diff not ok eps_tutorials-ex57_1_nhep+nsize-1 # Error code: 14 # [sbuild:32248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32248] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8d197000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32261] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32261] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex57_1_nhep # SKIP Command failed so no diff not ok eps_tutorials-ex57_1_nhep+nsize-2 # Error code: 14 not ok eps_tutorials-ex57_1+nsize-2 # Error code: 14 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb94c9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32283] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:32286] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:32287] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32286] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:32287] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-32283@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tutorials-ex57_1_nhep # SKIP Command failed so no diff # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32258] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9f698000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32264] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:32265] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32265] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32264] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-32258@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex57_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex7_1.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex57_2.counts not ok eps_tutorials-ex7_1 # Error code: 14 not ok eps_tutorials-ex57_2 # Error code: 14 # [sbuild:32341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32341] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f81156000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32346] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32346] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
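A note on the recurring advisory above: every MPI-based test in this run fails the same way, the PMIx gds/shmem2 component cannot map its shared-memory segment at the requested base address in this build environment, and each client process then aborts in MPI_Init_thread. The advisory names its own workaround, falling back to the hash component. A minimal sketch of applying that suggestion when re-running the SLEPc tests follows; only the PMIX_MCA_gds=hash setting comes from the log itself, the test invocation is an assumption for illustration.

    # Sketch only: force PMIx to use the hash datastore instead of gds/shmem2,
    # as the advisory above suggests. The 'make test' target is an assumed way
    # of re-running the SLEPc test suite and is not taken from this log.
    export PMIX_MCA_gds=hash
    make test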
ok eps_tutorials-ex7_1 # SKIP Command failed so no diff # [sbuild:32340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32340] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32340] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3faa00c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32347] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32347] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex57_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex7_3.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_1.counts not ok eps_tutorials-ex9_1+eps_two_sided-0_eps_type-krylovschur # Error code: 14 not ok eps_tutorials-ex7_3 # Error code: 14 # [sbuild:32401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32401] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3faf1cf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32406] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32406] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:32399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32399] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8a90c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32407] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32407] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex7_3 # SKIP Command failed so no diff ok eps_tutorials-ex9_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_2.counts not ok eps_tutorials-ex9_1+eps_two_sided-0_eps_type-lapack # Error code: 14 # [sbuild:32435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32435] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f822ff000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32449] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32449] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex9_1 # SKIP Command failed so no diff not ok eps_tutorials-ex9_2 # Error code: 14 # [sbuild:32451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32451] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa310f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32454] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32454] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tutorials-ex9_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_3.counts not ok eps_tutorials-ex9_1+eps_two_sided-1_eps_type-krylovschur # Error code: 14 # [sbuild:32470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32470] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3faa62d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32494] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32494] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex9_1 # SKIP Command failed so no diff not ok eps_tutorials-ex9_3 # Error code: 14 # [sbuild:32498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32498] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3faf5b2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32501] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32501] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex9_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_4.counts not ok eps_tutorials-ex9_1+eps_two_sided-1_eps_type-lapack # Error code: 14 # [sbuild:32517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32517] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32517] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbe80f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32532] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32532] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex9_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_5.counts not ok eps_tutorials-ex9_4 # Error code: 14 # [sbuild:32545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32545] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3faa381000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32548] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32548] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex9_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_7.counts not ok eps_tutorials-ex9_5 # Error code: 14 # [sbuild:32577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32577] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fab41d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32593] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32593] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex9_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_8.counts not ok eps_tutorials-ex9_7 # Error code: 14 # [sbuild:32605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32605] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8223c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32608] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32608] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex9_7 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_9.counts not ok eps_tutorials-ex9_8 # Error code: 14 # [sbuild:32637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32637] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8b1b5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32658] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32658] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex9_8 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test1_1.counts not ok eps_tutorials-ex9_9 # Error code: 14 # [sbuild:32665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32665] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fadde6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32668] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32668] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex9_9 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test1_1_cross_gd.counts not ok svd_tests-test1_1+type-lanczos # Error code: 14 # [sbuild:32698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32698] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8a2f2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32718] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:32718] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test1_1 # SKIP Command failed so no diff not ok svd_tests-test1_1_cross_gd # Error code: 14 # [sbuild:32725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32725] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb2b36000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32728] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32728] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test1_1_cross_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test1_1_cyclic_gd.counts not ok svd_tests-test1_1+type-trlanczos # Error code: 14 # [sbuild:32742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32742] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f91be0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32747] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32747] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test1_1 # SKIP Command failed so no diff not ok svd_tests-test1_1_cyclic_gd # Error code: 14 # [sbuild:32772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32772] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbc4ea000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32775] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32775] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test1_1_cyclic_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test10_1.counts not ok svd_tests-test1_1+type-cross # Error code: 14 # [sbuild:32789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32789] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8643a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32792] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32792] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test1_1 # SKIP Command failed so no diff not ok svd_tests-test10_1 # Error code: 14 # [sbuild:32819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32819] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32819] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f958c0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32824] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32824] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test10_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test10_2.counts not ok svd_tests-test1_1+type-cyclic # Error code: 14 # [sbuild:32836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32836] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f85f33000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32839] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32839] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test1_1 # SKIP Command failed so no diff not ok svd_tests-test10_2 # Error code: 14 # [sbuild:32868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32868] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f84036000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32883] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32883] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test10_2 # SKIP Command failed so no diff not ok svd_tests-test1_1+type-lapack # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test10_3.counts # [sbuild:32880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32880] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9310b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32886] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32886] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test11_1.counts not ok svd_tests-test10_3 # Error code: 14 # [sbuild:32934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32934] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f96685000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32943] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32943] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test10_3 # SKIP Command failed so no diff not ok svd_tests-test11_1 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test12_1.counts # [sbuild:32940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32940] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fae443000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32946] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32946] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test11_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_1.counts not ok svd_tests-test12_1 # Error code: 14 # [sbuild:32993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32993] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fadf39000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33003] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33003] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test12_1 # SKIP Command failed so no diff not ok svd_tests-test14_1+svd_type-lanczos # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_1_cross.counts # [sbuild:33000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33000] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33000] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f94fa5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33006] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33006] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test14_1 # SKIP Command failed so no diff not ok svd_tests-test14_1+svd_type-trlanczos # Error code: 14 # [sbuild:33040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33040] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa08f1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33050] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33050] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test14_1 # SKIP Command failed so no diff not ok svd_tests-test14_1_cross+svd_cross_explicitmatrix-0 # Error code: 14 # [sbuild:33047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33047] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33047] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f815ca000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33053] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33053] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test14_1_cross # SKIP Command failed so no diff not ok svd_tests-test14_1+svd_type-lapack # Error code: 14 not ok svd_tests-test14_1_cross+svd_cross_explicitmatrix-1 # Error code: 14 # [sbuild:33071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33071] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9cb34000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33086] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33086] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test14_1 # SKIP Command failed so no diff # [sbuild:33081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33081] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8b848000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33087] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33087] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test14_1_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_1_cyclic.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_2.counts not ok svd_tests-test14_1_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14 not ok svd_tests-test14_2+svd_type-lanczos # Error code: 14 # [sbuild:33140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33140] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa4e8d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33146] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33146] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
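The second hint repeated in each block, that Open MPI needs a local PMIx server or should be started through "mpirun", describes the other way these cases could be exercised: let mpirun provide the PMIx server that the direct, singleton launch fails to set up. A sketch with a purely illustrative executable name and option (the actual test binaries and arguments are not shown in this log):

    # Launching through mpirun supplies the PMIx server that the
    # singleton launch above could not obtain for itself.
    mpirun -n 1 ./test1 -svd_type trlanczos    # illustrative only

This only restates the advice embedded in the error text; it does not address why the shmem2 component keeps being handed a different base address on this worker.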
# -------------------------------------------------------------------------- # # [sbuild:33141] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33141] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33141] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33141] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33141] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33141] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33141] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fabc22000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33147] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33147] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test14_1_cyclic # SKIP Command failed so no diff ok svd_tests-test14_2 # SKIP Command failed so no diff not ok svd_tests-test14_1_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14 not ok svd_tests-test14_2+svd_type-trlanczos # Error code: 14 # [sbuild:33174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33174] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3faead9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33181] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33181] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test14_1_cyclic # SKIP Command failed so no diff # [sbuild:33175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33175] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f885a1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33180] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33180] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test14_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_2_cross.counts not ok svd_tests-test14_2+svd_type-lapack # Error code: 14 # [sbuild:33209] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33209] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33209] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33209] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33209] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33209] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33209] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f97aa6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33223] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33223] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test14_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_2_cyclic.counts not ok svd_tests-test14_2_cross+svd_cross_explicitmatrix-0 # Error code: 14 # [sbuild:33225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33225] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33225] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa520c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33228] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33228] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test14_2_cross # SKIP Command failed so no diff not ok svd_tests-test14_2_cross+svd_cross_explicitmatrix-1 # Error code: 14 not ok svd_tests-test14_2_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14 # [sbuild:33269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33269] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f86008000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33275] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33275] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa099d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33274] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33274] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test14_2_cross # SKIP Command failed so no diff ok svd_tests-test14_2_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test15_1.counts not ok svd_tests-test14_2_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14 # [sbuild:33302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33302] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9ff24000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33317] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33317] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test14_2_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test15_2.counts not ok svd_tests-test15_1+svd_trlanczos_gbidiag-single # Error code: 14 # [sbuild:33319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33319] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f90045000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33322] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33322] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test15_1 # SKIP Command failed so no diff not ok svd_tests-test15_2 # Error code: 14 not ok svd_tests-test15_1+svd_trlanczos_gbidiag-upper # Error code: 14 # [sbuild:33361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33361] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbeaed000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33368] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33368] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:33363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33363] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8df16000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33369] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33369] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
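Note on the failure mode repeated throughout this test run: the PMIx error text above names its own workaround, disabling the gds/shmem2 component by setting PMIX_MCA_gds=hash in the environment. A minimal sketch of applying that before re-running the failing tests by hand; the actual test command is a placeholder and is not taken from this log:

  # Force PMIx to fall back to the hash component instead of gds/shmem2,
  # exactly as suggested by the error message above.
  export PMIX_MCA_gds=hash
  <command that re-runs the SLEPc test suite>   # placeholder, not from this log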
ok svd_tests-test15_1 # SKIP Command failed so no diff ok svd_tests-test15_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test15_3.counts not ok svd_tests-test15_1+svd_trlanczos_gbidiag-lower # Error code: 14 # [sbuild:33396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33396] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f84cde000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33413] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33413] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test15_1 # SKIP Command failed so no diff not ok svd_tests-test15_3 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test15_4.counts # [sbuild:33410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33410] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9c0bb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33416] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33416] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test15_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test16_1_lapack.counts not ok svd_tests-test15_4 # Error code: 14 # [sbuild:33463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33463] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa88f6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33473] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33473] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test15_4 # SKIP Command failed so no diff not ok svd_tests-test16_1_lapack # Error code: 14 # [sbuild:33470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33470] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f91b5a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33476] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33476] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test16_1_lapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test16_1_cross.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test16_1_cyclic.counts not ok svd_tests-test16_1_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14 not ok svd_tests-test16_1_cross+svd_cross_explicitmatrix-0 # Error code: 14 # [sbuild:33527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33527] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa7277000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33534] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33534] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test16_1_cross # SKIP Command failed so no diff # [sbuild:33530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33530] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f964d8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33536] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33536] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test16_1_cyclic # SKIP Command failed so no diff not ok svd_tests-test16_1_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14 not ok svd_tests-test16_1_cross+svd_cross_explicitmatrix-1 # Error code: 14 # [sbuild:33562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33562] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9c00a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33569] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33569] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:33564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33564] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb2ce2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33570] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33570] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test16_1_cross # SKIP Command failed so no diff ok svd_tests-test16_1_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test16_1_trlanczos.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test16_1_trlanczos_par.counts not ok svd_tests-test16_1_trlanczos+svd_trlanczos_gbidiag-single # Error code: 14 # [sbuild:33623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33623] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9c700000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33629] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33629] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test16_1_trlanczos # SKIP Command failed so no diff not ok svd_tests-test16_1_trlanczos+svd_trlanczos_gbidiag-lower # Error code: 14 # [sbuild:33649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33649] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33649] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa4676000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33652] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33652] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test16_1_trlanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test18_1.counts not ok svd_tests-test16_1_trlanczos_par+ds_parallel-redundant # Error code: 14 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33624] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f934cc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33630] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:33631] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33631] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:33630] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-33624@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok svd_tests-test16_1_trlanczos_par # SKIP Command failed so no diff not ok svd_tests-test18_1+svd_type-lapack # Error code: 14 # [sbuild:33679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33679] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa0d18000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33694] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33694] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test18_1 # SKIP Command failed so no diff not ok svd_tests-test16_1_trlanczos_par+ds_parallel-synchronized # Error code: 14 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9e776000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33693] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:33697] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:33698] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33697] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33698] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-33693@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok svd_tests-test16_1_trlanczos_par # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test18_1_trlanczos.counts not ok svd_tests-test18_1+svd_type-cross # Error code: 14 # [sbuild:33716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33716] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9f5d2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33732] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33732] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test18_1 # SKIP Command failed so no diff not ok svd_tests-test18_1_trlanczos+svd_trlanczos_gbidiag-single # Error code: 14 # [sbuild:33744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33744] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9bd29000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33747] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33747] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test18_1_trlanczos # SKIP Command failed so no diff not ok svd_tests-test18_1+svd_type-cyclic # Error code: 14 # [sbuild:33761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33761] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8effe000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33778] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33778] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test18_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test18_2.counts not ok svd_tests-test18_1_trlanczos+svd_trlanczos_gbidiag-upper # Error code: 14 # [sbuild:33777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33777] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3faad5d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33781] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33781] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test18_1_trlanczos # SKIP Command failed so no diff not ok svd_tests-test18_1_trlanczos+svd_trlanczos_gbidiag-lower # Error code: 14 not ok svd_tests-test18_2 # Error code: 14 # [sbuild:33822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33822] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33822] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa62e3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33827] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33827] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test18_2 # SKIP Command failed so no diff # [sbuild:33820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33820] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f93bc4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33828] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33828] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test18_1_trlanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test19_1.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test2_1.counts ok svd_tests-test2_1 # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test20_1.counts not ok svd_tests-test19_1 # Error code: 14 # [sbuild:33882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33882] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fadb03000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33896] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33896] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test19_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_lanczos.counts not ok svd_tests-test20_1 # Error code: 14 # [sbuild:33898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33898] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f93d90000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33901] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33901] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test20_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_lanczos_one.counts not ok svd_tests-test3_1_lanczos # Error code: 14 # [sbuild:33943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33943] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb1965000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33958] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33958] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_1_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_trlanczos.counts not ok svd_tests-test3_1_lanczos_one # Error code: 14 # [sbuild:33957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33957] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa2488000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33961] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33961] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test3_1_lanczos_one # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_trlanczos_one.counts not ok svd_tests-test3_1_trlanczos+svd_trlanczos_locking-0 # Error code: 14 # [sbuild:34005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34005] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb8d05000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34018] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34018] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test3_1_trlanczos # SKIP Command failed so no diff not ok svd_tests-test3_1_trlanczos_one # Error code: 14 # [sbuild:34017] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34017] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34017] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34017] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34017] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34017] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34017] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f86ab4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34021] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34021] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test3_1_trlanczos_one # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_trlanczos_one_mgs.counts not ok svd_tests-test3_1_trlanczos+svd_trlanczos_locking-1 # Error code: 14 # [sbuild:34037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34037] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fad9cb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34057] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34057] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test3_1_trlanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_trlanczos_one_always.counts not ok svd_tests-test3_1_trlanczos_one_mgs # Error code: 14 # [sbuild:34065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34065] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8a730000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34068] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34068] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test3_1_trlanczos_one_mgs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_cross.counts not ok svd_tests-test3_1_trlanczos_one_always # Error code: 14 # [sbuild:34098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34098] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb61f8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34118] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34118] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_1_trlanczos_one_always # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_cross_exp.counts not ok svd_tests-test3_1_cross # Error code: 14 # [sbuild:34125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34125] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb9c3e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34128] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34128] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_1_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_cyclic.counts not ok svd_tests-test3_1_cross_exp # Error code: 14 # [sbuild:34160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34160] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9c750000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34179] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34179] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_1_cross_exp # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_cyclic_exp.counts not ok svd_tests-test3_1_cyclic # Error code: 14 # [sbuild:34185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34185] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f879ae000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34188] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34188] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_1_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_lapack.counts not ok svd_tests-test3_1_cyclic_exp # Error code: 14 # [sbuild:34221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34221] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9b407000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34239] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34239] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_1_cyclic_exp # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_randomized.counts not ok svd_tests-test3_1_lapack # Error code: 14 # [sbuild:34245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34245] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9eb83000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34248] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34248] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test3_1_lapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_lanczos.counts not ok svd_tests-test3_1_randomized # Error code: 14 # [sbuild:34281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34281] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbf06d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34298] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34298] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test3_1_randomized # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_lanczos_one.counts not ok svd_tests-test3_2_lanczos # Error code: 14 # [sbuild:34305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34305] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa5954000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34308] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34308] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_2_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_trlanczos.counts not ok svd_tests-test3_2_lanczos_one # Error code: 14 # [sbuild:34341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34341] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34341] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa4d8d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34359] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34359] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_2_lanczos_one # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_trlanczos_one.counts not ok svd_tests-test3_2_trlanczos # Error code: 14 # [sbuild:34365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34365] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f7fe5b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34368] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34368] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test3_2_trlanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_trlanczos_one_mgs.counts not ok svd_tests-test3_2_trlanczos_one # Error code: 14 not ok svd_tests-test3_2_trlanczos_one_mgs # Error code: 14 # [sbuild:34422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34422] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa7c40000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34428] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34428] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:34415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34415] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb1f42000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34425] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34425] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_2_trlanczos_one # SKIP Command failed so no diff ok svd_tests-test3_2_trlanczos_one_mgs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_trlanczos_one_always.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_cross.counts not ok svd_tests-test3_2_cross # Error code: 14 not ok svd_tests-test3_2_trlanczos_one_always # Error code: 14 # [sbuild:34481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34481] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8f7c6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34488] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34488] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:34482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34482] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34482] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb78f2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34487] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34487] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test3_2_trlanczos_one_always # SKIP Command failed so no diff ok svd_tests-test3_2_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_cross_exp.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_cyclic.counts not ok svd_tests-test3_2_cyclic # Error code: 14 not ok svd_tests-test3_2_cross_exp # Error code: 14 # [sbuild:34542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34542] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f881f5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34547] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34547] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:34541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34541] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34541] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa4bab000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34548] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34548] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_2_cyclic # SKIP Command failed so no diff ok svd_tests-test3_2_cross_exp # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_lapack.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_randomized.counts not ok svd_tests-test3_2_lapack # Error code: 14 not ok svd_tests-test3_2_randomized # Error code: 14 # [sbuild:34601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34601] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f95452000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34608] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34608] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test3_2_lapack # SKIP Command failed so no diff # [sbuild:34602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34602] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8b15f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34607] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34607] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test3_2_randomized # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_4.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_5.counts not ok svd_tests-test3_4 # Error code: 14 not ok svd_tests-test3_5 # Error code: 14 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34660] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f866b3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34669] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:34668] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34668] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34669] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-34660@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok svd_tests-test3_4 # SKIP Command failed so no diff # [sbuild:34662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34662] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34662] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa625a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34667] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34667] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test3_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_lanczos.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_randomized.counts not ok svd_tests-test4_1_lanczos # Error code: 14 not ok svd_tests-test4_1_randomized # Error code: 14 # [sbuild:34724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34724] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9a858000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34731] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34731] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test4_1_lanczos # SKIP Command failed so no diff # [sbuild:34725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34725] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8140d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34729] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34729] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test4_1_randomized # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_trlanczos.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cross.counts not ok svd_tests-test4_1_trlanczos # Error code: 14 not ok svd_tests-test4_1_cross # Error code: 14 # [sbuild:34784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34784] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f98f81000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34790] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34790] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:34785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34785] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb2098000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34791] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34791] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test4_1_trlanczos # SKIP Command failed so no diff ok svd_tests-test4_1_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cross_exp.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cross_exp_imp.counts not ok svd_tests-test4_1_cross_exp # Error code: 14 not ok svd_tests-test4_1_cross_exp_imp # Error code: 14 # [sbuild:34843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34843] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbaf72000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34851] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34851] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:34845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34845] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34845] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f838b1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34850] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34850] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test4_1_cross_exp # SKIP Command failed so no diff ok svd_tests-test4_1_cross_exp_imp # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cyclic.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cyclic_imp.counts not ok svd_tests-test4_1_cyclic # Error code: 14 not ok svd_tests-test4_1_cyclic_imp # Error code: 14 # [sbuild:34904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34904] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb46ec000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34911] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34911] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test4_1_cyclic # SKIP Command failed so no diff # [sbuild:34905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34905] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa2240000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34910] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34910] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test4_1_cyclic_imp # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cyclic_exp.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_lapack.counts not ok svd_tests-test4_1_cyclic_exp # Error code: 14 not ok svd_tests-test4_1_lapack # Error code: 14 # [sbuild:34965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34965] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34965] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb3a86000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34970] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34970] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test4_1_cyclic_exp # SKIP Command failed so no diff # [sbuild:34964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34964] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8f809000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34971] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34971] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test4_1_lapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_scalapack.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_3.counts not ok svd_tests-test4_1_scalapack # Error code: 14 # [sbuild:35024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35024] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35024] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f91a4c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35030] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35030] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test4_1_scalapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test5_1.counts not ok svd_tests-test5_1 # Error code: 14 # [sbuild:35063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35063] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35063] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa0779000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35066] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35066] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test5_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test5_2.counts not ok svd_tests-test4_3 # Error code: 14 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f8d75e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35025] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:35031] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:35032] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35031] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35032] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-35025@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok svd_tests-test4_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test5_3.counts not ok svd_tests-test5_2 # Error code: 14 # [sbuild:35111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35111] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35111] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbd8ed000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35121] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35121] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test5_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test5_4.counts not ok svd_tests-test5_3 # Error code: 14 # [sbuild:35118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35118] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa7f35000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35124] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35124] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test5_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test6_1_subspace.counts not ok svd_tests-test5_4 # Error code: 14 # [sbuild:35171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35171] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb627b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35181] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35181] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test5_4 # SKIP Command failed so no diff not ok svd_tests-test6_1_subspace # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test6_1_lobpcg.counts # [sbuild:35178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35178] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8be69000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35184] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35184] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test6_1_subspace # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test7_1.counts not ok svd_tests-test6_1_lobpcg # Error code: 14 # [sbuild:35231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35231] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f899cf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35241] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35241] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test6_1_lobpcg # SKIP Command failed so no diff not ok svd_tests-test7_1 # Error code: 14 # [sbuild:35238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35238] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f866ad000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35244] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35244] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test7_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test8_1.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test9_1.counts not ok svd_tests-test8_1+svd_type-lanczos # Error code: 14 # [sbuild:35292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35292] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35292] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa5f43000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35301] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35301] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test8_1 # SKIP Command failed so no diff not ok svd_tests-test9_1+svd_type-lanczos # Error code: 14 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9abdb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35304] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35304] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test9_1 # SKIP Command failed so no diff not ok svd_tests-test8_1+svd_type-trlanczos # Error code: 14 # [sbuild:35324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35324] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa2b4e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35335] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35335] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test8_1 # SKIP Command failed so no diff not ok svd_tests-test9_1+svd_type-trlanczos # Error code: 14 # [sbuild:35332] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35332] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35332] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35332] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35332] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35332] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35332] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb60e7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35338] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35338] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test9_1 # SKIP Command failed so no diff not ok svd_tests-test8_1+svd_type-cross # Error code: 14 not ok svd_tests-test9_1+svd_type-cross # Error code: 14 # [sbuild:35366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35366] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbc558000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35371] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35371] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa7b9f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35372] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35372] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test8_1 # SKIP Command failed so no diff ok svd_tests-test9_1 # SKIP Command failed so no diff not ok svd_tests-test9_1+svd_type-cyclic # Error code: 14 not ok svd_tests-test8_1+svd_type-cyclic # Error code: 14 # [sbuild:35400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35400] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb34d1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35405] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35405] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:35399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35399] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa92df000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35406] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35406] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test9_1 # SKIP Command failed so no diff ok svd_tests-test8_1 # SKIP Command failed so no diff not ok svd_tests-test9_1+svd_type-lapack # Error code: 14 # [sbuild:35433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35433] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35433] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8ba1e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35439] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35439] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # not ok svd_tests-test8_1+svd_type-lapack # Error code: 14 ok svd_tests-test9_1 # SKIP Command failed so no diff # [sbuild:35434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35434] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fac901000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35440] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35440] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test8_1 # SKIP Command failed so no diff not ok svd_tests-test9_1+svd_type-randomized # Error code: 14 not ok svd_tests-test8_1+svd_type-randomized # Error code: 14 # [sbuild:35464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35464] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbf276000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35473] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35473] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:35468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35468] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8936d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35474] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35474] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test9_1 # SKIP Command failed so no diff ok svd_tests-test8_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_1.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_1_scalapack.counts not ok svd_tutorials-ex14_1+svd_type-trlanczos # Error code: 14 not ok svd_tutorials-ex14_1_scalapack+nsize-1 # Error code: 14 # [sbuild:35528] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35528] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35528] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35528] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35528] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35528] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35528] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f962f8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35533] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35533] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb88ca000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35534] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35534] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex14_1 # SKIP Command failed so no diff ok svd_tutorials-ex14_1_scalapack # SKIP Command failed so no diff not ok svd_tutorials-ex14_1+svd_type-lanczos # Error code: 14 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbe782000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35567] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35567] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex14_1 # SKIP Command failed so no diff not ok svd_tutorials-ex14_1+svd_type-randomized # Error code: 14 # [sbuild:35587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35587] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9672a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35590] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35590] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex14_1 # SKIP Command failed so no diff not ok svd_tutorials-ex14_1_scalapack+nsize-2 # Error code: 14 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa38e8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35561] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:35569] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:35568] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35569] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35568] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-35561@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok svd_tutorials-ex14_1_scalapack # SKIP Command failed so no diff not ok svd_tutorials-ex14_1+svd_type-cross # Error code: 14 # [sbuild:35604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35604] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f884f4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35618] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35618] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tutorials-ex14_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_2.counts not ok svd_tutorials-ex14_1_scalapack+nsize-3 # Error code: 14 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbda54000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35619] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:35623] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:35624] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:35622] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # 
[sbuild:35623] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35622] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35624] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-35619@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok svd_tutorials-ex14_1_scalapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_2_cross.counts not ok svd_tutorials-ex14_2 # Error code: 14 # [sbuild:35675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35675] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8fecc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35685] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35685] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex14_2 # SKIP Command failed so no diff not ok svd_tutorials-ex14_2_cross # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_3.counts # [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbb479000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35688] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35688] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex14_2_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_4.counts not ok svd_tutorials-ex14_3 # Error code: 14 # [sbuild:35735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35735] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fab7c1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35745] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35745] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tutorials-ex14_3 # SKIP Command failed so no diff not ok svd_tutorials-ex14_4+svd_ncv-26_svd_type-trlanczos # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex15_1.counts # [sbuild:35742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35742] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fafc80000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35748] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35748] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex14_4 # SKIP Command failed so no diff not ok svd_tutorials-ex14_4+svd_ncv-26_svd_type-lanczos # Error code: 14 # [sbuild:35782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35782] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35782] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa4cfb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35792] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35792] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex14_4 # SKIP Command failed so no diff not ok svd_tutorials-ex15_1 # Error code: 14 # [sbuild:35789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35789] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8b2f7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35795] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35795] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex15_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex15_1_scalapack.counts not ok svd_tutorials-ex14_4+svd_ncv-26_svd_type-cross # Error code: 14 # [sbuild:35816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35816] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8e3d8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35839] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35839] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex14_4 # SKIP Command failed so no diff not ok svd_tutorials-ex15_1_scalapack+nsize-1 # Error code: 14 # [sbuild:35838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35838] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8d35d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35842] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35842] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex15_1_scalapack # SKIP Command failed so no diff not ok svd_tutorials-ex14_4+svd_ncv-12_svd_type-trlanczos # Error code: 14 # [sbuild:35858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35858] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f86390000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35873] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35873] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex14_4 # SKIP Command failed so no diff not ok svd_tutorials-ex15_1_scalapack+nsize-2 # Error code: 14 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f97a0b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35871] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:35876] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:35877] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35876] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35877] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-35871@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok svd_tutorials-ex15_1_scalapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_1.counts not ok svd_tutorials-ex14_4+svd_ncv-12_svd_type-lanczos # Error code: 14 # [sbuild:35895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35895] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35895] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9a093000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35916] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35916] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex14_4 # SKIP Command failed so no diff not ok svd_tutorials-ex45_1 # Error code: 14 # [sbuild:35923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35923] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa71fb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35926] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35926] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_2.counts not ok svd_tutorials-ex14_4+svd_ncv-12_svd_type-cross # Error code: 14 # [sbuild:35942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35942] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35942] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8724f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35956] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35956] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex14_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_3.counts not ok svd_tutorials-ex45_2 # Error code: 14 # [sbuild:35970] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35970] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35970] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35970] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35970] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35970] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35970] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa57f3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35973] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35973] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_4.counts not ok svd_tutorials-ex45_3 # Error code: 14 # [sbuild:36002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36002] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f95d47000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36018] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36018] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_5.counts not ok svd_tutorials-ex45_4 # Error code: 14 # [sbuild:36030] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36030] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36030] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36030] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36030] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36030] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36030] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f843d2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36033] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36033] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_5_cross.counts not ok svd_tutorials-ex45_5+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36062] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f95737000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36078] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36078] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_5 # SKIP Command failed so no diff not ok svd_tutorials-ex45_5_cross # Error code: 14 # [sbuild:36090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36090] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fac02f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36093] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36093] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_5_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_5_cross_implicit.counts not ok svd_tutorials-ex45_5+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:36107] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36107] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36107] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36107] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36107] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36107] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36107] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f862f9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36111] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36111] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_5 # SKIP Command failed so no diff not ok svd_tutorials-ex45_5_cross_implicit # Error code: 14 # [sbuild:36137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36137] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3faeccb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36142] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36142] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_5_cross_implicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_5_cyclic.counts not ok svd_tutorials-ex45_5+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36154] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9aadc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36157] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36157] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_5 # SKIP Command failed so no diff not ok svd_tutorials-ex45_5_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14 # [sbuild:36186] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36186] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36186] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36186] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36186] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36186] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36186] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9e88c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36201] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36201] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex45_5_cyclic # SKIP Command failed so no diff not ok svd_tutorials-ex45_5+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:36200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36200] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36200] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbeecb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36204] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36204] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex45_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_6.counts not ok svd_tutorials-ex45_5_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14 # [sbuild:36220] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36220] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36220] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36220] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36220] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36220] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36220] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa2803000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36237] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36237] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_5_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_6_cross.counts not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-single_svd_trlanczos_locking-0_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36248] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa65bf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36251] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36251] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex45_6 # SKIP Command failed so no diff not ok svd_tutorials-ex45_6_cross+svd_cross_explicitmatrix-0 # Error code: 14 # [sbuild:36280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36280] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f83620000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36295] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36295] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex45_6_cross # SKIP Command failed so no diff not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-single_svd_trlanczos_locking-0_svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:36294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36294] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa3f35000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36298] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36298] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_6 # SKIP Command failed so no diff not ok svd_tutorials-ex45_6_cross+svd_cross_explicitmatrix-1 # Error code: 14 # [sbuild:36314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36314] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa4d7d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36329] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36329] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_6_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_6_cyclic.counts not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-single_svd_trlanczos_locking-1_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36328] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb00b0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36332] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36332] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_6 # SKIP Command failed so no diff not ok svd_tutorials-ex45_6_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14 # [sbuild:36373] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36373] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36373] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36373] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36373] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36373] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36373] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f946a8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36378] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36378] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_6_cyclic # SKIP Command failed so no diff not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-single_svd_trlanczos_locking-1_svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:36371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36371] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb2305000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36379] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36379] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex45_6 # SKIP Command failed so no diff not ok svd_tutorials-ex45_6_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14 # [sbuild:36395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36395] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f875aa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36398] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36398] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex45_6_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_7.counts not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-upper_svd_trlanczos_locking-0_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36410] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa3623000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36413] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36413] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex45_6 # SKIP Command failed so no diff not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-single_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36441] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8c3ef000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36446] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36446] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_7 # SKIP Command failed so no diff not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-upper_svd_trlanczos_locking-0_svd_trlanczos_oneside-1 # Error code: 14 not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-single_svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:36471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36471] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f960cf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36475] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36475] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:36457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36457] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36457] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb45e0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36477] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36477] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex45_7 # SKIP Command failed so no diff ok svd_tutorials-ex45_6 # SKIP Command failed so no diff not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-upper_svd_trlanczos_locking-1_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa5b4d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36508] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36508] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36504] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f83ecf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36516] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36516] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex45_7 # SKIP Command failed so no diff not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-upper_svd_trlanczos_locking-1_svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:36525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36525] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbe3f3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36528] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36528] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex45_6 # SKIP Command failed so no diff not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:36544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36544] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9de76000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36559] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36559] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_7 # SKIP Command failed so no diff not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-lower_svd_trlanczos_locking-0_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36556] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fad2a0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36562] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36562] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-0 # Error code: 14 not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-lower_svd_trlanczos_locking-0_svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb1e5a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36595] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36595] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:36590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36590] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa6947000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36596] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36596] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex45_7 # SKIP Command failed so no diff ok svd_tutorials-ex45_6 # SKIP Command failed so no diff not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-1 # Error code: 14 not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-lower_svd_trlanczos_locking-1_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36624] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa13c8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36630] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36630] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:36623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36623] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8ffa1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36629] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36629] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_7 # SKIP Command failed so no diff ok svd_tutorials-ex45_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_7_cross.counts not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-lower_svd_trlanczos_locking-1_svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:36657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36657] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36657] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbe702000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36674] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36674] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_7_cyclic.counts not ok svd_tutorials-ex45_7_cross+svd_cross_explicitmatrix-0 # Error code: 14 # [sbuild:36671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36671] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f81ed2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36677] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36677] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_7_cross # SKIP Command failed so no diff not ok svd_tutorials-ex45_7_cross+svd_cross_explicitmatrix-1 # Error code: 14 # [sbuild:36712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36712] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f81e4a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36721] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36721] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex45_7_cross # SKIP Command failed so no diff not ok svd_tutorials-ex45_7_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14 # [sbuild:36718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36718] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa06e8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36724] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36724] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex45_7_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_8.counts not ok svd_tutorials-ex45_7_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14 # [sbuild:36758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36758] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9e909000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36768] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36768] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_7_cyclic # SKIP Command failed so no diff not ok svd_tutorials-ex45_8+svd_trlanczos_gbidiag-upper_svd_trlanczos_scale-0.1 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_1.counts # [sbuild:36765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36765] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36765] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8d7a0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36771] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36771] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex45_8 # SKIP Command failed so no diff not ok svd_tutorials-ex45_8+svd_trlanczos_gbidiag-upper_svd_trlanczos_scale--20 # Error code: 14 # [sbuild:36806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36806] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f97c79000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36815] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36815] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex45_8 # SKIP Command failed so no diff not ok svd_tutorials-ex48_1+svd_trlanczos_explicitmatrix-0 # Error code: 14 # [sbuild:36812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36812] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa1728000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36818] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36818] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex48_1 # SKIP Command failed so no diff not ok svd_tutorials-ex45_8+svd_trlanczos_gbidiag-lower_svd_trlanczos_scale-0.1 # Error code: 14 not ok svd_tutorials-ex48_1+svd_trlanczos_explicitmatrix-1 # Error code: 14 # [sbuild:36837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36837] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9677b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36851] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36851] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
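The repeated PMIx_Init failures above carry their own hint as well: run the executable under a PMIx-enabled launcher. A minimal sketch, assuming a POSIX shell; "./ex45" is a stand-in for whichever tutorial binary failed, not a path taken from this log:

    PMIX_MCA_gds=hash mpirun -n 1 ./ex45   # launch via mpirun so a local PMIx server is available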
ok svd_tutorials-ex45_8 # SKIP Command failed so no diff # [sbuild:36846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36846] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f85d49000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36852] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36852] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex48_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_1_spqr.counts not ok svd_tutorials-ex45_8+svd_trlanczos_gbidiag-lower_svd_trlanczos_scale--20 # Error code: 14 # [sbuild:36879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36879] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36879] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbc3b4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36896] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36896] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex45_8 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_1_autoscale.counts not ok svd_tutorials-ex48_1_spqr+svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36893] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f8aede000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36899] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36899] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex48_1_spqr # SKIP Command failed so no diff not ok svd_tutorials-ex48_1_spqr+svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:36934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36934] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa2e28000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36943] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36943] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex48_1_spqr # SKIP Command failed so no diff not ok svd_tutorials-ex48_1_autoscale+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:36940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36940] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa13e1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36946] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36946] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex48_1_autoscale # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_1_cross.counts not ok svd_tutorials-ex48_1_autoscale+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:36979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36979] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36979] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f81e88000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36990] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36990] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex48_1_autoscale # SKIP Command failed so no diff not ok svd_tutorials-ex48_1_cross # Error code: 14 # [sbuild:36988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:36988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:36988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:36988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:36988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:36988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:36988] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb7851000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:36993] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:36993] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex48_1_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_1_cyclic.counts not ok svd_tutorials-ex48_1_autoscale+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-0 # Error code: 14 # [sbuild:37009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37009] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa7e65000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37030] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37030] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex48_1_autoscale # SKIP Command failed so no diff not ok svd_tutorials-ex48_1_cyclic # Error code: 14 # [sbuild:37037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37037] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb69c5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37040] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37040] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex48_1_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_4.counts not ok svd_tutorials-ex48_1_autoscale+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-1 # Error code: 14 # [sbuild:37055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37055] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f82bd8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37071] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37071] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex48_1_autoscale # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_4_spqr.counts not ok svd_tutorials-ex48_4 # Error code: 14 # [sbuild:37084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37084] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa7a8f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37087] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37087] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex48_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_4_cross.counts not ok svd_tutorials-ex48_4_spqr # Error code: 14 # [sbuild:37116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37116] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbe95d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37131] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37131] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex48_4_spqr # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_4_cross_implicit.counts not ok svd_tutorials-ex48_4_cross # Error code: 14 # [sbuild:37144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37144] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f7fafe000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37147] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37147] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex48_4_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_4_cyclic.counts not ok svd_tutorials-ex48_4_cross_implicit # Error code: 14 # [sbuild:37176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37176] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37176] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f88e8b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37190] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37190] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex48_4_cross_implicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_5.counts not ok svd_tutorials-ex48_4_cyclic # Error code: 14 # [sbuild:37204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37204] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbc08d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37207] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37207] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tutorials-ex48_4_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_5_cross.counts not ok svd_tutorials-ex48_5+svd_trlanczos_gbidiag-lower # Error code: 14 # [sbuild:37236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37236] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fae264000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37250] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37250] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex48_5 # SKIP Command failed so no diff not ok svd_tutorials-ex48_5_cross # Error code: 14 # [sbuild:37264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37264] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37264] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9095a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37267] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37267] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex48_5_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_6_cross.counts not ok svd_tutorials-ex48_5+svd_trlanczos_gbidiag-upper # Error code: 14 # [sbuild:37281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37281] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb8123000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37286] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37286] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex48_5 # SKIP Command failed so no diff not ok svd_tutorials-ex48_6_cross # Error code: 14 # [sbuild:37311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37311] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9f709000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37315] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37315] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex48_6_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_6_cyclic.counts not ok svd_tutorials-ex48_5+svd_trlanczos_gbidiag-single # Error code: 14 # [sbuild:37328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37328] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa8c93000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37331] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37331] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex48_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex51_1.counts not ok svd_tutorials-ex48_6_cyclic # Error code: 14 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f80f65000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37365] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37365] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex48_6_cyclic # SKIP Command failed so no diff not ok svd_tutorials-ex51_1+svd_trlanczos_gbidiag-upper # Error code: 14 # [sbuild:37388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37388] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f92f01000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37391] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37391] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex51_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex51_2.counts not ok svd_tutorials-ex51_1+svd_trlanczos_gbidiag-lower # Error code: 14 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f86151000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37422] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37422] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex51_1 # SKIP Command failed so no diff not ok svd_tutorials-ex51_2 # Error code: 14 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37449] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9e868000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37453] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:37452] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37452] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37453] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-37449@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok svd_tutorials-ex51_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_1_cross.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_1_cyclic.counts ok svd_tutorials-ex52_1_cross # SKIP Requires DATAFILESPATH ok svd_tutorials-ex52_1_cyclic # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_1_trlanczos.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_1_lapack.counts ok svd_tutorials-ex52_1_trlanczos # SKIP Requires DATAFILESPATH ok svd_tutorials-ex52_1_lapack # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_2_cross.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_2_cyclic.counts ok svd_tutorials-ex52_2_cross # SKIP Requires DATAFILESPATH ok svd_tutorials-ex52_2_cyclic # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_2_trlanczos.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_5_cross.counts ok svd_tutorials-ex52_2_trlanczos # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_5_cyclic.counts not ok svd_tutorials-ex52_5_cross # Error code: 14 # [sbuild:37573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37573] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37573] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa4a3f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37589] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37589] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex52_5_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_5_trlanczos.counts not ok svd_tutorials-ex52_5_cyclic # Error code: 14 # [sbuild:37588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37588] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f87a47000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37592] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37592] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tutorials-ex52_5_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_5_lapack.counts not ok svd_tutorials-ex52_5_trlanczos # Error code: 14 # [sbuild:37638] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37638] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37638] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37638] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37638] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37638] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37638] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9e1ae000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37649] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37649] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex52_5_trlanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_6.counts not ok svd_tutorials-ex52_5_lapack # Error code: 14 # [sbuild:37648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37648] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbb339000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37652] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37652] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex52_5_lapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex53_1_trlanczos.counts not ok svd_tutorials-ex52_6+svd_type-trlanczos # Error code: 14 # [sbuild:37698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37698] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37698] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8d5c3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37709] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37709] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex52_6 # SKIP Command failed so no diff not ok svd_tutorials-ex53_1_trlanczos+ds_parallel-redundant_nsize-1 # Error code: 14 # [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f83976000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37712] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37712] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tutorials-ex53_1_trlanczos # SKIP Command failed so no diff not ok svd_tutorials-ex52_6+svd_type-cross # Error code: 14 # [sbuild:37728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37728] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37728] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9b75f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37743] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37743] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex52_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex53_1_cross.counts not ok svd_tutorials-ex53_1_trlanczos+ds_parallel-redundant_nsize-2 # Error code: 14 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f87c6d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:37747] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:37746] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37746] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37747] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-37741@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok svd_tutorials-ex53_1_trlanczos # SKIP Command failed so no diff not ok svd_tutorials-ex53_1_trlanczos+ds_parallel-synchronized_nsize-1 # Error code: 14 # [sbuild:37786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37786] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9f1b7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37793] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37793] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex53_1_trlanczos # SKIP Command failed so no diff not ok svd_tutorials-ex53_1_cross+nsize-1 # Error code: 14 # [sbuild:37790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37790] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbc35b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37796] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37796] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex53_1_cross # SKIP Command failed so no diff not ok svd_tutorials-ex53_1_trlanczos+ds_parallel-synchronized_nsize-2 # Error code: 14 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb4467000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37831] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37817] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:37832] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37831] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-37817@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok svd_tutorials-ex53_1_trlanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex53_1_cyclic.counts not ok svd_tutorials-ex53_1_cyclic+nsize-1 # Error code: 14 # [sbuild:37865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37865] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3faa219000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37868] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37868] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex53_1_cyclic # SKIP Command failed so no diff not ok svd_tutorials-ex53_1_cyclic+nsize-2 # Error code: 14 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa11b8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37882] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:37885] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:37886] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37886] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:37885] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-37882@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok svd_tutorials-ex53_1_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex8_1.counts not ok svd_tutorials-ex8_1 # Error code: 14 # [sbuild:37915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37915] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb0447000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37918] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37918] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex8_1 # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/cnetwork/embedgsvd RM test-rm-svd.F90 TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test1_1.counts not ok pep_tests-test1_1+type-toar # Error code: 14 # [sbuild:37950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37950] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f97c0b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37953] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37953] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tests-test1_1 # SKIP Command failed so no diff not ok pep_tests-test1_1+type-qarnoldi # Error code: 14 # [sbuild:37967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37967] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37967] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fae144000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37970] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37970] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test1_1 # SKIP Command failed so no diff not ok pep_tests-test1_1+type-linear # Error code: 14 # [sbuild:37984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37984] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3faf83c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37987] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37987] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test1_1_linear_gd.counts not ok pep_tests-test1_1_linear_gd # Error code: 14 # [sbuild:38014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38014] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9bae5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38017] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38017] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test1_1_linear_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test10_1.counts not ok pep_tests-test10_1 # Error code: 14 # [sbuild:38044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38044] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f994f4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38047] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38047] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test10_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test11_1.counts not ok pep_tests-test11_1 # Error code: 14 # [sbuild:38074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38074] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f94faf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38077] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38077] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test11_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test12_1.counts not ok pep_tests-test12_1+pep_type-toar # Error code: 14 # [sbuild:38104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38104] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f97538000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38107] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38107] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test12_1 # SKIP Command failed so no diff not ok pep_tests-test12_1+pep_type-linear # Error code: 14 # [sbuild:38121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38121] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb0b38000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38124] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38124] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test12_1 # SKIP Command failed so no diff not ok pep_tests-test12_1+pep_type-qarnoldi # Error code: 14 # [sbuild:38138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38138] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f91e70000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38141] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38141] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test12_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_1.counts not ok pep_tests-test2_1+pep_type-toar # Error code: 14 # [sbuild:38168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38168] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f94019000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38171] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38171] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_1 # SKIP Command failed so no diff not ok pep_tests-test2_1+pep_type-linear # Error code: 14 # [sbuild:38185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38185] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f82346000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38188] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38188] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_1_toar_mgs.counts not ok pep_tests-test2_1_toar_mgs # Error code: 14 # [sbuild:38215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38215] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb70c0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38218] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38218] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_1_toar_mgs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_1_qarnoldi.counts not ok pep_tests-test2_1_qarnoldi # Error code: 14 # [sbuild:38245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38245] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f883ef000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38248] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38248] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_1_qarnoldi # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_1_linear_gd.counts not ok pep_tests-test2_1_linear_gd # Error code: 14 # [sbuild:38275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38275] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa119d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38278] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38278] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_1_linear_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2.counts not ok pep_tests-test2_2+pep_type-toar # Error code: 14 # [sbuild:38305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38305] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9df91000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38308] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38308] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_2 # SKIP Command failed so no diff not ok pep_tests-test2_2+pep_type-linear # Error code: 14 # [sbuild:38322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38322] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa1a4d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38325] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38325] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_toar_scaleboth.counts not ok pep_tests-test2_2_toar_scaleboth # Error code: 14 # [sbuild:38352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38352] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9b61e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38355] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38355] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_2_toar_scaleboth # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_toar_transform.counts not ok pep_tests-test2_2_toar_transform # Error code: 14 # [sbuild:38382] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38382] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38382] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38382] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38382] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38382] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38382] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbd792000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38385] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38385] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_2_toar_transform # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_qarnoldi.counts not ok pep_tests-test2_2_qarnoldi # Error code: 14 # [sbuild:38412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38412] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38412] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbb1b1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38415] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38415] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_2_qarnoldi # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_linear_explicit.counts not ok pep_tests-test2_2_linear_explicit # Error code: 14 # [sbuild:38442] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38442] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38442] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38442] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38442] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38442] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38442] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa522d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38445] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38445] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_2_linear_explicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_linear_explicit_her.counts not ok pep_tests-test2_2_linear_explicit_her+pep_linear_linearization-0,1 # Error code: 14 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f95955000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38475] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38475] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_2_linear_explicit_her # SKIP Command failed so no diff not ok pep_tests-test2_2_linear_explicit_her+pep_linear_linearization-1,0 # Error code: 14 # [sbuild:38489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38489] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa7a32000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38492] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38492] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_2_linear_explicit_her # SKIP Command failed so no diff not ok pep_tests-test2_2_linear_explicit_her+pep_linear_linearization-.3,.7 # Error code: 14 # [sbuild:38506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38506] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa7385000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38509] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38509] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_2_linear_explicit_her # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_stoar.counts not ok pep_tests-test2_2_stoar # Error code: 14 # [sbuild:38536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38536] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f99aac000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38539] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38539] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_2_stoar # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_jd.counts not ok pep_tests-test2_2_jd # Error code: 14 # [sbuild:38566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38566] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38566] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb5f7f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38569] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38569] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_2_jd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_3.counts not ok pep_tests-test2_3+pep_extract-none # Error code: 14 # [sbuild:38596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38596] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8e0c7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38599] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38599] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_3 # SKIP Command failed so no diff not ok pep_tests-test2_3+pep_extract-norm # Error code: 14 # [sbuild:38613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38613] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb968c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38616] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38616] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_3 # SKIP Command failed so no diff not ok pep_tests-test2_3+pep_extract-residual # Error code: 14 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9334b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38633] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38633] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_3 # SKIP Command failed so no diff not ok pep_tests-test2_3+pep_extract-structured # Error code: 14 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fae056000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38650] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38650] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_schur.counts not ok pep_tests-test2_4_schur # Error code: 14 # [sbuild:38677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38677] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa9266000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38680] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38680] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_4_schur # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_mbe.counts not ok pep_tests-test2_4_mbe # Error code: 14 # [sbuild:38707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38707] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa964b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38710] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38710] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_4_mbe # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_explicit.counts not ok pep_tests-test2_4_explicit # Error code: 14 # [sbuild:38737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38737] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38737] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8fc94000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38740] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38740] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_4_explicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_multiple_schur.counts not ok pep_tests-test2_4_multiple_schur # Error code: 14 # [sbuild:38767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38767] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38767] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb84fc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38770] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38770] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_4_multiple_schur # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_multiple_mbe.counts not ok pep_tests-test2_4_multiple_mbe # Error code: 14 # [sbuild:38797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38797] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa24f5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38800] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38800] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_4_multiple_mbe # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_multiple_explicit.counts not ok pep_tests-test2_4_multiple_explicit # Error code: 14 # [sbuild:38827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38827] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f88557000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38830] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38830] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok pep_tests-test2_4_multiple_explicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_5.counts not ok pep_tests-test2_5 # Error code: 14 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbaebc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38857] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:38861] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:38860] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38861] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38860] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-38857@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok pep_tests-test2_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_6.counts not ok pep_tests-test2_6+pep_extract-none # Error code: 14 # [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9362f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38893] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38893] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_6 # SKIP Command failed so no diff not ok pep_tests-test2_6+pep_extract-norm # Error code: 14 # [sbuild:38907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38907] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f939c0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38910] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38910] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_6 # SKIP Command failed so no diff not ok pep_tests-test2_6+pep_extract-residual # Error code: 14 # [sbuild:38924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38924] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f80ad5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38927] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38927] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_7.counts not ok pep_tests-test2_7+pep_extract-none # Error code: 14 # [sbuild:38954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38954] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbd24b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38957] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38957] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_7 # SKIP Command failed so no diff not ok pep_tests-test2_7+pep_extract-norm # Error code: 14 # [sbuild:38971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38971] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3faa4b7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38974] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38974] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_7 # SKIP Command failed so no diff not ok pep_tests-test2_7+pep_extract-residual # Error code: 14 # [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f983b8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38991] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38991] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_7 # SKIP Command failed so no diff not ok pep_tests-test2_7+pep_extract-structured # Error code: 14 # [sbuild:39005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39005] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f881f4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39008] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39008] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_7 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_schur.counts not ok pep_tests-test2_8_schur # Error code: 14 # [sbuild:39035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39035] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f7f92e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39038] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39038] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_8_schur # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_mbe.counts not ok pep_tests-test2_8_mbe # Error code: 14 # [sbuild:39065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39065] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb60d8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39068] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39068] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_8_mbe # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_explicit.counts not ok pep_tests-test2_8_explicit # Error code: 14 # [sbuild:39095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39095] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8fb29000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39098] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39098] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_8_explicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_multiple_schur.counts not ok pep_tests-test2_8_multiple_schur # Error code: 14 # [sbuild:39125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39125] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb0190000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39128] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39128] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
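Every one of these failures ends with the same hint, so the workaround is worth spelling out once: PMIx MCA parameters can be forced through the environment, and exporting the variable named in the message makes PMIx fall back from gds/shmem2 to the plain hash component before the test harness launches its MPI jobs. A sketch of how that could be applied ahead of the test run (where exactly it would be exported in the packaging is an assumption, not something this log shows):

    # Workaround quoted by the PMIx message itself: disable the gds/shmem2 component.
    export PMIX_MCA_gds=hash
    # ...then re-run the SLEPc test step; the invocation below is only a placeholder.
    make check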
# -------------------------------------------------------------------------- # ok pep_tests-test2_8_multiple_schur # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_multiple_mbe.counts not ok pep_tests-test2_8_multiple_mbe # Error code: 14 # [sbuild:39155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39155] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8e891000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39158] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39158] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok pep_tests-test2_8_multiple_mbe # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_multiple_explicit.counts not ok pep_tests-test2_8_multiple_explicit # Error code: 14 # [sbuild:39185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39185] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f80a9c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39188] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39188] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_8_multiple_explicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_9_mbe.counts not ok pep_tests-test2_9_mbe # Error code: 14 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f978b2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39215] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:39218] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:39219] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39219] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:39218] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-39215@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok pep_tests-test2_9_mbe # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_9_explicit.counts not ok pep_tests-test2_9_explicit # Error code: 14 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39248] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa5d73000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39252] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:39251] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39251] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:39252] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-39248@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok pep_tests-test2_9_explicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_9_multiple_mbe.counts not ok pep_tests-test2_9_multiple_mbe # Error code: 14 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa0ed3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39281] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:39284] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:39285] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39285] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:39284] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-39281@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok pep_tests-test2_9_multiple_mbe # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_9_multiple_explicit.counts not ok pep_tests-test2_9_multiple_explicit # Error code: 14 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39314] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f80b05000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39318] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:39317] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39318] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:39317] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
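For the multi-rank cases the launcher itself reports the outcome: prterun sees the first rank exit with status 14 and tears the job down, and that same value is what each TAP line records as "# Error code: 14". Because the command fails before producing any output, the harness then emits the paired "ok ... # SKIP Command failed so no diff" line instead of comparing results. A sketch of how that status would look outside the harness (binary name, rank count and option are illustrative, not taken from this log):

    # Launch one PEP test roughly the way the harness does and show the status it records.
    mpiexec -n 2 ./test2 -pep_type toar
    echo "exit status: $?"    # a 14 here is what the TAP line calls "Error code: 14"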
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-39314@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok pep_tests-test2_9_multiple_explicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_10.counts not ok pep_tests-test2_10 # Error code: 14 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb815e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39347] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:39350] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:39352] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:39351] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:39353] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39353] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39351] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39352] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39350] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-39347@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok pep_tests-test2_10 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_12.counts not ok pep_tests-test2_12 # Error code: 14 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb424a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39386] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:39390] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:39389] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39390] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:39389] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-39386@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok pep_tests-test2_12 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_13.counts not ok pep_tests-test2_13 # Error code: 14 # [sbuild:39419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39419] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9a805000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39422] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39422] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_13 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test3_1.counts not ok pep_tests-test3_1 # Error code: 14 # [sbuild:39449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39449] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f81a4d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39452] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39452] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
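Once the environment workaround is in place, a single failing case can be retried without rerunning the whole suite. The PETSc/SLEPc gnumake harness accepts a search pattern for selecting tests; whether that syntax is available from this installed tree is an assumption, as is the exact target name below:

    # From the SLEPc build tree, re-run just one failing case with the gds/shmem2
    # workaround applied; the search= selector follows the PETSc-style harness
    # convention and is assumed to apply here.
    PMIX_MCA_gds=hash make test search='pep_tests-test3_1'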
# -------------------------------------------------------------------------- # ok pep_tests-test3_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test4_1_real.counts not ok pep_tests-test4_1_real # Error code: 14 # [sbuild:39479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39479] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39479] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb623f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39482] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39482] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok pep_tests-test4_1_real # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test5_1.counts not ok pep_tests-test5_1 # Error code: 14 # [sbuild:39509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39509] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39509] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb8565000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39512] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39512] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tests-test5_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test5_2.counts not ok pep_tests-test5_2 # Error code: 14 # [sbuild:39539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39539] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb28df000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39542] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39542] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test5_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test5_3.counts not ok pep_tests-test5_3 # Error code: 14 # [sbuild:39569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39569] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa4a9a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39572] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39572] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test5_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test5_4.counts not ok pep_tests-test5_4 # Error code: 14 # [sbuild:39599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39599] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb8e54000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39602] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39602] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test5_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test5_5.counts not ok pep_tests-test5_5 # Error code: 14 # [sbuild:39629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39629] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8cec6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39632] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39632] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test5_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test6_1.counts not ok pep_tests-test6_1+pep_type-toar # Error code: 14 # [sbuild:39659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39659] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8d3f9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39662] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39662] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test6_1 # SKIP Command failed so no diff not ok pep_tests-test6_1+pep_type-qarnoldi # Error code: 14 # [sbuild:39676] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39676] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39676] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39676] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39676] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39676] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39676] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb40fe000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39679] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39679] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test6_1 # SKIP Command failed so no diff not ok pep_tests-test6_1+pep_type-linear # Error code: 14 # [sbuild:39693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39693] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8ca6a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39696] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39696] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test6_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test6_2.counts not ok pep_tests-test6_2 # Error code: 14 # [sbuild:39723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39723] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9704b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39726] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39726] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test6_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test7_1.counts not ok pep_tests-test7_1 # Error code: 14 # [sbuild:39753] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39753] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39753] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39753] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39753] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39753] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39753] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f816c3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39756] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39756] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test7_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test8_1.counts not ok pep_tests-test8_1 # Error code: 14 # [sbuild:39783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39783] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f88977000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39786] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39786] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test8_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test9_1.counts not ok pep_tests-test9_1 # Error code: 14 # [sbuild:39813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39813] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39813] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9c54f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39816] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39816] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test9_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16_1.counts not ok pep_tutorials-ex16_1+pep_type-toar # Error code: 14 # [sbuild:39843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39843] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3faa2ab000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39846] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39846] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex16_1 # SKIP Command failed so no diff not ok pep_tutorials-ex16_1+pep_type-qarnoldi # Error code: 14 # [sbuild:39860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39860] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39860] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa0983000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39863] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39863] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex16_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16_1_linear.counts not ok pep_tutorials-ex16_1_linear # Error code: 14 # [sbuild:39890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39890] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9dec9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39893] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39893] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex16_1_linear # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16_1_linear_symm.counts not ok pep_tutorials-ex16_1_linear_symm # Error code: 14 # [sbuild:39920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39920] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f86967000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39923] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39923] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex16_1_linear_symm # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16_1_stoar.counts not ok pep_tutorials-ex16_1_stoar # Error code: 14 # [sbuild:39950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39950] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3faf606000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39953] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39953] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex16_1_stoar # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16_1_stoar_t.counts not ok pep_tutorials-ex16_1_stoar_t # Error code: 14 # [sbuild:39980] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39980] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39980] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39980] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39980] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39980] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39980] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb0614000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39983] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39983] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex16_1_stoar_t # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex17_1.counts not ok pep_tutorials-ex17_1+pep_type-toar # Error code: 14 # [sbuild:40010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40010] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8a2be000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40013] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40013] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
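Every failure above ends with the same PMIx hint: disable the gds/shmem2 component by setting PMIX_MCA_gds=hash in the environment. A minimal sketch of applying that hint when re-running one of the failing binaries by hand follows; only the variable name and value are taken from the log, while the executable name and option below are placeholders standing in for whichever pep_tests binary failed.

    # PMIX_MCA_gds=hash is quoted verbatim from the PMIx error text above.
    # The binary and option are hypothetical stand-ins for a failing test.
    export PMIX_MCA_gds=hash
    ./test5 -pep_type toar    # re-run one failing test with gds/shmem2 disabled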
ok pep_tutorials-ex17_1 # SKIP Command failed so no diff not ok pep_tutorials-ex17_1+pep_type-qarnoldi # Error code: 14 # [sbuild:40027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40027] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8c519000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40030] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40030] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tutorials-ex17_1 # SKIP Command failed so no diff not ok pep_tutorials-ex17_1+pep_type-linear # Error code: 14 # [sbuild:40044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40044] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f964ff000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40047] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40047] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tutorials-ex17_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex28_1.counts not ok pep_tutorials-ex28_1+pep_type-toar # Error code: 14 # [sbuild:40074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40074] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb0b65000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40077] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40077] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex28_1 # SKIP Command failed so no diff not ok pep_tutorials-ex28_1+pep_type-qarnoldi # Error code: 14 # [sbuild:40091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40091] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa2ae9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40094] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40094] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tutorials-ex28_1 # SKIP Command failed so no diff not ok pep_tutorials-ex28_1+pep_type-linear # Error code: 14 # [sbuild:40108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40108] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f89a5b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40111] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40111] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tutorials-ex28_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex38_1.counts not ok pep_tutorials-ex38_1 # Error code: 14 # [sbuild:40138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40138] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8d575000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40141] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40141] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex38_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex38_2.counts not ok pep_tutorials-ex38_2 # Error code: 14 # [sbuild:40168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40168] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbe16d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40171] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40171] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex38_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex40_1.counts not ok pep_tutorials-ex40_1 # Error code: 14 # [sbuild:40198] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40198] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40198] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40198] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40198] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40198] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40198] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa4763000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40201] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40201] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex40_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex40_1_transform.counts not ok pep_tutorials-ex40_1_transform # Error code: 14 # [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbcf32000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40231] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40231] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex40_1_transform # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex50_1.counts not ok pep_tutorials-ex50_1+pep_type-toar # Error code: 14 # [sbuild:40258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40258] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40258] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb2e05000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40261] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40261] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex50_1 # SKIP Command failed so no diff not ok pep_tutorials-ex50_1+pep_type-qarnoldi # Error code: 14 # [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8c5e0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40278] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40278] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials-ex50_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex50_1_linear.counts not ok pep_tutorials-ex50_1_linear # Error code: 14 # [sbuild:40305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40305] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbd650000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40308] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40308] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tutorials-ex50_1_linear # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_1d_1.counts not ok pep_tutorials_nlevp-acoustic_wave_1d_1+pep_type-toar # Error code: 14 # [sbuild:40335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40335] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f891ce000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40338] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40338] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_1d_1 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-acoustic_wave_1d_1+pep_type-qarnoldi # Error code: 14 # [sbuild:40352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40352] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40352] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8e8c0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40355] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40355] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_1d_1 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-acoustic_wave_1d_1+pep_type-linear # Error code: 14 # [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fab18f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40372] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40372] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_1d_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_1d_1_stoar.counts not ok pep_tutorials_nlevp-acoustic_wave_1d_1_stoar # Error code: 14 # [sbuild:40399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40399] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40399] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb0f9b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40402] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40402] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_1d_1_stoar # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_1d_2.counts not ok pep_tutorials_nlevp-acoustic_wave_1d_2+pep_extract-none # Error code: 14 # [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f88bcf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40432] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40432] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok pep_tutorials_nlevp-acoustic_wave_1d_2 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-acoustic_wave_1d_2+pep_extract-norm # Error code: 14 # [sbuild:40446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40446] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9c026000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40449] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40449] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_1d_2 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-acoustic_wave_1d_2+pep_extract-residual # Error code: 14 # [sbuild:40463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40463] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40463] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f80126000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40466] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40466] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_1d_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_1d_3.counts not ok pep_tutorials_nlevp-acoustic_wave_1d_3+pep_extract-none # Error code: 14 # [sbuild:40493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40493] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f89ef2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40496] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40496] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_1d_3 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-acoustic_wave_1d_3+pep_extract-norm # Error code: 14 # [sbuild:40510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40510] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fae2ad000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40513] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40513] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_1d_3 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-acoustic_wave_1d_3+pep_extract-residual # Error code: 14 # [sbuild:40527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40527] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb83f9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40530] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40530] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_1d_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_1d_4.counts not ok pep_tutorials_nlevp-acoustic_wave_1d_4 # Error code: 14 # [sbuild:40557] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40557] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40557] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40557] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40557] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40557] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40557] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbd8aa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40560] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40560] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_1d_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_2d_1.counts not ok pep_tutorials_nlevp-acoustic_wave_2d_1+pep_type-qarnoldi # Error code: 14 # [sbuild:40587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40587] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9fc22000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40590] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40590] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_2d_1 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-acoustic_wave_2d_1+pep_type-linear # Error code: 14 # [sbuild:40604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40604] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb353f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40607] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40607] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_2d_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_2d_1_toar.counts not ok pep_tutorials_nlevp-acoustic_wave_2d_1_toar # Error code: 14 # [sbuild:40634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40634] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa6f7f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40637] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40637] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_2d_1_toar # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_2d_2.counts not ok pep_tutorials_nlevp-acoustic_wave_2d_2 # Error code: 14 # [sbuild:40664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40664] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa12bb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40667] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40667] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_2d_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_2d_2_lin_b.counts not ok pep_tutorials_nlevp-acoustic_wave_2d_2_lin_b # Error code: 14 # [sbuild:40694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40694] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9df15000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40697] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40697] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_2d_2_lin_b # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_2d_2_lin_ab.counts not ok pep_tutorials_nlevp-acoustic_wave_2d_2_lin_ab # Error code: 14 # [sbuild:40724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40724] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb3ee3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40727] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40727] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-acoustic_wave_2d_2_lin_ab # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-butterfly_1_toar.counts not ok pep_tutorials_nlevp-butterfly_1_toar # Error code: 14 # [sbuild:40754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40754] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9d176000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40757] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40757] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-butterfly_1_toar # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-butterfly_1_linear.counts not ok pep_tutorials_nlevp-butterfly_1_linear # Error code: 14 # [sbuild:40784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40784] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f84e9b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40787] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40787] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-butterfly_1_linear # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-butterfly_2.counts not ok pep_tutorials_nlevp-butterfly_2+pep_type-toar # Error code: 14 # [sbuild:40814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40814] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3faf5a7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40817] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40817] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-butterfly_2 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-butterfly_2+pep_type-linear # Error code: 14 # [sbuild:40831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40831] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8e149000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40834] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40834] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-butterfly_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-damped_beam_1.counts not ok pep_tutorials_nlevp-damped_beam_1+pep_type-toar # Error code: 14 # [sbuild:40861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40861] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa93a7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40864] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40864] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-damped_beam_1 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-damped_beam_1+pep_type-linear # Error code: 14 # [sbuild:40878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40878] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40878] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8f0e5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40881] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40881] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-damped_beam_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-damped_beam_1_qarnoldi.counts not ok pep_tutorials_nlevp-damped_beam_1_qarnoldi # Error code: 14 # [sbuild:40908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40908] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8a44c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40911] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40911] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-damped_beam_1_qarnoldi # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-damped_beam_1_jd.counts not ok pep_tutorials_nlevp-damped_beam_1_jd # Error code: 14 # [sbuild:40938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40938] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f84e17000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40941] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40941] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-damped_beam_1_jd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-loaded_string_1.counts not ok pep_tutorials_nlevp-loaded_string_1 # Error code: 14 # [sbuild:40968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40968] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f85947000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40971] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40971] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-loaded_string_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-planar_waveguide_1.counts not ok pep_tutorials_nlevp-planar_waveguide_1+pep_type-toar # Error code: 14 # [sbuild:40998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40998] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f961b9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41001] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41001] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-planar_waveguide_1 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-planar_waveguide_1+pep_type-linear # Error code: 14 # [sbuild:41015] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41015] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41015] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41015] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41015] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41015] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41015] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f959ef000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41018] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41018] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-planar_waveguide_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_1.counts not ok pep_tutorials_nlevp-sleeper_1+pep_type-toar # Error code: 14 # [sbuild:41045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41045] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3faf735000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41048] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41048] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-sleeper_1 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-sleeper_1+pep_type-linear # Error code: 14 # [sbuild:41062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41062] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f99337000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41065] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41065] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-sleeper_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_1_qarnoldi.counts not ok pep_tutorials_nlevp-sleeper_1_qarnoldi # Error code: 14 # [sbuild:41092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41092] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41092] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8b016000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41095] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41095] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-sleeper_1_qarnoldi # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_2_toar.counts not ok pep_tutorials_nlevp-sleeper_2_toar # Error code: 14 # [sbuild:41122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41122] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41122] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f82486000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41125] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41125] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-sleeper_2_toar # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_2_jd.counts not ok pep_tutorials_nlevp-sleeper_2_jd # Error code: 14 # [sbuild:41152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41152] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8422d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41155] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41155] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-sleeper_2_jd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_3.counts not ok pep_tutorials_nlevp-sleeper_3 # Error code: 14 # [sbuild:41182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41182] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb5179000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41185] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41185] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-sleeper_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_4.counts not ok pep_tutorials_nlevp-sleeper_4 # Error code: 14 # [sbuild:41212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41212] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f88602000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41215] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41215] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-sleeper_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_1.counts not ok pep_tutorials_nlevp-spring_1+pep_type-toar # Error code: 14 # [sbuild:41242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41242] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41242] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f80c89000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41245] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41245] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_1 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-spring_1+pep_type-linear # Error code: 14 # [sbuild:41259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41259] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb275a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41262] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41262] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_1_stoar.counts not ok pep_tutorials_nlevp-spring_1_stoar # Error code: 14 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fba3df000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41292] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41292] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_1_stoar # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_1_qarnoldi.counts not ok pep_tutorials_nlevp-spring_1_qarnoldi # Error code: 14 # [sbuild:41319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41319] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa04c8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41322] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41322] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_1_qarnoldi # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_2.counts not ok pep_tutorials_nlevp-spring_2 # Error code: 14 # [sbuild:41349] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41349] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41349] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41349] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41349] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41349] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41349] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb631f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41352] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41352] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_3.counts not ok pep_tutorials_nlevp-spring_3 # Error code: 14 # [sbuild:41379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41379] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f83353000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41382] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41382] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_4.counts not ok pep_tutorials_nlevp-spring_4 # Error code: 14 # [sbuild:41409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41409] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9133c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41412] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41412] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_5.counts not ok pep_tutorials_nlevp-spring_5 # Error code: 14 # [sbuild:41439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41439] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fba658000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41442] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41442] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_6.counts not ok pep_tutorials_nlevp-spring_6 # Error code: 14 # [sbuild:41469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41469] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8f62e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41472] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41472] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_1.counts not ok pep_tutorials_nlevp-wiresaw_1+pep_type-toar # Error code: 14 # [sbuild:41499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41499] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9419b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41502] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41502] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-wiresaw_1 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-wiresaw_1+pep_type-qarnoldi # Error code: 14 # [sbuild:41516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41516] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41516] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbcdff000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41519] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41519] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials_nlevp-wiresaw_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_1_linear_h1.counts not ok pep_tutorials_nlevp-wiresaw_1_linear_h1 # Error code: 14 # [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fab648000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41549] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41549] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-wiresaw_1_linear_h1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_1_linear_h2.counts not ok pep_tutorials_nlevp-wiresaw_1_linear_h2 # Error code: 14 # [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f85d56000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41579] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41579] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-wiresaw_1_linear_h2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_1_linear_other.counts not ok pep_tutorials_nlevp-wiresaw_1_linear_other # Error code: 14 # [sbuild:41606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41606] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa0e29000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41609] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41609] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-wiresaw_1_linear_other # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_2.counts not ok pep_tutorials_nlevp-wiresaw_2+pep_type-toar # Error code: 14 # [sbuild:41636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41636] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3faa63a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41639] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41639] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-wiresaw_2 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-wiresaw_2+pep_type-qarnoldi # Error code: 14 # [sbuild:41653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41653] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8f97e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41656] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41656] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok pep_tutorials_nlevp-wiresaw_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_2_linear.counts not ok pep_tutorials_nlevp-wiresaw_2_linear+pep_linear_linearization-1,0 # Error code: 14 # [sbuild:41683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41683] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fab6ff000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41686] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41686] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-wiresaw_2_linear # SKIP Command failed so no diff not ok pep_tutorials_nlevp-wiresaw_2_linear+pep_linear_linearization-0,1 # Error code: 14 # [sbuild:41700] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41700] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41700] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41700] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41700] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41700] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41700] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbb87c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41703] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41703] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-wiresaw_2_linear # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_2_linear_other.counts not ok pep_tutorials_nlevp-wiresaw_2_linear_other # Error code: 14 # [sbuild:41730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41730] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41730] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9e85e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41733] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41733] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-wiresaw_2_linear_other # SKIP Command failed so no diff RM test-rm-pep.F90 TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test1_1_real.counts not ok nep_tests-test1_1_real+nep_type-rii # Error code: 14 # [sbuild:41761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41761] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f95f07000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41764] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41764] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test1_1_real # SKIP Command failed so no diff not ok nep_tests-test1_1_real+nep_type-slp # Error code: 14 # [sbuild:41778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41778] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f831a3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41781] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41781] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tests-test1_1_real # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test1_3_real.counts not ok nep_tests-test1_3_real # Error code: 14 # [sbuild:41808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41808] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8c4ea000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41811] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41811] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tests-test1_3_real # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1.counts not ok nep_tests-test10_1 # Error code: 14 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb6c3f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41838] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:41842] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:41841] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41841] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:41842] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-41838@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test10_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1_rii.counts not ok nep_tests-test10_1_rii+split-0 # Error code: 14 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3faaaab000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41871] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:41875] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:41874] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41875] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:41874] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-41871@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test10_1_rii # SKIP Command failed so no diff not ok nep_tests-test10_1_rii+split-1 # Error code: 14 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa572a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41891] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:41894] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:41895] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41894] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41895] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-41891@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test10_1_rii # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1_narnoldi.counts not ok nep_tests-test10_1_narnoldi # Error code: 14 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41924] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f89ba1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41928] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:41927] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41928] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:41927] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-41924@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test10_1_narnoldi # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1_slp.counts not ok nep_tests-test10_1_slp+split-0 # Error code: 14 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa581e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41957] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:41960] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:41961] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41960] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:41961] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-41957@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test10_1_slp # SKIP Command failed so no diff not ok nep_tests-test10_1_slp+split-1 # Error code: 14 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa6038000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41977] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:41980] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:41981] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41980] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:41981] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-41977@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test10_1_slp # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1_interpol.counts not ok nep_tests-test10_1_interpol # Error code: 14 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42010] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8b84b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42013] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:42014] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42013] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and no*** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42014] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # t able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-42010@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test10_1_interpol # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1_narnoldi_sync.counts not ok nep_tests-test10_1_narnoldi_sync # Error code: 14 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f85755000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42043] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:42047] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:42046] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42046] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:42047] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-42043@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test10_1_narnoldi_sync # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_2_interpol.counts not ok nep_tests-test10_2_interpol # Error code: 14 # [sbuild:42076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42076] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa09cd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42079] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42079] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test10_2_interpol # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_2_nleigs_real.counts not ok nep_tests-test10_2_nleigs_real+split-0 # Error code: 14 # [sbuild:42106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42106] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb5f0d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42109] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42109] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test10_2_nleigs_real # SKIP Command failed so no diff not ok nep_tests-test10_2_nleigs_real+split-1 # Error code: 14 # [sbuild:42123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42123] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42123] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa7508000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42126] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42126] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test10_2_nleigs_real # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test12_1.counts not ok nep_tests-test12_1 # Error code: 14 # [sbuild:42153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42153] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42153] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f93eec000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42156] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42156] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test12_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test13_1.counts not ok nep_tests-test13_1 # Error code: 14 # [sbuild:42183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42183] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb4fa9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42186] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42186] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test13_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test14_1.counts not ok nep_tests-test14_1 # Error code: 14 # [sbuild:42213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42213] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42213] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8a93d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42216] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42216] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test14_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test15_1.counts not ok nep_tests-test15_1 # Error code: 14 # [sbuild:42243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42243] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f825aa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42246] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42246] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test15_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test16_1.counts not ok nep_tests-test16_1 # Error code: 14 # [sbuild:42273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42273] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42273] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9a644000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42276] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42276] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test16_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test17_1.counts not ok nep_tests-test17_1+nep_two_sided-0_split-0 # Error code: 14 # [sbuild:42303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42303] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb7d6c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42306] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42306] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test17_1 # SKIP Command failed so no diff not ok nep_tests-test17_1+nep_two_sided-0_split-1 # Error code: 14 # [sbuild:42320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42320] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f95a8d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42323] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42323] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tests-test17_1 # SKIP Command failed so no diff not ok nep_tests-test17_1+nep_two_sided-1_split-0 # Error code: 14 # [sbuild:42337] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42337] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42337] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42337] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42337] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42337] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42337] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f875df000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42340] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42340] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tests-test17_1 # SKIP Command failed so no diff not ok nep_tests-test17_1+nep_two_sided-1_split-1 # Error code: 14 # [sbuild:42354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42354] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa8ba8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42357] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42357] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test17_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test17_2_interpol.counts not ok nep_tests-test17_2_interpol # Error code: 14 # [sbuild:42384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42384] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb5022000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42387] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42387] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test17_2_interpol # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test17_2_nleigs_real.counts not ok nep_tests-test17_2_nleigs_real+split-0 # Error code: 14 # [sbuild:42414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42414] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f86823000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42417] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42417] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tests-test17_2_nleigs_real # SKIP Command failed so no diff not ok nep_tests-test17_2_nleigs_real+split-1 # Error code: 14 # [sbuild:42431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42431] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f90c85000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42434] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42434] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tests-test17_2_nleigs_real # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test2_1.counts not ok nep_tests-test2_1 # Error code: 14 # [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbe66a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42464] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42464] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test2_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test3_1.counts not ok nep_tests-test3_1 # Error code: 14 # [sbuild:42491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42491] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa560e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42494] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42494] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test3_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test3_1_ts.counts not ok nep_tests-test3_1_ts # Error code: 14 # [sbuild:42521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42521] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb99aa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42524] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42524] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test3_1_ts # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test4_1.counts not ok nep_tests-test4_1 # Error code: 14 # [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb8823000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42554] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42554] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
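Every failure above follows the same pattern: the PMIx gds/shmem2 component cannot map its shared-memory segment at the requested base address on this riscv64 builder, PMIx_Init then fails, and MPI_Init_thread aborts before the test program ever runs, so the harness marks the run "not ok" and skips the output diff. The PMIx diagnostic itself names the workaround, forcing the hash gds component via PMIX_MCA_gds=hash. A minimal sketch of how that override could be applied when re-running one failing test by hand follows; the test binary path, the mpirun invocation and the process count are illustrative assumptions, since the exact command the SLEPc harness uses is not visible in this log:

    # Fall back to the hash gds component instead of shmem2, as the PMIx
    # diagnostic above suggests, so the client does not depend on mapping the
    # shared-memory segment at a fixed base address.
    export PMIX_MCA_gds=hash
    # Re-run one of the failing NEP tests manually (path and arguments are
    # illustrative; the harness normally drives this through its makefiles).
    mpirun -n 1 ./nep/tests/test10

The same environment override could presumably also be exported once before the whole test target is invoked, which would apply it to every test in the run rather than to a single re-execution.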
ok nep_tests-test4_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test5_1.counts not ok nep_tests-test5_1 # Error code: 14 # [sbuild:42581] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42581] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42581] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42581] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42581] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42581] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42581] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb6b18000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42584] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42584] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tests-test5_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test5_3.counts not ok nep_tests-test5_3 # Error code: 14 # [sbuild:42611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42611] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb37aa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42614] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42614] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tests-test5_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test6_1.counts not ok nep_tests-test6_1 # Error code: 14 # [sbuild:42641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42641] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbb7dc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42644] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42644] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test6_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test7_1.counts not ok nep_tests-test7_1+nsize-1 # Error code: 14 # [sbuild:42671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42671] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbc832000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42674] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42674] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tests-test7_1 # SKIP Command failed so no diff not ok nep_tests-test7_1+nsize-2 # Error code: 14 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f973f6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42688] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:42691] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:42692] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42692] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42691] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-42688@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok nep_tests-test7_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test7_2.counts not ok nep_tests-test7_2+nsize-1 # Error code: 14 # [sbuild:42721] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42721] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42721] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42721] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42721] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42721] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42721] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f81def000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42724] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42724] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test7_2 # SKIP Command failed so no diff not ok nep_tests-test7_2+nsize-2 # Error code: 14 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42738] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fadddd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42741] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:42742] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42741] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:42742] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-42738@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test7_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test8_1.counts not ok nep_tests-test8_1 # Error code: 14 # [sbuild:42771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42771] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa65ca000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42774] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42774] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test8_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test8_2.counts not ok nep_tests-test8_2 # Error code: 14 # [sbuild:42801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42801] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8ca08000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42804] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42804] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test8_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test8_3.counts not ok nep_tests-test8_3 # Error code: 14 # [sbuild:42831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42831] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9e10a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42834] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42834] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test8_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test8_4.counts not ok nep_tests-test8_4 # Error code: 14 # [sbuild:42861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42861] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f87b5d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42864] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42864] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test8_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test9_1.counts not ok nep_tests-test9_1 # Error code: 14 # [sbuild:42891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42891] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f98510000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42894] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42894] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tests-test9_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex20_1.counts not ok nep_tutorials-ex20_1 # Error code: 14 # [sbuild:42921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42921] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8b075000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42924] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42924] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex20_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex20_2.counts not ok nep_tutorials-ex20_2 # Error code: 14 # [sbuild:42951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42951] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbd2e3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42954] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42954] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex20_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex20_3.counts not ok nep_tutorials-ex20_3+nep_two_sided-0 # Error code: 14 # [sbuild:42981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42981] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42981] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbc8f1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42984] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42984] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex20_3 # SKIP Command failed so no diff not ok nep_tutorials-ex20_3+nep_two_sided-1 # Error code: 14 # [sbuild:42998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42998] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42998] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3faabe0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43001] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43001] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tutorials-ex20_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex20_4.counts not ok nep_tutorials-ex20_4 # Error code: 14 # [sbuild:43028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43028] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb3024000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43031] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43031] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials-ex20_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex21_1_rii.counts not ok nep_tutorials-ex21_1_rii+nsize-1 # Error code: 14 # [sbuild:43058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43058] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbc867000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43061] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43061] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex21_1_rii # SKIP Command failed so no diff not ok nep_tutorials-ex21_1_rii+nsize-2 # Error code: 14 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43075] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb2ca4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43079] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:43078] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43078] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:43079] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-43075@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tutorials-ex21_1_rii # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex21_1_slp.counts not ok nep_tutorials-ex21_1_slp+nsize-1 # Error code: 14 # [sbuild:43108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43108] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f84042000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43111] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43111] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex21_1_slp # SKIP Command failed so no diff not ok nep_tutorials-ex21_1_slp+nsize-2 # Error code: 14 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb1205000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:43128] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:43129] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43128] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:43129] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-43125@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tutorials-ex21_1_slp # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_1.counts not ok nep_tutorials-ex22_1+nep_type-rii # Error code: 14 # [sbuild:43158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43158] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f90cbf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43161] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43161] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok nep_tutorials-ex22_1 # SKIP Command failed so no diff not ok nep_tutorials-ex22_1+nep_type-slp # Error code: 14 # [sbuild:43175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43175] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9d22a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43178] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43178] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials-ex22_1 # SKIP Command failed so no diff not ok nep_tutorials-ex22_1+nep_type-narnoldi # Error code: 14 # [sbuild:43192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43192] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9bfe0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43195] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43195] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials-ex22_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_2.counts not ok nep_tutorials-ex22_2+nep_interpol_pep_extract-none # Error code: 14 # [sbuild:43222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43222] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb5fcb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43225] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43225] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22_2 # SKIP Command failed so no diff not ok nep_tutorials-ex22_2+nep_interpol_pep_extract-norm # Error code: 14 # [sbuild:43239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43239] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f805ed000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43242] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43242] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials-ex22_2 # SKIP Command failed so no diff not ok nep_tutorials-ex22_2+nep_interpol_pep_extract-residual # Error code: 14 # [sbuild:43256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43256] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa0bbb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43259] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43259] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials-ex22_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_3.counts not ok nep_tutorials-ex22_3+nep_type-rii # Error code: 14 # [sbuild:43286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43286] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb6392000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43289] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43289] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3 # SKIP Command failed so no diff not ok nep_tutorials-ex22_3+nep_type-slp # Error code: 14 # [sbuild:43303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43303] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f84a80000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43306] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43306] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3 # SKIP Command failed so no diff not ok nep_tutorials-ex22_3+nep_type-narnoldi # Error code: 14 # [sbuild:43320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43320] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f8f698000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43323] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43323] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_3_simpleu.counts not ok nep_tutorials-ex22_3_simpleu+nep_type-rii # Error code: 14 # [sbuild:43350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43350] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f939e8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43353] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43353] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3_simpleu # SKIP Command failed so no diff not ok nep_tutorials-ex22_3_simpleu+nep_type-slp # Error code: 14 # [sbuild:43367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43367] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43367] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fa9777000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43370] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43370] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3_simpleu # SKIP Command failed so no diff not ok nep_tutorials-ex22_3_simpleu+nep_type-narnoldi # Error code: 14 # [sbuild:43384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43384] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8dc5b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43387] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43387] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3_simpleu # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_3_slp_thres.counts not ok nep_tutorials-ex22_3_slp_thres # Error code: 14 # [sbuild:43414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43414] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f97a68000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43417] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43417] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3_slp_thres # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_3_rii_thres.counts not ok nep_tutorials-ex22_3_rii_thres # Error code: 14 # [sbuild:43444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43444] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43444] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9367a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43447] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43447] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3_rii_thres # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_4.counts not ok nep_tutorials-ex22_4 # Error code: 14 # [sbuild:43474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43474] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f83495000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43477] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43477] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27_1.counts not ok nep_tutorials-ex27_1 # Error code: 14 # [sbuild:43504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43504] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43504] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fac04f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43507] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43507] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27_3.counts not ok nep_tutorials-ex27_3 # Error code: 14 # [sbuild:43534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43534] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43534] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f99656000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43537] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43537] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27_2.counts not ok nep_tutorials-ex27_2 # Error code: 14 # [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f94aa4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43567] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43567] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27_4.counts not ok nep_tutorials-ex27_4 # Error code: 14 # [sbuild:43594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43594] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43594] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8b669000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43597] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43597] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27_9.counts not ok nep_tutorials-ex27_9 # Error code: 14 # [sbuild:43624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43624] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f84649000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43627] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43627] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27_9 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex42_1.counts not ok nep_tutorials-ex42_1 # Error code: 14 # [sbuild:43654] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43654] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43654] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43654] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43654] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43654] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43654] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb86e8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43657] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43657] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex42_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_1.counts not ok nep_tutorials_nlevp-loaded_string_1 # Error code: 14 # [sbuild:43684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43684] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f90fce000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43687] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43687] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok nep_tutorials_nlevp-loaded_string_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_2.counts not ok nep_tutorials_nlevp-loaded_string_2+nep_refine_scheme-schur # Error code: 14 # [sbuild:43714] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43714] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43714] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43714] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43714] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43714] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43714] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb7140000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43717] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43717] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_2 # SKIP Command failed so no diff not ok nep_tutorials_nlevp-loaded_string_2+nep_refine_scheme-explicit # Error code: 14 # [sbuild:43731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43731] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb6f0a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43734] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43734] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_2_mbe.counts not ok nep_tutorials_nlevp-loaded_string_2_mbe # Error code: 14 # [sbuild:43761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43761] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa097f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43764] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43764] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tutorials_nlevp-loaded_string_2_mbe # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_3_explicit.counts not ok nep_tutorials_nlevp-loaded_string_3_explicit # Error code: 14 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f96e8f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43791] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:43794] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:43795] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43794] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43795] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-43791@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_3_explicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_3_mbe.counts not ok nep_tutorials_nlevp-loaded_string_3_mbe # Error code: 14 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f954a8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43824] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:43828] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:43827] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43827] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43828] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-43824@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_3_mbe # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_4.counts not ok nep_tutorials_nlevp-loaded_string_4 # Error code: 14 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbeec8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43857] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:43861] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:43860] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:43862] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:43863] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43862] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43860] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43861] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43863] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-43857@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok nep_tutorials_nlevp-loaded_string_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_5.counts not ok nep_tutorials_nlevp-loaded_string_5 # Error code: 14 # [sbuild:43896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43896] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8e4e8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43899] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43899] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_6.counts not ok nep_tutorials_nlevp-loaded_string_6 # Error code: 14 # [sbuild:43926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43926] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb5f0f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43929] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43929] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_7.counts not ok nep_tutorials_nlevp-loaded_string_7 # Error code: 14 # [sbuild:43956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43956] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f91be0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43959] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43959] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_7 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_8.counts not ok nep_tutorials_nlevp-loaded_string_8+nep_type-rii # Error code: 14 # [sbuild:43986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43986] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f96533000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43989] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43989] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_8 # SKIP Command failed so no diff not ok nep_tutorials_nlevp-loaded_string_8+nep_type-slp # Error code: 14 # [sbuild:44003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44003] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbd887000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44006] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44006] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_8 # SKIP Command failed so no diff not ok nep_tutorials_nlevp-loaded_string_8+nep_type-narnoldi # Error code: 14 # [sbuild:44020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44020] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9e5a5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44023] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44023] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_8 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_8_rii_thres.counts not ok nep_tutorials_nlevp-loaded_string_8_rii_thres # Error code: 14 # [sbuild:44050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44050] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9b311000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44053] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44053] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_8_rii_thres # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_8_slp_thres.counts not ok nep_tutorials_nlevp-loaded_string_8_slp_thres # Error code: 14 # [sbuild:44080] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44080] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44080] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44080] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44080] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44080] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44080] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f95987000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44083] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44083] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_8_slp_thres # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_8_slp_two_thres.counts not ok nep_tutorials_nlevp-loaded_string_8_slp_two_thres # Error code: 14 # [sbuild:44110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44110] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f86828000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44113] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44113] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials_nlevp-loaded_string_8_slp_two_thres # SKIP Command failed so no diff RM test-rm-nep.F90 TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test1_1.counts ok mfn_tests-test1_1 # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test2_1.counts not ok mfn_tests-test2_1+mfn_type-krylov # Error code: 14 # [sbuild:44154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44154] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44154] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa81c2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44157] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44157] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tests-test2_1 # SKIP Command failed so no diff not ok mfn_tests-test2_1+mfn_type-expokit # Error code: 14 # [sbuild:44171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44171] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44171] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8dfbd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44174] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44174] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok mfn_tests-test2_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test2_3.counts not ok mfn_tests-test2_3 # Error code: 14 # [sbuild:44201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44201] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44201] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8c014000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44204] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44204] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok mfn_tests-test2_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test3_1.counts not ok mfn_tests-test3_1 # Error code: 14 # [sbuild:44231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44231] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44231] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f99eb8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44234] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44234] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tests-test3_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test3_1_x.counts not ok mfn_tests-test3_1_x # Error code: 14 # [sbuild:44261] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44261] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44261] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44261] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44261] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44261] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44261] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f877f4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44264] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44264] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tests-test3_1_x # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test3_2.counts not ok mfn_tests-test3_2 # Error code: 14 # [sbuild:44291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44291] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb1bc6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44294] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44294] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tests-test3_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test4_1.counts not ok mfn_tests-test4_1 # Error code: 14 # [sbuild:44321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44321] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fafa9b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44324] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44324] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tests-test4_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test5_1.counts not ok mfn_tests-test5_1 # Error code: 14 # [sbuild:44351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44351] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f84665000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44354] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44354] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tests-test5_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex23_1.counts not ok mfn_tutorials-ex23_1 # Error code: 14 # [sbuild:44381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44381] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9f79e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44384] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44384] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tutorials-ex23_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex26_1.counts not ok mfn_tutorials-ex26_1 # Error code: 14 # [sbuild:44411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44411] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f84861000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44414] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44414] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tutorials-ex26_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex37_1.counts not ok mfn_tutorials-ex37_1 # Error code: 14 # [sbuild:44441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44441] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3faa752000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44444] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44444] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tutorials-ex37_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex39_1.counts not ok mfn_tutorials-ex39_1 # Error code: 14 # [sbuild:44471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44471] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9d044000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44474] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44474] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tutorials-ex39_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex39_2.counts not ok mfn_tutorials-ex39_2 # Error code: 14 # [sbuild:44501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44501] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbd77e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44504] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44504] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok mfn_tutorials-ex39_2 # SKIP Command failed so no diff RM test-rm-mfn.F90 TEST installed-arch-linux2-c-opt/tests/counts/lme_tests-test1_1.counts not ok lme_tests-test1_1 # Error code: 14 # [sbuild:44532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44532] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f90c0b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44535] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44535] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok lme_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/lme_tests-test1_2.counts not ok lme_tests-test1_2 # Error code: 14 # [sbuild:44562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44562] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa5ea4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44565] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44565] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok lme_tests-test1_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/lme_tests-test1_3.counts not ok lme_tests-test1_3 # Error code: 14 # [sbuild:44592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:44592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:44592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:44592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:44592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:44592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:44592] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f9e1ae000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:44595] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:44595] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
#
ok lme_tests-test1_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/lme_tests-test2_1.counts
not ok lme_tests-test2_1 # Error code: 14
# [sbuild:44622] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44622] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44622] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44622] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44622] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44622] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44622] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f952d9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44625] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44625] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok lme_tests-test2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/lme_tutorials-ex32_1.counts
not ok lme_tutorials-ex32_1 # Error code: 14
# [sbuild:44652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44652] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f936ef000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44655] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44655] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok lme_tutorials-ex32_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/lme_tutorials-ex32_2.counts
not ok lme_tutorials-ex32_2 # Error code: 14
# [sbuild:44682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44682] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f907a5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44685] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44685] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok lme_tutorials-ex32_2 # SKIP Command failed so no diff
RM test-rm-eps.c
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials_cnetwork-embedgsvd_1.counts
not ok svd_tutorials_cnetwork-embedgsvd_1 # Error code: 14
# [sbuild:44713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44713] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44713] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb2677000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44716] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44716] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials_cnetwork-embedgsvd_1 # SKIP Command failed so no diff
RM test-rm-pep.c
RM test-rm-nep.c
RM test-rm-mfn.c
RM test-rm-lme.c
E: Build killed with signal TERM after 150 minutes of inactivity
--------------------------------------------------------------------------------
Build finished at 2026-01-31T16:03:27Z

Finished
--------

+------------------------------------------------------------------------------+
| Cleanup                                       Sat, 31 Jan 2026 16:03:28 +0000 |
+------------------------------------------------------------------------------+

Purging /build/reproducible-path
Not cleaning session: cloned chroot in use
E: Build failure (dpkg-buildpackage died with exit 143)

+------------------------------------------------------------------------------+
| Summary                                       Sat, 31 Jan 2026 16:03:35 +0000 |
+------------------------------------------------------------------------------+

Build Architecture: riscv64
Build Type: any
Build-Space: 875464
Build-Time: 10808
Distribution: unstable
Fail-Stage: build
Host Architecture: riscv64
Install-Time: 18
Job: /srv/rebuilderd/tmp/rebuilderdnZCy1O/inputs/slepc_3.24.2+dfsg1-1.dsc
Machine Architecture: riscv64
Package: slepc
Package-Time: 10890
Source-Version: 3.24.2+dfsg1-1
Space: 875464
Status: attempted
Version: 3.24.2+dfsg1-1

--------------------------------------------------------------------------------
Finished at 2026-01-31T16:03:27Z
Build needed 03:01:30, 875464k disk space
E: Build failure (dpkg-buildpackage died with exit 143)
sbuild failed
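
The PMIx diagnostics repeated throughout the test run name a concrete workaround: switch the PMIx key-value store from the gds/shmem2 component, which fails to map its shared-memory segment at the requested address, to the plain hash component. A minimal sketch of applying that workaround before retrying one of the failing examples is shown below; the PMIX_MCA_gds=hash setting comes verbatim from the error text, while the mpiexec invocation and the ./ex32 binary name are illustrative assumptions, not commands taken from this log.

    # Workaround named in the PMIx error text: use the hash gds component
    # instead of gds/shmem2 for this shell and its child MPI processes.
    export PMIX_MCA_gds=hash
    # Hypothetical re-run of one failing LME tutorial to check the workaround;
    # the binary name and process count are assumptions for illustration only.
    mpiexec -n 1 ./ex32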