[MacPorts] #53595: hadoop fails to install before and after selfupdate to 2.40
MacPorts
noreply at macports.org
Sat Feb 18 22:32:31 UTC 2017
#53595: hadoop fails to install before and after selfupdate to 2.40
------------------------------+-------------------
  Reporter:  TheLastLovemark  |      Owner:
      Type:  defect           |     Status:  new
  Priority:  Normal           |  Milestone:
 Component:  ports            |    Version:  2.4.0
Resolution:                   |   Keywords:
      Port:  hadoop           |
------------------------------+-------------------
Comment (by TheLastLovemark):
I did this as root, so no need for sudo:
{{{
sh-3.2# cd "/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1"
sh-3.2# ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/tools/ant/launch/Launcher : Unsupported major.minor version 52.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
sh-3.2#
}}}
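The "Unsupported major.minor version 52.0" above means the ant launcher on the PATH was compiled for Java 8, while the JVM ant picked up is older, so ant cannot even start until a Java 8 runtime is selected. The setJdk6/setJdk7/setJdk8 helpers used in the next run are not shown in this comment; on macOS they are typically small shell functions along the following lines (an assumed sketch using the stock /usr/libexec/java_home selector, not the exact helpers from the earlier suggestion):
{{{
# Sketch only (assumption): switch JAVA_HOME with the standard macOS selector.
setJdk6() { export JAVA_HOME=$(/usr/libexec/java_home -v 1.6); export PATH="$JAVA_HOME/bin:$PATH"; }
setJdk7() { export JAVA_HOME=$(/usr/libexec/java_home -v 1.7); export PATH="$JAVA_HOME/bin:$PATH"; }
setJdk8() { export JAVA_HOME=$(/usr/libexec/java_home -v 1.8); export PATH="$JAVA_HOME/bin:$PATH"; }

# Confirm which runtime ant will actually use before rebuilding:
java -version
ant -version
}}}
Since this ant build itself requires class-file version 52.0, the same error is expected under JDK 6 and JDK 7, and in the transcript below it only goes away once setJdk8 is active.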
I tried your solution, running ant under JDK 1.6, 1.7 and 1.8. The last of these gave
some interesting results, but it ultimately failed as well...
{{{
sh-3.2# setJdk6
sh-3.2# ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/tools/ant/launch/Launcher : Unsupported major.minor version 52.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
sh-3.2# setJdk7
sh-3.2# ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/tools/ant/launch/Launcher : Unsupported major.minor version 52.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
sh-3.2# setJdk8
sh-3.2# ant compile-native compile-c++-libhdfs -Dcompile.native=true -Dsnappy.prefix=/opt/local -Dcompile.c++=true -Dlibhdfs=true
Buildfile: /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build.xml
compile-native:
create-native-configure:
[exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before
AC_USE_SYSTEM_EXTENSIONS
[exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded
from...
[exec] configure.ac:42: the top level
[exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before
AC_USE_SYSTEM_EXTENSIONS
[exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded
from...
[exec] configure.ac:42: the top level
[exec] glibtoolize: putting auxiliary files in AC_CONFIG_AUX_DIR,
'config'.
[exec] glibtoolize: copying file 'config/ltmain.sh'
[exec] glibtoolize: Consider adding 'AC_CONFIG_MACRO_DIRS([m4])' to
configure.ac,
[exec] glibtoolize: and rerunning glibtoolize and aclocal.
[exec] glibtoolize: Consider adding '-I m4' to ACLOCAL_AMFLAGS in
Makefile.am.
[exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before
AC_USE_SYSTEM_EXTENSIONS
[exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded
from...
[exec] configure.ac:42: the top level
[exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before
AC_USE_SYSTEM_EXTENSIONS
[exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded
from...
[exec] configure.ac:42: the top level
[exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before
AC_USE_SYSTEM_EXTENSIONS
[exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded
from...
[exec] configure.ac:42: the top level
[exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before
AC_USE_SYSTEM_EXTENSIONS
[exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded
from...
[exec] configure.ac:42: the top level
[exec] configure.ac:44: warning: AM_INIT_AUTOMAKE: two- and three-
arguments forms are deprecated. For more info, see:
[exec] configure.ac:44:
http://www.gnu.org/software/automake/manual/automake.html#Modernize-
AM_005fINIT_005fAUTOMAKE-invocation
[exec] configure.ac:41: installing 'config/compile'
[exec] configure.ac:44: installing 'config/missing'
[exec] Makefile.am:32: warning: shell echo $$OS_NAME | tr [A-Z]
[a-z]: non-POSIX variable name
[exec] Makefile.am:32: (probably a GNU make extension)
[exec] Makefile.am:43: warning: source file
'src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c' is in a
subdirectory,
[exec] Makefile.am:43: but option 'subdir-objects' is disabled
[exec] automake: warning: possible forward-incompatibility.
[exec] automake: At least a source file is in a subdirectory, but the
'subdir-objects'
[exec] automake: automake option hasn't been enabled. For now, the
corresponding output
[exec] automake: object file(s) will be placed in the top-level
directory. However,
[exec] automake: this behaviour will change in future Automake
versions: they will
[exec] automake: unconditionally cause object files to be placed in
the same subdirectory
[exec] automake: of the corresponding sources.
[exec] automake: You are advised to start using 'subdir-objects'
option throughout your
[exec] automake: project, to avoid future incompatibilities.
[exec] Makefile.am:43: warning: source file
'src/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.c' is in a
subdirectory,
[exec] Makefile.am:43: but option 'subdir-objects' is disabled
[exec] Makefile.am:43: warning: source file
'src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c' is in a
subdirectory,
[exec] Makefile.am:43: but option 'subdir-objects' is disabled
[exec] Makefile.am:43: warning: source file
'src/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.c' is in a
subdirectory,
[exec] Makefile.am:43: but option 'subdir-objects' is disabled
[exec] Makefile.am:43: warning: source file
'src/org/apache/hadoop/security/getGroup.c' is in a subdirectory,
[exec] Makefile.am:43: but option 'subdir-objects' is disabled
[exec] Makefile.am:43: warning: source file
'src/org/apache/hadoop/security/JniBasedUnixGroupsMapping.c' is in a
subdirectory,
[exec] Makefile.am:43: but option 'subdir-objects' is disabled
[exec] Makefile.am:43: warning: source file
'src/org/apache/hadoop/security/JniBasedUnixGroupsNetgroupMapping.c' is in
a subdirectory,
[exec] Makefile.am:43: but option 'subdir-objects' is disabled
[exec] Makefile.am:43: warning: source file
'src/org/apache/hadoop/io/nativeio/file_descriptor.c' is in a
subdirectory,
[exec] Makefile.am:43: but option 'subdir-objects' is disabled
[exec] Makefile.am:43: warning: source file
'src/org/apache/hadoop/io/nativeio/errno_enum.c' is in a subdirectory,
[exec] Makefile.am:43: but option 'subdir-objects' is disabled
[exec] Makefile.am:43: warning: source file
'src/org/apache/hadoop/io/nativeio/NativeIO.c' is in a subdirectory,
[exec] Makefile.am:43: but option 'subdir-objects' is disabled
[exec] Makefile.am: installing 'config/depcomp'
ivy-download:
[get] Getting:
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
[get] To:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/ivy/ivy-2.1.0.jar
[get] Not modified - so not downloaded
ivy-init-dirs:
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/ivy
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/ivy/lib
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/ivy/report
ivy-probe-antlib:
ivy-init-antlib:
ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 ::
http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file =
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/ivy/ivysettings.xml
ivy-resolve-common:
ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use
'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file =
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/ivy/ivysettings.xml
init:
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/classes
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/tools
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/src
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/task
/WEB-INF
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/job
/WEB-INF
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/history
/WEB-INF
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/hdfs
/WEB-INF
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/datanode
/WEB-INF
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps/secondary
/WEB-INF
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/examples
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/ant
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/c++
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/test
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/test/classes
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/test/testjar
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/test/testshell
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/test/extraconf
[touch] Creating
/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/null1998253588
[delete] Deleting:
/var/folders/zz/zyxvpxvq6csfxvn_n0000000000000/T/null1998253588
[copy] Copying 9 files to
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/webapps
[exec] svn: E155007:
'/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1'
is not a working copy
[exec] svn: E155007:
'/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1'
is not a working copy
[exec] src/saveVersion.sh: line 36: md5sum: command not found
[exec] xargs: md5sum: No such file or directory
record-parser:
compile-rcc-compiler:
[javac]
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build.xml:472:
warning: 'includeantruntime' was not set, defaulting to
build.sysclasspath=last; set to false for repeatable builds
[javac] Compiling 29 source files to
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/classes
[javac] warning: [options] bootstrap class path not set in conjunction
with -source 1.6
[javac] 1 warning
compile-core-classes:
[javac]
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build.xml:496:
warning: 'includeantruntime' was not set, defaulting to
build.sysclasspath=last; set to false for repeatable builds
[javac] Compiling 446 source files to
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/classes
[javac] warning: [options] bootstrap class path not set in conjunction
with -source 1.6
[javac]
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:51:
warning: ResolverConfiguration is internal proprietary API and may be
removed in a future release
[javac] import sun.net.dns.ResolverConfiguration;
[javac] ^
[javac]
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:52:
warning: IPAddressUtil is internal proprietary API and may be removed in a
future release
[javac] import sun.net.util.IPAddressUtil;
[javac] ^
[javac]
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/http/HttpServer.java:248:
warning: [unchecked] unchecked call to put(K,V) as a member of the raw
type Map
[javac] logContext.getInitParams().put(
[javac] ^
[javac] where K,V are type-variables:
[javac] K extends Object declared in interface Map
[javac] V extends Object declared in interface Map
[javac]
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:493:
warning: ResolverConfiguration is internal proprietary API and may be
removed in a future release
[javac] ResolverConfiguration.open().searchlist();
[javac] ^
[javac]
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:510:
warning: IPAddressUtil is internal proprietary API and may be removed in a
future release
[javac] if (IPAddressUtil.isIPv4LiteralAddress(host)) {
[javac] ^
[javac]
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:512:
warning: IPAddressUtil is internal proprietary API and may be removed in a
future release
[javac] byte[] ip = IPAddressUtil.textToNumericFormatV4(host);
[javac] ^
[javac]
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:514:
warning: IPAddressUtil is internal proprietary API and may be removed in a
future release
[javac] } else if (IPAddressUtil.isIPv6LiteralAddress(host)) {
[javac] ^
[javac]
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/core/org/apache/hadoop/security/SecurityUtil.java:516:
warning: IPAddressUtil is internal proprietary API and may be removed in a
future release
[javac] byte[] ip = IPAddressUtil.textToNumericFormatV6(host);
[javac] ^
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] 9 warnings
[javac] Creating empty
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/classes/org/apache/hadoop/jmx
/package-info.class
[copy] Copying 1 file to
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/classes
compile-core-native:
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-x86_64-64/lib
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/zlib
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/compress/snappy
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-x86_64-64/src/org/apache/hadoop/io/nativeio
[mkdir] Created dir:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-x86_64-64/src/org/apache/hadoop/security
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibCompressor.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibCompressor_CompressionHeader.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibCompressor_CompressionStrategy.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibCompressor_CompressionLevel.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibDecompressor.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibDecompressor_CompressionHeader.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/io/compress/snappy/org_apache_hadoop_io_compress_snappy_SnappyCompressor.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/io/compress/snappy/org_apache_hadoop_io_compress_snappy_SnappyDecompressor.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/io/nativeio/org_apache_hadoop_io_nativeio_NativeIO.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/io/nativeio/org_apache_hadoop_io_nativeio_NativeIO_Stat.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/io/nativeio/org_apache_hadoop_io_nativeio_NativeIO_CachedUid.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/security/org_apache_hadoop_security_JniBasedUnixGroupsMapping.h]]
[javah] [Forcefully writing file
RegularFileObject[/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build/native
/Mac_OS_X-
x86_64-64/src/org/apache/hadoop/security/org_apache_hadoop_security_JniBasedUnixGroupsNetgroupMapping.h]]
[exec] checking for gcc... /usr/bin/clang
[exec] checking whether the C compiler works... yes
[exec] checking for C compiler default output file name... a.out
[exec] checking for suffix of executables...
[exec] checking whether we are cross compiling... no
[exec] checking for suffix of object files... o
[exec] checking whether we are using the GNU C compiler... yes
[exec] checking whether /usr/bin/clang accepts -g... yes
[exec] checking for /usr/bin/clang option to accept ISO C89... none
needed
[exec] checking whether /usr/bin/clang understands -c and -o
together... yes
[exec] checking for special C compiler options needed for large
files... no
[exec] checking for _FILE_OFFSET_BITS value needed for large files...
no
[exec] checking how to run the C preprocessor... /usr/bin/clang -E
[exec] checking for grep that handles long lines and -e...
/usr/bin/grep
[exec] checking for egrep... /usr/bin/grep -E
[exec] checking for ANSI C header files... yes
[exec] checking for sys/types.h... yes
[exec] checking for sys/stat.h... yes
[exec] checking for stdlib.h... yes
[exec] checking for string.h... yes
[exec] checking for memory.h... yes
[exec] checking for strings.h... yes
[exec] checking for inttypes.h... yes
[exec] checking for stdint.h... yes
[exec] checking for unistd.h... yes
[exec] checking minix/config.h usability... no
[exec] checking minix/config.h presence... no
[exec] checking for minix/config.h... no
[exec] checking whether it is safe to define __EXTENSIONS__... yes
[exec] checking for a BSD-compatible install...
/opt/local/bin/ginstall -c
[exec] checking whether build environment is sane... yes
[exec] checking for a thread-safe mkdir -p... /opt/local/bin/gmkdir
-p
[exec] checking for gawk... gawk
[exec] checking whether make sets $(MAKE)... yes
[exec] checking for style of include used by make... GNU
[exec] checking whether make supports nested variables... yes
[exec] checking dependency style of /usr/bin/clang... gcc3
[exec] checking for gcc... (cached) /usr/bin/clang
[exec] checking whether we are using the GNU C compiler... (cached)
yes
[exec] checking whether /usr/bin/clang accepts -g... (cached) yes
[exec] checking for /usr/bin/clang option to accept ISO C89...
(cached) none needed
[exec] checking whether /usr/bin/clang understands -c and -o
together... (cached) yes
[exec] checking build system type... x86_64-apple-darwin16.4.0
[exec] checking host system type... x86_64-apple-darwin16.4.0
[exec] checking how to print strings... printf
[exec] checking for a sed that does not truncate output...
/opt/local/bin/gsed
[exec] checking for fgrep... /usr/bin/grep -F
[exec] checking for ld used by /usr/bin/clang...
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ld
[exec] checking if the linker
(/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ld)
is GNU ld... no
[exec] checking for BSD- or MS-compatible name lister (nm)...
/opt/local/bin/nm -B
[exec] checking the name lister (/opt/local/bin/nm -B) interface...
BSD nm
[exec] checking whether ln -s works... yes
[exec] checking the maximum length of command line arguments...
196608
[exec] checking how to convert x86_64-apple-darwin16.4.0 file names
to x86_64-apple-darwin16.4.0 format... func_convert_file_noop
[exec] checking how to convert x86_64-apple-darwin16.4.0 file names
to toolchain format... func_convert_file_noop
[exec] checking for
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ld
option to reload object files... -r
[exec] checking for objdump... objdump
[exec] checking how to recognize dependent libraries... pass_all
[exec] checking for dlltool... no
[exec] checking how to associate runtime and link libraries... printf
%s\n
[exec] checking for ar... ar
[exec] checking for archiver @FILE support... no
[exec] checking for strip... strip
[exec] checking for ranlib... ranlib
[exec] checking command to parse /opt/local/bin/nm -B output from
/usr/bin/clang object... ok
[exec] checking for sysroot... no
[exec] checking for a working dd... /bin/dd
[exec] checking how to truncate binary pipes... /bin/dd bs=4096
count=1
[exec] checking for mt... no
[exec] checking if : is a manifest tool... no
[exec] checking for dsymutil... dsymutil
[exec] checking for nmedit... nmedit
[exec] checking for lipo... lipo
[exec] checking for otool... otool
[exec] checking for otool64... no
[exec] checking for -single_module linker flag... yes
[exec] checking for -exported_symbols_list linker flag... yes
[exec] checking for -force_load linker flag... yes
[exec] checking for dlfcn.h... yes
[exec] checking for objdir... .libs
[exec] checking if /usr/bin/clang supports -fno-rtti -fno-
exceptions... yes
[exec] checking for /usr/bin/clang option to produce PIC... -fno-
common -DPIC
[exec] checking if /usr/bin/clang PIC flag -fno-common -DPIC works...
yes
[exec] checking if /usr/bin/clang static flag -static works... no
[exec] checking if /usr/bin/clang supports -c -o file.o... yes
[exec] checking if /usr/bin/clang supports -c -o file.o... (cached)
yes
[exec] checking whether the /usr/bin/clang linker
(/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ld)
supports shared libraries... yes
[exec] checking dynamic linker characteristics... darwin16.4.0 dyld
[exec] checking how to hardcode library paths into programs...
immediate
[exec] checking whether stripping libraries is possible... yes
[exec] checking if libtool supports shared libraries... yes
[exec] checking whether to build shared libraries... yes
[exec] checking whether to build static libraries... yes
[exec] checking for dlopen in -ldl... yes
[exec] checking for JNI_GetCreatedJavaVMs in -ljvm... no
[exec] checking for ANSI C header files... (cached) yes
[exec] checking stdio.h usability... yes
[exec] checking stdio.h presence... yes
[exec] checking for stdio.h... yes
[exec] checking stddef.h usability... yes
[exec] checking stddef.h presence... yes
[exec] checking for stddef.h... yes
[exec] checking jni.h usability... yes
[exec] checking jni.h presence... yes
[exec] checking for jni.h... yes
[exec] checking zlib.h usability... yes
[exec] checking zlib.h presence... yes
[exec] checking for zlib.h... yes
[exec] checking Checking for the 'actual' dynamic-library for
'-lz'...
[exec] checking zconf.h usability... yes
[exec] checking zconf.h presence... yes
[exec] checking for zconf.h... yes
[exec] checking Checking for the 'actual' dynamic-library for
'-lz'... (cached)
[exec] checking snappy-c.h usability... yes
[exec] checking snappy-c.h presence... yes
[exec] checking for snappy-c.h... yes
[exec] checking Checking for the 'actual' dynamic-library for
'-lsnappy'... libnotfound.so
[exec] checking fcntl.h usability... yes
[exec] checking fcntl.h presence... yes
[exec] checking for fcntl.h... yes
[exec] checking for stdlib.h... (cached) yes
[exec] checking for string.h... (cached) yes
[exec] checking for unistd.h... (cached) yes
[exec] checking for fcntl.h... (cached) yes
[exec] checking for posix_fadvise... no
[exec] checking for fcntl.h... (cached) yes
[exec] checking for sync_file_range... no
[exec] checking for an ANSI C-conforming const... yes
[exec] checking for memset... yes
[exec] checking whether strerror_r is declared... yes
[exec] checking for strerror_r... yes
[exec] checking whether strerror_r returns char *... no
[exec] checking that generated files are newer than configure... done
[exec] configure: creating ./config.status
[exec] config.status: creating Makefile
[exec] config.status: creating config.h
[exec] config.status: executing depfiles commands
[exec] config.status: executing libtool commands
[exec] /Applications/Xcode.app/Contents/Developer/usr/bin/make all-
am
[exec] /bin/sh ./libtool --tag=CC --mode=compile /usr/bin/clang
-DHAVE_CONFIG_H -I.
-I/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native
-I/System/Library/Frameworks/JavaVM.framework/Headers
-I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include/darwin
-I/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native/src
-Isrc/org/apache/hadoop/io/compress/zlib
-Isrc/org/apache/hadoop/io/compress/snappy
-Isrc/org/apache/hadoop/io/nativeio -Isrc/org/apache/hadoop/security
-I/opt/local/include -I/System/Library/Frameworks/JavaVM.framework/Headers
-I/System/Library/Frameworks/JavaVM.framework/Headers
-I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include/darwin
-g -Wall -fPIC -O2 -m64 -Os -arch x86_64 -MT ZlibCompressor.lo -MD -MP -MF
.deps/ZlibCompressor.Tpo -c -o ZlibCompressor.lo `test -f
'src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c' || echo
'/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native/'`src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c
[exec] libtool: compile: /usr/bin/clang -DHAVE_CONFIG_H -I.
-I/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native
-I/System/Library/Frameworks/JavaVM.framework/Headers
-I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include/darwin
-I/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native/src
-Isrc/org/apache/hadoop/io/compress/zlib
-Isrc/org/apache/hadoop/io/compress/snappy
-Isrc/org/apache/hadoop/io/nativeio -Isrc/org/apache/hadoop/security
-I/opt/local/include -I/System/Library/Frameworks/JavaVM.framework/Headers
-I/System/Library/Frameworks/JavaVM.framework/Headers
-I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include
-I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include/darwin
-g -Wall -fPIC -O2 -m64 -Os -arch x86_64 -MT ZlibCompressor.lo -MD -MP -MF
.deps/ZlibCompressor.Tpo -c
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c
-fno-common -DPIC -o .libs/ZlibCompressor.o
[exec] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c:71:41: error: expected expression
[exec] void *libz = dlopen(HADOOP_ZLIB_LIBRARY, RTLD_LAZY | RTLD_GLOBAL);
[exec] ^
[exec] 1 error generated.
[exec] make[1]: *** [ZlibCompressor.lo] Error 1
[exec] make: *** [all] Error 2
BUILD FAILED
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build.xml:627: The following error occurred while executing this line:
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_hadoop/hadoop/work/hadoop-1.2.1/build.xml:707: exec returned: 2
Total time: 1 minute 27 seconds
sh-3.2#
}}}
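The build now gets much further and dies in the native zlib glue instead: clang reports "expected expression" at the dlopen(HADOOP_ZLIB_LIBRARY, RTLD_LAZY | RTLD_GLOBAL) line, which is what you would see if the HADOOP_ZLIB_LIBRARY macro expands to nothing, leaving dlopen(, RTLD_LAZY | RTLD_GLOBAL) after preprocessing. That would fit the configure output above, where the "'actual' dynamic-library" probe prints nothing for -lz and libnotfound.so for -lsnappy. A quick way to check (a sketch; it assumes the generated config.h and config.log end up in the native build directory, e.g. build/native/Mac_OS_X-x86_64-64 under the work tree):
{{{
# Sketch (assumption): run from the native build directory. If the zlib/snappy
# defines are empty or missing here, the "expected expression" error follows
# directly from the macro expanding to nothing at ZlibCompressor.c:71.
grep -nE 'HADOOP_(ZLIB|SNAPPY)_LIBRARY' config.h config.log
}}}
If those defines do turn out empty, the problem would be in the port's native-library detection on macOS rather than in the local Java setup.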
--
Ticket URL: <https://trac.macports.org/ticket/53595#comment:22>
MacPorts <https://www.macports.org/>
Ports system for macOS