When connecting to HBase 0.96 directly from Windows with the Java API, you may hit an exception that never shows up with HBase 0.94. Why? It depends on the Hadoop version underneath: if the client is built on Hadoop 1.x there is no problem, but with Hadoop 2.x you need to watch out, because the following exception may appear:
class="java">2014-07-14 13:27:59,286 WARN [org.apache.hadoop.util.NativeCodeLoader] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-07-14 13:27:59,317 ERROR [org.apache.hadoop.util.Shell] Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable D:\hadoop-1.2.0\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:293)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
at org.apache.hadoop.conf.Configuration.getStrings(Configuration.java:1514)
at org.apache.hadoop.hbase.zookeeper.ZKConfig.makeZKProps(ZKConfig.java:113)
at org.apache.hadoop.hbase.zookeeper.ZKConfig.getZKQuorumServersString(ZKConfig.java:265)
at org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.<init>(ZooKeeperWatcher.java:159)
at org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.<init>(ZooKeeperWatcher.java:134)
at org.apache.hadoop.hbase.client.ZooKeeperKeepAliveConnection.<init>(ZooKeeperKeepAliveConnection.java:43)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getKeepAliveZooKeeperWatcher(HConnectionManager.java:1710)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:82)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:806)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:633)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:387)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:366)
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:247)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:188)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:150)
at com.dhgate.ywhbase.test.QueryTest.main(QueryTest.java:43)
As the exception above shows, my HADOOP_HOME environment variable still points at a Hadoop 1.x installation, which is why the error is raised. The exception does not actually prevent data from being read, but to keep the output clean we should still get rid of it. There are two ways to do so:
(1) The simpler fix: download a Hadoop 2.2 package on the client machine, unpack it, and point HADOOP_HOME at the 2.x directory. You can set it inside the program (see the snippet below) or through the Windows environment variable settings (right-click My Computer > Properties).
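A minimal sketch of the in-program variant, assuming the Hadoop 2.2 package was unpacked to D:\hadoop-2.2.0 (the path is only an example and must match your machine); it has to run before the first Hadoop/HBase class is loaded:

// hypothetical example path; point it at your actual Hadoop 2.x directory
System.setProperty("hadoop.home.dir", "D:\\hadoop-2.2.0");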
(2) Without downloading a Hadoop 2.2 package: in the code, check whether winutils.exe (the Windows executable Hadoop looks for) exists, and if it does not, create an empty placeholder yourself. That way there is no need to touch the Hadoop 1.x environment variable.
The code is as follows:
if (System.getProperty("os.name").contains("Windows")) {
File workaround = new File(".");
System.getProperties().put("hadoop.home.dir",workaround.getAbsolutePath());
File dir = new File("./bin");
if (!dir.exists()) {
dir.mkdirs();
}
File exe = new File("./bin/winutils.exe");
if (!exe.exists()) {
exe.createNewFile();
}
}
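To show where the workaround fits, here is a minimal sketch of a client program, assuming the snippet above is wrapped in a helper method named ensureWinutils() (a hypothetical name); the ZooKeeper quorum and table name are placeholders for your own cluster:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;

public class QueryTest {
    public static void main(String[] args) throws Exception {
        ensureWinutils();  // the Windows workaround above, wrapped in a helper (hypothetical name)
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "192.168.1.100");  // placeholder ZooKeeper address
        HTable table = new HTable(conf, "mytable");           // placeholder table name, HBase 0.96 client API
        // ... normal reads/writes against the table ...
        table.close();
    }
}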