
A Summary of Common HIVE Operation Problems

This article collects common problems encountered when operating HIVE, together with their fixes. The solutions are simple, quick to apply, and practical; interested readers can follow along and try them out.


1. Terminal initialization failed; falling back to unsupported

[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
        at jline.TerminalFactory.create(TerminalFactory.java:101)
        at jline.TerminalFactory.get(TerminalFactory.java:158)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
        at org.apache.hadoop.hive.cli.CliDriver.setupConsoleReader(CliDriver.java:787)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:721)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:230)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
        at org.apache.hadoop.hive.cli.CliDriver.setupConsoleReader(CliDriver.java:787)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:721)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Solution:

The root cause is that the jline jar bundled in the yarn lib directory of Hadoop is a different version from the one in Hive's lib directory. Syncing Hive's jline jar over fixes it: first delete the old jline jar from yarn's lib, then copy Hive's in, as sketched below.

e.g.: cp /u01/apache-hive-1.2.1-bin/lib/jline-2.12.jar /u01/hadoop-2.6.1/share/hadoop/yarn/lib
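
Put together, the two steps might look like this. The paths follow the example above and are assumptions about your installation; the wildcard simply matches whatever jline version YARN shipped with:

# 1) Remove the older jline jar bundled with YARN (path is an assumption; adjust it)
rm /u01/hadoop-2.6.1/share/hadoop/yarn/lib/jline-*.jar

# 2) Copy Hive's newer jline jar into YARN's lib directory
cp /u01/apache-hive-1.2.1-bin/lib/jline-2.12.jar /u01/hadoop-2.6.1/share/hadoop/yarn/lib/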

2. Why tables created in Hive cannot be found when Derby is the metastore database

Many beginners install Hive with the default Derby metastore, and the following situation is common:

After setting up the Hive environment and creating a table, you close the session window; when you log back in, the table is gone. Which configuration is wrong?
1. Run hive from the command line and create a table.
2. Run show tables; the newly created table appears in the list.
3. Close the window, open a new one, connect to the server, run hive again and then show tables; the table is no longer visible.

So why can the table not be found? It has not disappeared; you are simply in the wrong directory.

A. If we launch hive from under home, metastore_db is created under the home directory.
B. If we launch hive from under /usr, metastore_db is created under the usr directory.

So if you start hive from a different directory, you will not see the tables you created earlier.

What does "entering hive from different paths" mean? For example:

Experiment 1: create and view a table in one directory, without switching directories

Check the current directory; here it is /home/bdata/data

[bdata@bdata4 data]$ pwd

/home/bdata/data

List the contents; at this point there is only the file IY02_C.txt

[bdata@bdata4 data]$ ls

IY02_C.txt

Run the hive command to enter the client

[bdata@bdata4 data]$ hive

Logging initialized using configuration in jar:file:/u01/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar!/hive-log4j.properties
hive> show tables;
OK

Time taken: 0.64 seconds

# Create a table whose fields are delimited by |

hive> CREATE TABLE YHJHK_IY02(AAB301 String,AAE043 String,AAB001 String,AAC001 String,AAC003 String,AAC004 String,AAC002 String,AAC005 String,AAC006 String,AAC009 String,AAC012 String,AAC028 String,AAC064 String,AAC087 String,AAC007 String,AAC032 String,AIC001 int, AIC090 int, AAC031 String,AAE160 String,AIC104 int, AIC164 int, AIC020 int, AAC095 int, AAC096 int, AAC097 int, AAC098 int, AAC099 int, AAC103 int, AAC104 int, AIC268 int, AIC115 int, AIC116 int, AIC117 int, AAB000 int, SUBJOB_ID int, VZE105 String, VZE003 String) ROW FORMAT DELIMITED FIELDS TERMINATED BY '|' STORED AS TEXTFILE;

# Load the file from HDFS; /data/IY02_C.txt was uploaded to the HDFS file system beforehand

hive> load data inpath '/data/IY02_C.txt' into table YHJHK_IY02;

# Show tables again to confirm that yhjhk_iy02 exists

hive> show tables;
OK
yhjhk_iy02

Time taken: 0.686 seconds, Fetched: 1 row(s)

# Quit

hive> quit;

Look at the directory again; two new items have appeared: metastore_db and derby.log

[bdata@bdata4 data]$ ls

derby.log  IY02_C.txt  metastore_db

Experiment 2: switch directories, enter hive, and check the table list

Return to the home directory

[bdata@bdata4 data]$ cd

Check the current path to confirm the directory has changed

[bdata@bdata4 ~]$ pwd

/home/bdata

List the current directory to confirm it is a fresh one, with no metastore_db or derby.log, only a data directory

[bdata@bdata4 ~]$ ls

data

Enter the hive client

[bdata@bdata4 ~]$ hive

Logging initialized using configuration in jar:file:/u01/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar!/hive-log4j.properties

# Show tables; nothing is returned here

hive> show tables;
OK

Time taken: 0.64 seconds

hive> quit;

List the current directory; metastore_db and derby.log have now been generated here as well

[bdata@bdata4 ~]$ ls

data metastore_db derby.log

The experiments above confirm that when using Derby, if a hive session suddenly shows no tables, there is no need to worry: the table data has not been lost. Simply switch back to the directory from which you originally created the tables and run the client there, or switch the metastore to MySQL up front and the problem never appears, for example with a hive-site.xml configuration like the sketch below.
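
A minimal hive-site.xml sketch for a MySQL-backed metastore. The host, database name, user name, and password below are illustrative placeholders, so adjust every value to your environment, and make sure the MySQL JDBC driver jar is present in Hive's lib directory:

<!-- All values below are illustrative assumptions; adapt them to your setup. -->
<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.10.34:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
</property>
<property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
</property>
<property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
</property>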

3. Creating a table fails with: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:For direct MetaStore DB connections, we don't support retries at the client level.)

This happens because my MySQL instance is not on the local machine (Hive defaults to using a local database), so a remote metastore has to be configured and the metastore service started on the Hive server side; the metastore service handles the communication with MySQL whenever table structures are created or updated.

First, add the metastore setting to the configuration file, as follows:

<property>
    <name>hive.metastore.uris</name>
    <value>thrift://192.168.10.34:9083</value>
    <description>jdbc/odbc connection hive,if mysql must set</description>
</property>

Once this is configured, start the metastore service:

[bdata@bdata4 bin]$ ./hive --service metastore -hiveconf hbase.zookeeper.quorum=bdata1,bdata2,bdata3 -hiveconf hbase.zookeeper.property.clientPort=2181  &
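
As a quick sanity check (assuming net-tools is installed), you can confirm the metastore actually came up and is listening on the port from hive.metastore.uris:

# The metastore should be listening on 9083, matching the thrift URI configured above
netstat -nltp | grep 9083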

4. Running a HIVE command fails with: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
        ... 8 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
        ... 14 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
        at org.apache.thrift.transport.TSocket.open(TSocket.java:187)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.ConnectException: Connection refused
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:579)
        at org.apache.thrift.transport.TSocket.open(TSocket.java:182)
        ... 22 more
)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:466)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)

        ... 19 more

Solution:

Same as the solution to problem 3.

5. Creating a table fails with: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDODataStoreException: An exception was thrown while adding/validating class(es) : Specified key was too long; max key length is 767 bytes

com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes

Solution:

Change the character set of the metastore database.

Log in to MySQL, switch to the hive metastore database, and drop all of its tables.

Then execute:

alter database hive character set latin1;

Once the change succeeds, restart the metastore and re-enter the hive client; creating the table will now succeed.
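
For reference, a sketch of the database-side steps inside the MySQL client; the database name hive matches the statement above, while anything else about your setup (user, host, which tables exist) is an assumption:

-- switch to the metastore database
use hive;
-- change the default character set so index keys fit within the 767-byte limit
alter database hive character set latin1;
-- verify the change took effect
show create database hive;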

6. After hiveserver2 starts, beeline cannot connect
Cause: a permissions problem.
Solution:
/user/hive/warehouse
/tmp
/history (if a jobserver is configured, /history also needs to be adjusted)
Hive needs to read from and write to these directories at runtime, so open up their permissions (the exact paths depend on settings such as hive.metastore.warehouse.dir and hive.exec.scratchdir in your configuration):

hadoop fs -chmod -R 777 /hive/tmp

hadoop fs -chmod -R 777 /hive/warehouse

7. beeline reports connection refused
Cause: a bug in the official release.
Solution: set the following two properties in the Hive configuration, as sketched below:
hive.server2.long.polling.timeout

hive.server2.thrift.bind.host (be sure to change the host to your own host)
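
A minimal hive-site.xml sketch of those two properties. The timeout value and the host below are placeholders rather than values confirmed by this article, so substitute your own:

<!-- Both values are placeholders; adjust them to your environment. -->
<property>
    <name>hive.server2.long.polling.timeout</name>
    <value>5000</value>
</property>
<property>
    <name>hive.server2.thrift.bind.host</name>
    <value>192.168.10.34</value>
</property>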

8. java.lang.IllegalArgumentException: java.net.UnknownHostException: dfscluster

Solution:

Hive cannot resolve the HDFS nameservice name dfscluster. The nameservice is defined in hdfs-site.xml under Hadoop's etc/hadoop directory; copy that file into Hive's conf directory and restart Hive.

[bdata@bdata4 hadoop]$ cp  hdfs-site.xml /u01/apache-hive-1.2.1-bin/conf/
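
For context, the nameservice name comes from HDFS HA entries in hdfs-site.xml along the lines of the sketch below; dfscluster matches the error above, while the namenode ids nn1,nn2 are illustrative placeholders:

<property>
    <name>dfs.nameservices</name>
    <value>dfscluster</value>
</property>
<property>
    <name>dfs.ha.namenodes.dfscluster</name>
    <value>nn1,nn2</value>
</property>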

That concludes this roundup of common HIVE operation problems. The best way to make the fixes stick is to try them out in practice.

