The problem is as described in the title; the full error output is below:
Caused by: java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException Encountered exception determining schema. Returning signal schema to indicate problem: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: was expecting closing '"' for name
2020/11/11 14:00:19 - at [Source: java.io.StringReader@50bf3cc6; line: 1, column: 6001])
2020/11/11 14:00:19 -
2020/11/11 14:00:19 - at org.pentaho.di.core.database.Database.execStatement(Database.java:1570)
2020/11/11 14:00:19 - at org.pentaho.di.core.database.Database.execStatement(Database.java:1518)
2020/11/11 14:00:19 - at org.pentaho.big.data.kettle.plugins.hive.trans.HiveOutput.loadTempToTable(HiveOutput.java:404)
2020/11/11 14:00:19 - at org.pentaho.big.data.kettle.plugins.hive.trans.HiveOutput.processRow(HiveOutput.java:73)
2020/11/11 14:00:19 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2020/11/11 14:00:19 - at java.lang.Thread.run(Thread.java:745)
2020/11/11 14:00:19 - Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException Encountered exception determining schema. Returning signal schema to indicate problem: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: was expecting closing '"' for name
2020/11/11 14:00:19 - at [Source: java.io.StringReader@50bf3cc6; line: 1, column: 6001])
2020/11/11 14:00:19 - at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:262)
2020/11/11 14:00:19 - at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:248)
2020/11/11 14:00:19 - at org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:297)
2020/11/11 14:00:19 - at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:238)
2020/11/11 14:00:19 - at sun.reflect.GeneratedMethodAccessor74.invoke(Unknown Source)
2020/11/11 14:00:19 - at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2020/11/11 14:00:19 - at java.lang.reflect.Method.invoke(Method.java:498)
2020/11/11 14:00:19 - at org.pentaho.hadoop.shim.common.DriverProxyInvocationChain$CaptureResultSetInvocationHandler.invoke(DriverProxyInvocationChain.java:596)
2020/11/11 14:00:19 - at com.sun.proxy.$Proxy58.execute(Unknown Source)
2020/11/11 14:00:19 - at org.pentaho.di.core.database.Database.execStatement(Database.java:1544)
2020/11/11 14:00:19 - ... 5 more
I have run into this problem several times. The cause is that the PARAM_VALUE column of the TABLE_PARAMS table in the Hive metastore database is too short, so the schema of a table with many columns gets truncated when it is stored. When the INSERT INTO statement runs, Hive reads back the truncated schema, and the JSON parser fails mid-string, producing the error above.
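You can confirm the truncation by querying the metastore directly. This is a sketch assuming a MySQL-backed metastore and an Avro-style table whose schema is stored under the `avro.schema.literal` key (the key name may differ depending on your SerDe; the table name `my_wide_table` is a placeholder):

```sql
-- Find the stored schema length for the suspect table.
-- If the length sits exactly at the column limit (e.g. 4000),
-- the schema JSON was almost certainly cut off.
SELECT p.PARAM_KEY,
       CHAR_LENGTH(p.PARAM_VALUE) AS value_len
FROM TABLE_PARAMS p
JOIN TBLS t ON p.TBL_ID = t.TBL_ID
WHERE t.TBL_NAME = 'my_wide_table'
  AND p.PARAM_KEY = 'avro.schema.literal';
```

A `value_len` of exactly 4000 (the default `varchar(4000)` width) is the telltale sign that the schema was truncated on write.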
Solution:
In your big-data platform's Hive configuration, find the connection settings for the metastore database. Connect to that database, locate the TABLE_PARAMS table, and change the type of its PARAM_VALUE column from varchar(4000) to longtext. Then recreate the Hive table; its schema will no longer be truncated.
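The column change described above can be sketched as the following DDL, assuming a MySQL metastore (the database name `hive` is an assumption; use whatever schema your metastore actually lives in):

```sql
-- Widen the metastore column that stores table parameters
-- (including serialized schemas) so long schemas are not cut off.
USE hive;
ALTER TABLE TABLE_PARAMS MODIFY PARAM_VALUE LONGTEXT;
```

Note that widening the column only prevents future truncation: any schema already stored truncated stays truncated, which is why the affected Hive table must be dropped and recreated afterwards.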
Recording this here for the benefit of whoever hits it next.