While testing Drill against several data sources a few days ago, I noticed some problems with Chinese characters, so I tested them more systematically. In short, only MongoDB is actually broken; the Hadoop-family sources work, but need an explicit conversion.
Summary of Drill Chinese-character testing:
Configuration change:
vi apache-drill-1.0.0/conf/drill-env.sh
export DRILL_SHELL_JAVA_OPTS="-Dsaffron.default.charset=UTF-16LE"
Error message when this is not configured:
Query Failed: An Error Occurred
org.apache.drill.common.exceptions.UserRemoteException: SYSTEM ERROR: CalciteException: Failed to encode '用户' in character set 'ISO-8859-1' [Error Id: 886ec693-fbf3-4413-8c7a-f7161533de4a on sengtest:31010]
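The error occurs because Calcite's default character set, ISO-8859-1, only covers U+0000..U+00FF and cannot represent CJK characters, while UTF-16LE can. A minimal Python sketch (independent of Drill) of exactly this failure:

```python
# '用' (U+7528) and '户' (U+6237) fall outside ISO-8859-1's U+00FF range,
# so encoding the literal must fail -- the same condition behind Calcite's
# "Failed to encode '用户' in character set 'ISO-8859-1'" error.
text = "用户"

try:
    text.encode("iso-8859-1")
    encodable = True
except UnicodeEncodeError:
    encodable = False

print(encodable)                      # False: ISO-8859-1 cannot hold CJK
print(len(text.encode("utf-16-le")))  # 4: two BMP chars x 2 bytes in UTF-16LE
```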
For details, see:
https://issues.apache.org/jira/browse/DRILL-4039
http://nagix.hatenablog.com/entry/2015/05/29/150215
Sample test statement:
select * from posgres.public.pet t where t.name = '用户';
Note that you can also use a charset introducer to declare the literal's character set:
select * from posgres.public.pet t where t.name = _UTF16'用户';
Test results:
postgres ok
oracle ok
mongodb has problems
hdfs ok
hive ok
hbase ok
Detailed test statements:
--postgres
select * from posgres.public.pet t where t.name = '用户' LIMIT 10;
--oracle
select * from oracle.BI_ADMIN.pet t where t.name = '用户' LIMIT 10;
--postgres join oracle
select a.name,c.name from oracle.BI_ADMIN.pet a join posgres.public.pet c on (a.name = c.name ) LIMIT 10;
--mongo
select * from mongo.scrapdb.weather_curr LIMIT 10;
--hive
select * from hive.test.trackingtable t where CONVERT_FROM(t.page_name, 'UTF8') = '后台系统' limit 10;
--hdfs
SELECT columns[0],columns[1] FROM hdfs.`/BASEDATA/MASTERDATA/test.csv` where columns[0] = '是是是' LIMIT 10;
--hbase
SELECT CONVERT_FROM(row_key, 'UTF8') AS studentid, CONVERT_FROM(students.account.name, 'UTF8') AS name FROM hbase.students where CONVERT_FROM(students.account.name, 'UTF8') = '爱丽丝' limit 10;
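The Hive and HBase queries above need CONVERT_FROM(col, 'UTF8') because those sources hand Drill raw bytes rather than decoded strings. This Python sketch (not Drill code) mirrors that round trip for the HBase test value:

```python
# HBase stores cell values as raw bytes; Drill's CONVERT_FROM(col, 'UTF8')
# decodes those bytes as UTF-8 text. Simulate the round trip for '爱丽丝':
raw = "爱丽丝".encode("utf-8")   # the bytes as HBase would store them
print(len(raw))                  # 9: three CJK chars x 3 UTF-8 bytes each
print(raw.decode("utf-8"))       # 爱丽丝 -- what CONVERT_FROM yields back
```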