Hive 0.13.1 on Hadoop 2.2.0 + Oracle 10g: A Detailed Deployment Guide

Environment: Java 1.7.0_60 and Oracle 10g (used for the metastore), on top of an existing Hadoop 2.2.0 installation.

Download the Hive 0.13.1 installation package.

Extract the package on the server to the following directory:

/home/fulong/Hive/apache-hive-0.13.1-bin
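
A minimal sketch of the extraction step, assuming the downloaded tarball is named apache-hive-0.13.1-bin.tar.gz and has been placed in /home/fulong/Hive:

# extract the tarball in place; this creates /home/fulong/Hive/apache-hive-0.13.1-bin
cd /home/fulong/Hive
tar -zxvf apache-hive-0.13.1-bin.tar.gz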

Update the environment variables by adding the following lines:

export HIVE_HOME=/home/fulong/Hive/apache-hive-0.13.1-bin

export PATH=$HIVE_HOME/bin:$PATH
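
To make these settings permanent, they can be appended to the shell profile (assuming a bash login shell) and reloaded:

# append the two export lines to ~/.bashrc, then reload it in the current session
echo 'export HIVE_HOME=/home/fulong/Hive/apache-hive-0.13.1-bin' >> ~/.bashrc
echo 'export PATH=$HIVE_HOME/bin:$PATH' >> ~/.bashrc
source ~/.bashrc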

Go into the conf directory and copy the template configuration files to their working names:

fulong@FBI006:~/Hive/apache-hive-0.13.1-bin/conf$ ls

hive-default.xml.template  hive-exec-log4j.properties.template

hive-env.sh.template       hive-log4j.properties.template

fulong@FBI006:~/Hive/apache-hive-0.13.1-bin/conf$ cp hive-env.sh.template hive-env.sh

fulong@FBI006:~/Hive/apache-hive-0.13.1-bin/conf$ cp hive-default.xml.template hive-site.xml

fulong@FBI006:~/Hive/apache-hive-0.13.1-bin/conf$ ls

hive-default.xml.template  hive-env.sh.template                 hive-log4j.properties.template

hive-env.sh               hive-exec-log4j.properties.template  hive-site.xml

Edit the following entries in hive-env.sh to specify the Hadoop root directory, Hive's conf directory, and Hive's lib directory:

# Set HADOOP_HOME to point to a specific hadoop install directory

HADOOP_HOME=/home/fulong/Hadoop/hadoop-2.2.0

 

# Hive Configuration Directory can be controlled by:

export HIVE_CONF_DIR=/home/fulong/Hive/apache-hive-0.13.1-bin/conf

 

# Folder containing extra ibraries required for hive compilation/execution can be controlled by:

export HIVE_AUX_JARS_PATH=/home/fulong/Hive/apache-hive-0.13.1-bin/lib

Edit the following Oracle-related connection parameters in hive-site.xml:

<property>

  <name>javax.jdo.option.ConnectionURL</name>

  <value>jdbc:oracle:thin:@192.168.0.138:1521:orcl</value>

  <description>JDBC connect string for a JDBC metastore</description>

</property>

 

<property>

  <name>javax.jdo.option.ConnectionDriverName</name>

  <value>oracle.jdbc.driver.OracleDriver</value>

  <description>Driver class name for a JDBC metastore</description>

</property>

 

<property>

  <name>javax.jdo.option.ConnectionUserName</name>

  <value>hive</value>

  <description>username to use against metastore database</description>

</property>

 

<property>

  <name>javax.jdo.option.ConnectionPassword</name>

  <value>hivefbi</value>

  <description>password to use against metastore database</description>

</property>
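
The user name and password above must correspond to an existing Oracle account with enough privileges to create the metastore tables. Below is a minimal sketch of creating such an account, run as a DBA on the Oracle host; the tablespace and privilege set are assumptions to adapt to your installation:

# create the metastore account referenced in hive-site.xml (run on the Oracle server as a DBA)
sqlplus / as sysdba <<'SQL'
CREATE USER hive IDENTIFIED BY hivefbi DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;
GRANT CONNECT, RESOURCE TO hive;
EXIT;
SQL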

 

Configure log4j

Create a log4j directory under $HIVE_HOME to hold the log files.
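
For example (the path matches the layout used throughout this guide):

# directory that hive.log.dir will point to below
mkdir -p /home/fulong/Hive/apache-hive-0.13.1-bin/log4j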

Copy the template and rename it:

fulong@FBI006:~/Hive/apache-hive-0.13.1-bin/conf$ cp hive-log4j.properties.template hive-log4j.properties

 

Change the directory where logs are stored:

hive.log.dir=/home/fulong/Hive/apache-hive-0.13.1-bin/log4j
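
Editing hive-log4j.properties by hand works fine; as a non-interactive alternative, the property can be rewritten in place (a sketch of one way to do it):

# replace the default hive.log.dir with the new log4j directory
sed -i 's#^hive.log.dir=.*#hive.log.dir=/home/fulong/Hive/apache-hive-0.13.1-bin/log4j#' hive-log4j.properties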

Copy the Oracle JDBC jar

Copy the JDBC driver jar that matches your Oracle version into $HIVE_HOME/lib.
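
A sketch of this step, assuming the Oracle 10g driver ojdbc14.jar has already been downloaded to the current directory (the exact jar name depends on the Oracle and JDK versions):

# make the Oracle JDBC driver visible to Hive
cp ojdbc14.jar /home/fulong/Hive/apache-hive-0.13.1-bin/lib/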

Start Hive

fulong@FBI006:~/Hive/apache-hive-0.13.1-bin$ hive

14/08/20 17:14:05 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces

14/08/20 17:14:05 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize

14/08/20 17:14:05 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative

14/08/20 17:14:05 INFO Configuration.deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node

14/08/20 17:14:05 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive

14/08/20 17:14:05 INFO Configuration.deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack

14/08/20 17:14:05 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize

14/08/20 17:14:05 INFO Configuration.deprecation: mapred.committer.job.setup.cleanup.needed is deprecated. Instead, use mapreduce.job.committer.setup.cleanup.needed

14/08/20 17:14:05 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect.  Use hive.hmshandler.retry.* instead

 

Logging initialized using configuration in file:/home/fulong/Hive/apache-hive-0.13.1-bin/conf/hive-log4j.properties
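
Once the CLI prompt appears, a quick smoke test confirms that Hive can reach the Oracle metastore; the table name below is only an example, and creating and dropping it forces Hive to write to the metastore schema:

hive> show databases;
hive> create table metastore_smoke_test (id int, name string);
hive> show tables;
hive> drop table metastore_smoke_test;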
