Miscellaneous
Spark on Hive & Hive on Spark: easy to mix up
Spark on Hive: Spark SQL does the parsing, optimization, and execution, and uses Hive only for its metadata by connecting to the Hive MetaStore service. This is the setup described below.
Hive on Spark: Hive stays the entry point and still parses and optimizes the SQL, but swaps its execution engine from MapReduce to Spark.
The Spark documentation sums up the integration in one sentence: "Configuration of Hive is done by placing your hive-site.xml, core-site.xml (for security configuration), and hdfs-site.xml (for HDFS configuration) file in conf/."
Let's get rolling
1. Start the Hive MetaStore service
<1> Edit hive/conf/hive-site.xml and add the following properties:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <!-- HDFS directory where managed Hive tables are stored -->
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>
  <!-- Deprecated since Hive 0.10 (setting hive.metastore.uris already
       implies a remote metastore); kept here for older setups -->
  <property>
    <name>hive.metastore.local</name>
    <value>false</value>
  </property>
  <!-- Thrift endpoint of the MetaStore service that Spark will connect to -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://node01:9083</value>
  </property>
</configuration>
<2> Start the Hive MetaStore service in the background:
# redirect stdout to the log first, then send stderr to the same place
nohup /export/servers/hive/bin/hive --service metastore >> /var/log.log 2>&1 &
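Before wiring up Spark, you can confirm the service is reachable with Hive's own client API. A minimal sketch, assuming the hive-metastore artifact matching your Hive version (1.1.0-cdh5.14.0 here) is on the classpath; the object name MetaStorePing is just illustrative:

import org.apache.hadoop.hive.conf.HiveConf
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient

object MetaStorePing {
  def main(args: Array[String]): Unit = {
    val conf = new HiveConf()
    // Point the client at the Thrift endpoint started above
    conf.set("hive.metastore.uris", "thrift://node01:9083")
    val client = new HiveMetaStoreClient(conf)
    // A successful call proves the MetaStore service is up
    println(client.getAllDatabases) // e.g. [default]
    client.close()
  }
}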
2. Hand the configuration files to Spark
Spark needs three files copied into its conf/ directory:
- hive-site.xml: metastore location (warehouse directory, metastore URI) and related settings
- core-site.xml: security-related configuration
- hdfs-site.xml: HDFS-related configuration
cp /export/servers/hive-1.1.0-cdh5.14.0/conf/hive-site.xml /export/servers/spark/conf
cp /export/servers/hadoop-2.6.0-cdh5.14.0/etc/hadoop/core-site.xml /export/servers/spark/conf
cp /export/servers/hadoop-2.6.0-cdh5.14.0/etc/hadoop/hdfs-site.xml /export/servers/spark/conf
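With the three files in place, a quick sanity check from spark-shell works with no extra configuration, since the shell's pre-built spark session picks up conf/hive-site.xml automatically (the tables listed will depend on your metastore):

// In spark-shell: the built-in `spark` session reads conf/hive-site.xml,
// so Hive databases and tables are visible right away
spark.sql("show databases").show()
spark.sql("show tables").show()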
Full throttle
Operating on Hive tables with SparkSQL
import org.apache.spark.sql.SparkSession

object HiveSupport {
  def main(args: Array[String]): Unit = {
    // Create the SparkSession
    val spark = SparkSession
      .builder()
      .appName("HiveSupport")
      .master("local[*]")
      .config("spark.sql.warehouse.dir", "hdfs://node01:8020/user/hive/warehouse")
      .config("hive.metastore.uris", "thrift://node01:9083")
      .enableHiveSupport() // enable Hive (HiveQL) support
      .getOrCreate()

    // Reduce log noise
    spark.sparkContext.setLogLevel("WARN")

    // List existing tables
    spark.sql("show tables").show()

    // Create a table (IF NOT EXISTS keeps the job re-runnable)
    spark.sql("CREATE TABLE IF NOT EXISTS person (id int, name string, age int) row format delimited fields terminated by ' '")

    // Load data from person.txt in the current SparkDemo project directory (same level as src)
    spark.sql("LOAD DATA LOCAL INPATH 'in/person.txt' INTO TABLE person")

    // Query the data
    spark.sql("select * from person").show()

    spark.stop()
  }
}
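For this to compile, the project needs the spark-hive module on top of spark-sql (enableHiveSupport fails at runtime without it). A minimal build.sbt sketch; the version numbers are assumptions, so align them with your cluster's Spark and Scala versions:

// build.sbt (versions are placeholders; match your cluster)
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"  % "2.2.0",
  "org.apache.spark" %% "spark-hive" % "2.2.0"
)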
Since the metastore is shared, the same tables can be inspected from the Hive CLI:
hive (default)> show tables;
OK
tab_name
student
techer
techer2
Time taken: 0.738 seconds, Fetched: 3 row(s)
hive (default)>
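Once a table is registered in the metastore, it can also be consumed as a regular DataFrame from the same SparkSession; a small sketch (the age-based filter is just an illustration):

// Reuse the `spark` session from HiveSupport above
val people = spark.table("person")   // Hive table as a DataFrame
people.filter("age > 18").show()     // SQL-style condition string
people.groupBy("age").count().show() // aggregate like any DataFrame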