HadoopExam Learning Resources


How to create permanent tables in spark-sql

I'm creating tables in Spark using the following commands, but these tables are available only for that session. Can anyone please tell me how to create permanent tables in spark-sql that will be available across all sessions?

// needed for toDF() in the shell (Spark 1.3+)
import sqlContext.implicits._

// define the schema via a case class
case class Person(name: String, age: Int)

// build a DataFrame from a comma-separated text file
val people = sc.textFile("person.txt").map(_.split(",")).map(p => Person(p(0), p(1).trim.toInt)).toDF()

// save the DataFrame as a Parquet file so it can be re-read later
people.saveAsParquetFile("people.parquet")

// read the Parquet file back and register it as a (temporary) table
val parquetFile = sqlContext.parquetFile("people.parquet")
parquetFile.registerTempTable("parquetFile")

val names = sqlContext.sql("SELECT name FROM parquetFile")
names.map(t => "Name: " + t(0)).collect().foreach(println)
After executing these commands I am able to see the people.parquet file from a different session as well, but my requirement is to store the parquetFile table permanently. Please suggest how to do it.
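The table disappears because registerTempTable only creates a session-scoped temporary table. A minimal sketch of one way to make it permanent, assuming a Spark build with Hive support and a configured Hive metastore (hive-site.xml on the classpath); the table and file names are illustrative:

```scala
// Sketch only (Spark 1.x API): requires Hive support and a metastore.
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
import hiveContext.implicits._

case class Person(name: String, age: Int)
val people = sc.textFile("person.txt")
  .map(_.split(","))
  .map(p => Person(p(0), p(1).trim.toInt))
  .toDF()

// saveAsTable records the table in the Hive metastore, so it is
// visible to any later session (Scala shell or spark-sql shell)
people.saveAsTable("people")

// in a new session:
hiveContext.sql("SELECT name FROM people").collect().foreach(println)
```

Because the metastore holds the table definition, any session that points at the same metastore can query it.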


You are not using the spark-sql shell here; you are using the Spark (Scala) shell.


Use the spark-sql shell (formerly known as Shark); refer to the steps below.

Yes, I'm executing those commands in the Scala console. I'm not able to access the spark-sql console; when I run the spark-sql command from the terminal it says "command not found". Can you tell me what the reason for this might be?

Try this:

Step 1: Locate hive-site.xml and the Spark installation (if you can't find them on your VM, use the UNIX find command):
find . -name 'hive-site.xml'
find . -name 'spark'
Step 2: Copy hive-site.xml into Spark's conf directory:
cp /etc/hive/conf/hive-site.xml /usr/lib/spark/conf/
Step 3: Launch the spark-sql shell:
/usr/lib/spark/bin/spark-sql --master local
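Once the spark-sql shell starts, any table created there is registered in the Hive metastore and therefore survives across sessions. A hedged sketch (table name and column types are illustrative, assuming a Hive version that supports Parquet storage):

```sql
-- hypothetical example inside the spark-sql shell
CREATE TABLE people_permanent (name STRING, age INT) STORED AS PARQUET;

-- this table can now be queried from any later spark-sql session
SELECT name FROM people_permanent;
```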

Thanks for your replies. I have copied the hive-site.xml file to spark/conf and added the Hive jars to the Spark classpath (set in the compute-classpath.sh file), but I'm still not able to get the spark-sql console.

