Hadoop Hive MySQL on Ubuntu 20.04


Overall Steps

Step 1: Set hive-env.sh

Copy the template file to create hive-env.sh:

cp conf/hive-env.sh.template conf/hive-env.sh

Then edit it to set the Hadoop path. The template marks the spot with this comment:

# Set HADOOP_HOME to point to a specific hadoop install directory
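Below that comment you add the export itself. The install path here is an assumption, so substitute your own:

```shell
# Assumed Hadoop install location; adjust to wherever your Hadoop actually lives
export HADOOP_HOME=/home/hadoop/hadoop
```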

Step 2: Set hive-site.xml

First, copy the template to create hive-site.xml:

cp conf/hive-site.xml.template conf/hive-site.xml

Then open it and edit the following values. This configures Hive to use MySQL as the metastore.

The four relevant properties are:

javax.jdo.option.ConnectionURL — JDBC connect string for a JDBC metastore. To use SSL to encrypt/authenticate the connection, provide a database-specific SSL flag in the connection URL (for example, jdbc:postgresql://myhost/db?ssl=true for a Postgres database).
javax.jdo.option.ConnectionDriverName — Driver class name for a JDBC metastore
javax.jdo.option.ConnectionUserName — Username to use against metastore database
javax.jdo.option.ConnectionPassword — Password to use against metastore database
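Putting it together, the relevant hive-site.xml block looks roughly like this. The connection URL, the database name (metastore), and the credentials are assumptions that must match your own setup (the user and password here are the ones created in Step 3):

```xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.cj.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>Hive111!!!</value>
</property>
```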

Step 3: MySQL set up

Assuming MySQL is already up and running, create the Hive user:

CREATE USER 'hiveuser'@'localhost' IDENTIFIED BY 'Hive111!!!';
GRANT ALL PRIVILEGES ON *.* TO 'hiveuser'@'localhost';

Note that this is the password for the Hive metastore, not for a Hive session.

It should match the javax.jdo.option.ConnectionPassword value in Step 2.

Step 4: MySQL JDBC Connector

Download the MySQL Connector/J JAR and copy it into the Hive lib folder.

The download can be found here:

Then wget it to a convenient location and copy the JAR into the Hive path:

cp mysql-connector-java-8.0.28.jar /home/hadoop/hive/lib/

I tried setting CLASSPATH in several locations, but it didn't seem to work, so just copy the JAR.

Step 5: Hive Schema Initialization Tool

schematool -initSchema -dbType mysql

If it succeeds, you should see the Hive metastore tables created in MySQL, similar to the output below.
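One way to verify, assuming the metastore database is named metastore as in the JDBC URL, is to list the tables on the MySQL side:

```shell
# List the metastore tables Hive just created (database name is an assumption)
mysql -u hiveuser -p -e 'USE metastore; SHOW TABLES;'
```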

Step 6: Create Hive default HDFS locations

hdfs dfs -mkdir /tmp
hdfs dfs -mkdir -p /user/hive/warehouse
hdfs dfs -chmod g+w /tmp
hdfs dfs -chmod g+w /user/hive/warehouse

Step 7: Checkpoint with the Hive command line

Now run the Hive command line.
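The command itself did not survive extraction; presumably it is just the interactive Hive CLI (an assumption):

```shell
# Start the interactive Hive CLI
hive
```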


You might run into the issues below:

Possible Issue #1: Illegal character

Open hive-site.xml and go to line 3218, the line the error message points at.

There you can see the &#8 sequence that produces the error; delete it.

Possible Issue #2: ClassCastException

This is because Hive is still based on JDK 8 and does not support JDK 11 yet.
References here:

Downgrade to JDK 8, and don't forget to change the environment variable in both Hadoop and Hive:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

You can find your JAVA_HOME with readlink -f $(which java)

Possible Issue #3: tmpdir Relative path in absolute URI

Add the following properties near the beginning of the hive-site.xml file (inside <configuration>):
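The snippet is missing here; the commonly used fix for this error is to pin Hive's tmpdir variables to an absolute path (the /tmp/hive location is a typical choice, not necessarily what the author used):

```xml
<property>
  <name>system:java.io.tmpdir</name>
  <value>/tmp/hive</value>
</property>
<property>
  <name>system:user.name</name>
  <value>${user.name}</value>
</property>
```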



Possible Warning: SLF4J Class path contains multiple SLF4J bindings.

You can ignore this, but if it's annoying you can resolve it by removing the duplicated JAR.

Remove lib/log4j-slf4j-impl-2.10.0.jar in the Hive directory
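Assuming Hive is installed at /home/hadoop/hive, as in the earlier copy command, that is:

```shell
# Drop Hive's bundled SLF4J binding so only Hadoop's remains
rm /home/hadoop/hive/lib/log4j-slf4j-impl-2.10.0.jar
```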


Note: I only removed the slf4j JAR, but the thread above suggests removing two JARs.

Step 8: Running Hive Server

hive --service metastore &
hive --service hiveserver2 &

These run in the background. If everything looks good, you can disown them to prevent them from shutting down when your SSH session ends:

[1] ... metastore
[2] ... hiveserver2
disown %1
disown %2

Step 9: Connecting to Hive

Don't forget to change the port to 10001, since by default Hive is on 10000.

It will take some time after starting the Hive server before you can connect.
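The connection command did not survive extraction; with beeline it would look roughly like this (host and user are assumptions):

```shell
# Connect to HiveServer2 over JDBC; adjust host, port, and user to your setup
beeline -u jdbc:hive2://localhost:10001 -n hiveuser
```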

Possible Issues: Required field ‘serverProtocolVersion’ is unset!

Go back to the server and kill all Hive processes.

Edit the hive-site.xml with the following property:
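The property itself is missing here. A commonly cited fix for this error is disabling HiveServer2 impersonation; this is an assumption about what the author set, so verify it against your own error:

```xml
<property>
  <name>hive.server2.enable.doAs</name>
  <value>false</value>
</property>
```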


Run step 7 again.

Finally we’re done

I wish this were easier.

I've done this three times already, and it's painful every time.

So I'm writing this for my future self in case I have to do it again, and also for anyone on the internet.

Hope this helps you save some time.


