Create the data file emp_data.txt
1201,gopal,manager,50000,TP
1202,manisha,preader,50000,TP
1203,kalil,php dev,30000,AC
1204,prasanth,php dev,30000,AC
1205,kranthi,admin,20000,TP
1206,satish p,grp des,20000,GR
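Each comma-separated line corresponds to the columns id, name, deg, salary, dept of the employee table created below. One way to create the file locally (a sketch; it assumes the file is saved as emp_data.txt in the directory from which the upload command below is run):

cat > emp_data.txt <<'EOF'
1201,gopal,manager,50000,TP
1202,manisha,preader,50000,TP
1203,kalil,php dev,30000,AC
1204,prasanth,php dev,30000,AC
1205,kranthi,admin,20000,TP
1206,satish p,grp des,20000,GR
EOF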
Upload the data file to the cluster
Create a directory on HDFS
hadoop fs -mkdir -p /user/hadoop/emp
Upload the data file to HDFS
hadoop fs -put emp_data.txt /user/hadoop/emp
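Optionally verify that the file landed in the target directory before running the export:

hadoop fs -ls /user/hadoop/emp
hadoop fs -cat /user/hadoop/emp/emp_data.txt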
Log in to MySQL
mysql> create database userdb;
Query OK, 1 row affected (0.00 sec)
mysql> grant all privileges on userdb.* to 'sqoop'@'%' identified by 'sqoop';
Query OK, 0 rows affected (0.15 sec)
mysql> grant all privileges on userdb.* to 'sqoop'@'localhost' identified by 'sqoop';
Query OK, 0 rows affected (0.00 sec)
mysql> grant all privileges on userdb.* to 'sqoop'@'node1' identified by 'sqoop';
Query OK, 0 rows affected (0.00 sec)
mysql> flush privileges;
Query OK, 0 rows affected (0.01 sec)
mysql> use userdb
Database changed
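To confirm that the account can reach the database from the host used in the JDBC URL below, the grants can be checked, for example:

mysql> show grants for 'sqoop'@'node1';

Note that the GRANT ... IDENTIFIED BY form shown above implicitly creates the account on MySQL 5.x; on MySQL 8.0 the user must be created with CREATE USER before privileges are granted.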
Create a table in MySQL
CREATE TABLE employee (
id INT NOT NULL PRIMARY KEY,
name VARCHAR(20),
deg VARCHAR(20),
salary INT,
dept VARCHAR(10));
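To double-check the table structure (the column order should match the field order in emp_data.txt, since the export command below does not specify --columns):

mysql> desc employee;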
Now export the data from HDFS into the corresponding table in MySQL.
Run the following command:
bin/sqoop export \
--connect jdbc:mysql://node1:3306/userdb \
--username sqoop \
--password sqoop \
--table employee \
--export-dir /user/hadoop/emp/ \
--input-fields-terminated-by ','
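Each line under /user/hadoop/emp/ is split on ',' and turned into an INSERT into the employee table. For a small file like this one, the number of map tasks can also be pinned to 1 and the column mapping made explicit; a variant of the same command, assuming the identical connection details:

bin/sqoop export \
--connect jdbc:mysql://node1:3306/userdb \
--username sqoop \
--password sqoop \
--table employee \
--columns "id,name,deg,salary,dept" \
--export-dir /user/hadoop/emp/ \
--input-fields-terminated-by ',' \
-m 1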
Check the data in the MySQL table
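For example:

mysql> use userdb;
mysql> select * from employee;

If the export succeeded, the six rows with ids 1201 through 1206 should be returned.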