Installing Sqoop 1.99.3
1. Download:
Hadoop version: hadoop 2.2.0
Sqoop build: sqoop-1.99.3-bin-hadoop200 (the build for Hadoop 2.x):
http://www.us.apache.org/dist/sqoop/1.99.3/sqoop-1.99.3-bin-hadoop200.tar.gz
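If you prefer to fetch it from the shell (the /opt/hn/hadoop_family directory below is just the download location used in the next step):
hadoop@hadoopMaster:~$ wget -P /opt/hn/hadoop_family http://www.us.apache.org/dist/sqoop/1.99.3/sqoop-1.99.3-bin-hadoop200.tar.gz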
2. Extract and move into place:
hadoop@hadoopMaster:$ sudo tar -xvf /opt/hn/hadoop_family/sqoop-1.99.3-bin-hadoop200.tar.gz
hadoop@hadoopMaster:~$ sudo mv /opt/hn/hadoop_family/sqoop-1.99.3-bin-hadoop200 /usr/local/sqoop
3. Set environment variables:
hadoop@hadoopMaster:~$ sudo vim /etc/profile
Append the following:
#sqoop
export SQOOP_HOME=/usr/local/sqoop
export PATH=$SQOOP_HOME/bin:$PATH
export CATALINA_HOME=$SQOOP_HOME/server
export LOGDIR=$SQOOP_HOME/logs
Then reload the profile so the variables take effect:
source /etc/profile
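A quick sanity check that the variables took effect:
hadoop@hadoopMaster:~$ echo $SQOOP_HOME        # should print /usr/local/sqoop
hadoop@hadoopMaster:~$ echo $CATALINA_HOME     # should print /usr/local/sqoop/server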
4. Configure the Sqoop server:
hadoop@hadoopMaster:~$ vim /usr/local/sqoop/server/conf/sqoop.properties
# Point this at the Hadoop configuration directory
org.apache.sqoop.submission.engine.mapreduce.configuration.directory=/usr/local/hadoop/
# Put the Hadoop jars on the server classpath
hadoop@hadoopMaster:~$ vim /usr/local/sqoop/server/conf/catalina.properties
common.loader=/usr/local/hadoop/share/hadoop/common/*.jar,/usr/local/hadoop/share/hadoop/common/lib/*.jar,/usr/local/hadoop/share/hadoop/hdfs/*.jar,/usr/local/hadoop/share/hadoop/hdfs/lib/*.jar,/usr/local/hadoop/share/hadoop/mapreduce/*.jar,/usr/local/hadoop/share/hadoop/mapreduce/lib/*.jar,/usr/local/hadoop/share/hadoop/tools/*.jar,/usr/local/hadoop/share/hadoop/tools/lib/*.jar,/usr/local/hadoop/share/hadoop/yarn/*.jar,/usr/local/hadoop/share/hadoop/yarn/lib/*.jar,/usr/local/hadoop/share/hadoop/httpfs/tomcat/lib/*.jar
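The common.loader line is easy to mistype; a small illustrative shell loop can generate it from the same /usr/local/hadoop layout used above:
# Illustrative helper: print the common.loader value instead of typing it by hand.
hadoop@hadoopMaster:~$ H=/usr/local/hadoop/share/hadoop
hadoop@hadoopMaster:~$ echo "common.loader=$(for d in common common/lib hdfs hdfs/lib mapreduce mapreduce/lib tools tools/lib yarn yarn/lib; do printf '%s/%s/*.jar,' "$H" "$d"; done)$H/httpfs/tomcat/lib/*.jar"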
5. Download the MySQL JDBC driver:
mysql-connector-java-5.1.16-bin.jar
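One common approach is to drop the driver jar into the server's lib directory so the embedded Tomcat picks it up (the source path here is illustrative):
hadoop@hadoopMaster:~$ sudo cp /opt/hn/hadoop_family/mysql-connector-java-5.1.16-bin.jar /usr/local/sqoop/server/lib/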
6. Start/stop the Sqoop (sqoop200) server:
hadoop@hadoopMaster:/usr/local/sqoop/bin$ ./sqoop.sh server start/stop
If startup fails, check the server log:
hadoop@hadoopMaster:/usr/local/sqoop/server/logs$ vim catalina.out
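You can also probe the server's REST endpoint to confirm it is up; if the server started cleanly, this should return a short JSON document with the version information shown later:
hadoop@hadoopMaster:~$ curl http://hadoopMaster:12000/sqoop/version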
7. Launch the interactive client:
hadoop@hadoopMaster:/usr/local/sqoop/bin$ ./sqoop.sh client
+------------------------------------------+
|Sqoop home directory: /usr/local/sqoop |
|Sqoop Shell: Type 'help' or '\h' for help.|
|sqoop:000> |
+------------------------------------------+
Point the client at the server:
+---------------------------------------------------------------------+
|sqoop:000> set server --host hadoopMaster --port 12000 --webapp sqoop|
|Server is set successfully |
+---------------------------------------------------------------------+
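To double-check where the client points, show server can be used (output sketched from the settings above):
+------------------------------------------+
|sqoop:000> show server --all              |
|Server host: hadoopMaster                 |
|Server port: 12000                        |
|Server webapp: sqoop                      |
+------------------------------------------+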
Check the client and server versions:
+-----------------------------------------------------------------+
|sqoop:000> show version --all |
|client version: |
| Sqoop 1.99.3 revision 2404393160301df16a94716a3034e31b03e27b0b |
| Compiled by mengweid on Fri Oct 18 14:15:53 EDT 2013 |
|server version: |
| Sqoop 1.99.3 revision 2404393160301df16a94716a3034e31b03e27b0b |
| Compiled by mengweid on Fri Oct 18 14:15:53 EDT 2013 |
|Protocol version: |
| [1] |
+-----------------------------------------------------------------+
List the available connectors:
+---------------------------------------------------------------------------------------------+
|sqoop:000> show connector --all |
|1 connector(s) to show: |
|Connector with id 1: |
| Name: generic-jdbc-connector |
| Class: org.apache.sqoop.connector.jdbc.GenericJdbcConnector |
| Version: 1.99.3 |
| Supported job types: [IMPORT, EXPORT] |
| Connection form 1: |
| Name: connection |
| Label: Connection configuration |
| Help: You must supply the information requested in order to create a connection object.|
| Input 1: |
| ... (remaining inputs omitted) ...                                                          |
+---------------------------------------------------------------------------------------------+
Create a connection to MySQL:
+---------------------------------------------------------------------------------------------+
|sqoop:000> create connection --cid 1 |
|Creating connection for connector with id 1 |
|Please fill following values to create new connection object |
|Name: My first |
| |
|Connection configuration |
| |
|JDBC Driver Class: com.mysql.jdbc.Driver |
|JDBC Connection String: jdbc:mysql://localhost:3306/sqoop_stu |
|Username: root |
|Password: ********** |
|JDBC Connection Properties: |
|There are currently 0 values in the map: |
|entry# |
| |
|Security related configuration options |
| |
|Max connections: 100 |
|New connection was successfully created with validation status FINE and persistent id 1 |
+---------------------------------------------------------------------------------------------+
+------------------------------------------------------------------------------------+
|sqoop:001> create job --xid 1 --type import |
|Creating job for connection with id 1 |
|Please fill following values to create new job object |
|Name: First job |
| |
|Database configuration |
| |
|Schema name: traceweb |
|Table name: trace_web_application |
|Table SQL statement: |
|Table column names: |
|Partition column name: |
|Nulls in partition column: |
|Boundary query: |
| |
|Output configuration |
| |
|Storage type: |
| 0 : HDFS |
|Choose: 0 |
|Output format: |
| 0 : TEXT_FILE |
| 1 : SEQUENCE_FILE |
|Choose: 1 |
|Compression format: |
| 0 : NONE |
| 1 : DEFAULT |
| 2 : DEFLATE |
| 3 : GZIP |
| 4 : BZIP2 |
| 5 : LZO |
| 6 : LZ4 |
| 7 : SNAPPY |
|Choose: 0 |
|Output directory: /opt/sqoop_output |
| |
|Throttling resources |
| |
|Extractors: |
|Loaders: |
|New job was successfully created with validation status FINE and persistent id 1 |
+------------------------------------------------------------------------------------+
Start the job:
+------------------------------------------------
|sqoop:000> start job --jid 1
+------------------------------------------------
Check the job status:
+------------------------------------------------
|sqoop:000> status job --jid 1
|Submission details
|Job ID: 1
|Server URL: http://hadoopMaster:12000/sqoop/
|Created by: hadoop
|Creation date: 2014-05-23 18:51:05 CST
|Lastly updated by: hadoop
|External ID: job_local1566994033_0001
| http://localhost:8080/
|2014-05-23 18:51:35 CST: UNKNOWN
+------------------------------------------------
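A running job can be stopped from the same shell (job id 1 as above):
+------------------------------------------------
|sqoop:000> stop job --jid 1
+------------------------------------------------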
Inspect the local output directory:
+--------------------------------------------------------------------+
hadoop@hadoopMaster:~$ ls -la /opt/sqoop_output/
total 92
drwxrwxr-x 2 hadoop hadoop 4096 May 23 18:52 .
drwxr-xr-x 8 hadoop hadoop 4096 May 23 18:51 ..
-rw-r--r-- 1 hadoop hadoop  209 May 23 18:51 part-m-00000.seq
-rw-rw-r-- 1 hadoop hadoop   12 May 23 18:51 .part-m-00000.seq.crc
-rw-r--r-- 1 hadoop hadoop   86 May 23 18:51 part-m-00001.seq
-rw-rw-r-- 1 hadoop hadoop   12 May 23 18:51 .part-m-00001.seq.crc
-rw-r--r-- 1 hadoop hadoop   86 May 23 18:51 part-m-00002.seq
-rw-rw-r-- 1 hadoop hadoop   12 May 23 18:51 .part-m-00002.seq.crc
-rw-r--r-- 1 hadoop hadoop   86 May 23 18:51 part-m-00003.seq
-rw-rw-r-- 1 hadoop hadoop   12 May 23 18:51 .part-m-00003.seq.crc
-rw-r--r-- 1 hadoop hadoop   86 May 23 18:51 part-m-00004.seq
-rw-rw-r-- 1 hadoop hadoop   12 May 23 18:51 .part-m-00004.seq.crc
-rw-r--r-- 1 hadoop hadoop   86 May 23 18:51 part-m-00005.seq
-rw-rw-r-- 1 hadoop hadoop   12 May 23 18:51 .part-m-00005.seq.crc
-rw-r--r-- 1 hadoop hadoop  207 May 23 18:51 part-m-00006.seq
-rw-rw-r-- 1 hadoop hadoop   12 May 23 18:51 .part-m-00006.seq.crc
-rw-r--r-- 1 hadoop hadoop   86 May 23 18:51 part-m-00007.seq
-rw-rw-r-- 1 hadoop hadoop   12 May 23 18:51 .part-m-00007.seq.crc
-rw-r--r-- 1 hadoop hadoop  206 May 23 18:51 part-m-00008.seq
-rw-rw-r-- 1 hadoop hadoop   12 May 23 18:51 .part-m-00008.seq.crc
-rw-r--r-- 1 hadoop hadoop  682 May 23 18:51 part-m-00009.seq
-rw-rw-r-- 1 hadoop hadoop   16 May 23 18:51 .part-m-00009.seq.crc
-rw-r--r-- 1 hadoop hadoop    0 May 23 18:51 _SUCCESS
-rw-rw-r-- 1 hadoop hadoop    8 May 23 18:51 ._SUCCESS.crc
+--------------------------------------------------------------------
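Because SEQUENCE_FILE was chosen as the output format, the part files are Hadoop SequenceFiles; hadoop fs -text decodes them (the file:// scheme works because this job wrote to the local filesystem):
hadoop@hadoopMaster:~$ hadoop fs -text file:///opt/sqoop_output/part-m-00000.seq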
sqoop:000> show job
+----+------------+--------+-----------+---------+
| Id | Name | Type | Connector | Enabled |
+----+------------+--------+-----------+---------+
| 1 | First job | IMPORT | 1 | true |
| 2 | importHDFS | IMPORT | 1 | true |
+----+------------+--------+-----------+---------+
sqoop:000> delete job --jid 1
sqoop:000> show job
+----+------------+--------+-----------+---------+
| Id | Name | Type | Connector | Enabled |
+----+------------+--------+-----------+---------+
| 2 | importHDFS | IMPORT | 1 | true |
+----+------------+--------+-----------+---------+
sqoop:000> delete job --jid 2
sqoop:000> show job
+----+------+------+-----------+---------+
| Id | Name | Type | Connector | Enabled |
+----+------+------+-----------+---------+
+----+------+------+-----------+---------+
sqoop:000> show connection
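The connection itself can be deleted the same way (id 1 from the create step above):
sqoop:000> delete connection --xid 1
sqoop:000> show connection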
Batch mode: commands can also be run from a script file:
sqoop.sh client /opt/sqoop/script.sqoop
hadoop@hadoopMaster:$ vim /opt/sqoop/script.sqoop
# point the client at the server
set server --host hadoopMaster --port 12000 --webapp sqoop
# start the JOB
start job --jid 1
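A script like this can be scheduled, for example from cron (an illustrative entry; the log path reuses the LOGDIR set in step 3):
# run the import script every night at 01:00
0 1 * * * /usr/local/sqoop/bin/sqoop.sh client /opt/sqoop/script.sqoop >> /usr/local/sqoop/logs/batch.log 2>&1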
+--------------------------------------------------------------------+
hadoop@hadoopMaster:/usr/local/sqoop/bin$ ./sqoop.sh client /opt/hadoop/mysql/batchModel.sqoop
Sqoop home directory: /usr/local/sqoop
sqoop:000> set server --host hadoopMaster --port 12000 --webapp sqoop
Server is set successfully
sqoop:000> start job --jid 1
Submission details
Job ID: 1
Server URL: http://hadoopMaster:12000/sqoop/
Created by: hadoop
Creation date: 2014-05-30 10:55:10 CST
Lastly updated by: hadoop
External ID: job_local945860799_0003
http://localhost:8080/
2014-05-30 10:55:10 CST: BOOTING - Progress is not available
+--------------------------------------------------------------------+
Full import demo in the official Sqoop2 quickstart:
https://cwiki.apache.org/confluence/display/SQOOP/Sqoop2+Quickstart#Sqoop2Quickstart-Fullimportdemo
================================ MySQL test data =======================================
hadoop@hadoopMaster:~$ mysql -uroot -pjava
mysql> create database sqoop_stu;
Query OK, 1 row affected (0.03 sec)
mysql> use sqoop_stu;
Database changed
mysql> create table student(id int(3) auto_increment not null primary key, name char(10) not null, address varchar(50));
Query OK, 0 rows affected (0.41 sec)
mysql> insert into student values(1, 'Tom','beijing'),(2, 'Joan','shanghai'), (3, 'Wang', 'shenzheng');
Query OK, 3 rows affected (0.07 sec)
Records: 3 Duplicates: 0 Warnings: 0
CREATE TABLE `demo_blog` (`id` int(11) NOT NULL AUTO_INCREMENT, `blog` varchar(100) NOT NULL, PRIMARY KEY (`id`)) ENGINE=MyISAM DEFAULT CHARSET=utf8;
CREATE TABLE `demo_log` (`operator` varchar(16) NOT NULL, `log` varchar(100) NOT NULL) ENGINE=MyISAM DEFAULT CHARSET=utf8;
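A quick check that the test rows are in place (the rows follow from the INSERT above):
mysql> select * from student;
+----+------+-----------+
| id | name | address   |
+----+------+-----------+
|  1 | Tom  | beijing   |
|  2 | Joan | shanghai  |
|  3 | Wang | shenzheng |
+----+------+-----------+
3 rows in set (0.00 sec)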