
HBase Data Import with ImportTsv

The ImportTsv tool runs as a MapReduce job, so YARN must be started first. The tool loads classes from jar files, so take care to configure the classpath. By default, ImportTsv inserts data through the HBase API.
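For example, with Hadoop's sbin directory on the PATH (as in the profile below), YARN can be started and checked like this (a minimal sketch using the standard Hadoop scripts):

[hadoop-user@rhel work]$ start-yarn.sh
[hadoop-user@rhel work]$ jps | grep -E 'ResourceManager|NodeManager'

The environment used throughout this walkthrough is configured in .bash_profile: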


[hadoop-user@rhel work]$ cat /home/hadoop-user/.bash_profile

# .bash_profile

# Get the aliases and functions

if [ -f ~/.bashrc ]; then

    . ~/.bashrc

fi

# User specific environment and startup programs

PATH=$PATH:$HOME/bin

export PATH

JAVA_HOME=/usr/java/jdk1.8.0_171-amd64

PATH=$PATH:$JAVA_HOME/bin

CLASSPATH=$CLASSPATH:$JAVA_HOME/lib

HADOOP_HOME=/home/hadoop-user/hadoop-2.8.0

PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

CLASSPATH=$CLASSPATH:$HADOOP_HOME/lib

HBASE_HOME=/home/hadoop-user/hbase-2.0.0

PATH=$PATH:$HBASE_HOME/bin

CLASSPATH=$CLASSPATH:$HBASE_HOME/lib

ZOOKEEPER_HOME=/home/hadoop-user/zookeeper-3.4.12

PATH=$PATH:$ZOOKEEPER_HOME/bin

PHOENIX_HOME=/home/hadoop-user/apache-phoenix-5.0.0-alpha-HBase-2.0-bin

PATH=$PATH:$PHOENIX_HOME/bin

export PATH
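After editing the profile, re-source it so the new variables take effect in the current shell:

[hadoop-user@rhel work]$ source ~/.bash_profile
[hadoop-user@rhel work]$ echo $HBASE_HOME
/home/hadoop-user/hbase-2.0.0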

Create the table


hbase(main):033:0> create 'test','cf'
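To confirm the table and its column family, describe it in the HBase shell (the prompt number here is illustrative):

hbase(main):034:0> describe 'test'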

Create the file to import

[hadoop-user@rhel work]$ cat /home/hadoop-user/work/sample1.csv

row10,"mjj10"

row11,"mjj11"

row12,"mjj12"

row14,"mjj13"

Put the file into HDFS

[hadoop-user@rhel work]$ hdfs dfs -put /home/hadoop-user/work/sample1.csv /sample1.csv
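A quick check that the file landed in HDFS:

[hadoop-user@rhel work]$ hdfs dfs -cat /sample1.csv
row10,"mjj10"
row11,"mjj11"
row12,"mjj12"
row14,"mjj13"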

Run the ImportTsv import command

hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator="," -Dimporttsv.columns=HBASE_ROW_KEY,cf:a test /sample1.csv

Note: HBASE_ROW_KEY marks which field in the file holds the row key; the entries after it define the columns. Here each imported value goes into column family cf with qualifier a, and the input file is /sample1.csv in HDFS.
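Once the MapReduce job finishes, a scan should show the imported rows (output abbreviated). Note that ImportTsv splits each line on the separator only and does not strip CSV quoting, so the stored values keep their double quotes:

hbase(main):035:0> scan 'test'
ROW                 COLUMN+CELL
 row10              column=cf:a, timestamp=..., value="mjj10"
...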

The tool's help output explains these options:

Usage: importtsv -Dimporttsv.columns=a,b,c

Imports the given input directory of TSV data into the specified table.

The column names of the TSV data must be specified using the -Dimporttsv.columns

option. This option takes the form of comma-separated column names, where each

column name is either a simple column family, or a columnfamily:qualifier. The special

column name HBASE_ROW_KEY is used to designate that this column should be used

as the row key for each imported record. You must specify exactly one column

to be the row key, and you must specify a column name for every column that exists in the

input data. Another special column HBASE_TS_KEY designates that this column should be

used as timestamp for each record. Unlike HBASE_ROW_KEY, HBASE_TS_KEY is optional.

You must specify at most one column as timestamp key for each imported record.

Record with invalid timestamps (blank, non-numeric) will be treated as bad record.

Note: if you use this option, then 'importtsv.timestamp' option will be ignored.
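As a hypothetical illustration of HBASE_TS_KEY (sample2.csv and its contents are assumptions, not part of this walkthrough), a file whose second field is a millisecond timestamp could be imported like this:

hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator="," -Dimporttsv.columns=HBASE_ROW_KEY,HBASE_TS_KEY,cf:a test /sample2.csv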

Note: content imported with ImportTsv is not visible to Phoenix. In fact, tables created directly in HBase are invisible to Phoenix, while tables created through Phoenix are visible in HBase, though their contents appear in Phoenix's encoded form.
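A common workaround, sketched here from Phoenix's documented support for mapping existing HBase tables (the column types are assumptions), is to create a Phoenix view over the HBase table so its rows become queryable:

0: jdbc:phoenix> CREATE VIEW "test" (pk VARCHAR PRIMARY KEY, "cf"."a" VARCHAR);
0: jdbc:phoenix> SELECT * FROM "test";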

By default, the importtsv tool loads data through the HBase Put API. When the -Dimporttsv.bulk.output option is given, it instead first generates files in HBase's internal HFile format. As the documentation describes it:

The importtsv tool, by default, uses the HBase Put API to insert data into the HBase table using TableOutputFormat in its map phase. But when the -Dimporttsv.bulk.output option is specified, it instead generates HBase internal format (HFile) files on HDFS by using HFileOutputFormat. We can then use the completebulkload tool to load the generated files into a running cluster. The steps below use the bulk output and load tools:

Command to generate HFile-format files

hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator="," -Dimporttsv.bulk.output=/hfiles_tsv -Dimporttsv.columns=HBASE_ROW_KEY,cf:a test /sample1.csv

Note: the HFile-format files are written to the /hfiles_tsv directory in HDFS; the directory is created by the command itself.

[hadoop-user@rhel work]$ hdfs dfs -ls /hfiles_tsv/cf

18/06/28 10:49:21 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Found 1 items

-rw-r--r--  1 hadoop-user supergroup    5125 2018-06-28 10:40 /hfiles_tsv/cf/0e466616d42a4a128fb60caa7dbe075a

Note: the file name 0e466616d42a4a128fb60caa7dbe075a closely resembles the region-name format shown in the web UI.

Attempting the bulk load with

hadoop jar hbase-server-2.0.0.jar completebulkload /hfiles_tsv 'test'

fails with an exception: Exception in thread "main" java.lang.ClassNotFoundException: completebulkload

The HBase documentation gives two ways to invoke this utility:

There are two ways to invoke this utility, with explicit classname and via the driver:

Explicit Classname

$ bin/hbase org.apache.hadoop.hbase.tool.LoadIncrementalHFiles <hdfs://storefileoutput> <tablename>

Driver

HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-server-VERSION.jar completebulkload <hdfs://storefileoutput> <tablename>
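The ClassNotFoundException above occurred because hadoop jar was run without HBase's classes on the classpath; the driver form only works with HADOOP_CLASSPATH set as shown. With HBase 2.0 the explicit-classname form is the simpler route for loading the generated HFiles into the test table:

hbase org.apache.hadoop.hbase.tool.LoadIncrementalHFiles /hfiles_tsv test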
