Writing to multiple HBase tables simultaneously in a single Job

When running Map/Reduce, some workloads need a single job to write data to several HBase tables. The following shows one way to implement this.
Source: HBase MultiTableOutputFormat writing to multiple tables in one Map Reduce Job
 
Recently, I've been having a lot of fun learning about HBase and Hadoop. One esoteric thing I just learned about is the way that HBase tables are populated.
By default, HBase Map/Reduce jobs can only write to a single table, because you set the output handler at the job level with job.setOutputFormatClass(). However, if you are creating an HBase table, chances are you will also want to build an index related to that table so that you can run fast queries against the master table.

The optimal way to do this is to write the data to both tables at the same time, while you are importing the data. The alternative is to run another M/R job to build the index after the fact, but that means reading all of the data twice, which puts a lot of extra load on the system for no real benefit. To write to both tables at the same time, in the same M/R job, you can take advantage of the MultiTableOutputFormat class. The key is that when you write to the context, you specify the name of the table you are writing to in the output key. Here is some basic example code (with a lot of the meat removed) which demonstrates this:
static class TsvImporter extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
    @Override
    public void map(LongWritable offset, Text value, Context context)
            throws IOException, InterruptedException {
        // the line of tab-separated data we are working on (needs to be parsed out);
        // note: the backing array may be longer than the line, so respect value.getLength()
        byte[] lineBytes = value.getBytes();

        // rowKey is the hbase rowKey generated from lineBytes
        Put put = new Put(rowKey);
        // Create your KeyValue object and add it to the Put
        put.add(kv);
        // the output key names the destination table
        context.write(new ImmutableBytesWritable(Bytes.toBytes("actions")), put); // write to the actions table

        // rowKey2 is the hbase rowKey for the index entry
        Put indexPut = new Put(rowKey2);
        // Create your KeyValue object for the index and add it to the Put
        indexPut.add(kv2);
        context.write(new ImmutableBytesWritable(Bytes.toBytes("actions_index")), indexPut); // write to the index table
    }
}
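
One caveat worth noting: MultiTableOutputFormat writes to existing tables and does not create them for you. Below is a minimal sketch (not from the original post) of pre-creating the two destination tables with the admin API of that era; the column family name "d" is an assumption:

// Sketch: pre-create the destination tables, since MultiTableOutputFormat
// only writes to tables that already exist. Family name "d" is a placeholder.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class CreateTables {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HBaseAdmin admin = new HBaseAdmin(conf);
        try {
            for (String name : new String[] { "actions", "actions_index" }) {
                if (!admin.tableExists(name)) {
                    HTableDescriptor desc = new HTableDescriptor(name);
                    desc.addFamily(new HColumnDescriptor("d"));
                    admin.createTable(desc);
                }
            }
        } finally {
            admin.close();
        }
    }
}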

public static Job createSubmittableJob(Configuration conf, String[] args) throws IOException {
    String pathStr = args[0];
    Path inputDir = new Path(pathStr);
    Job job = new Job(conf, "my_custom_job");
    job.setJarByClass(TsvImporter.class);
    FileInputFormat.setInputPaths(job, inputDir);
    job.setInputFormatClass(TextInputFormat.class);
    
    // this is the key to writing to multiple tables in HBase:
    // MultiTableOutputFormat routes each Put to the table named by its output key
    job.setOutputFormatClass(MultiTableOutputFormat.class);
    job.setMapperClass(TsvImporter.class);
    job.setNumReduceTasks(0);

    TableMapReduceUtil.addDependencyJars(job);
    TableMapReduceUtil.addDependencyJars(job.getConfiguration());
    return job;
}
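
For completeness, a driver along these lines can submit the job. This is a minimal sketch, assuming the createSubmittableJob() method above; it is not part of the original post:

public static void main(String[] args) throws Exception {
    // assumes hbase-site.xml is on the classpath so the cluster can be reached
    Configuration conf = HBaseConfiguration.create();
    Job job = createSubmittableJob(conf, args);
    System.exit(job.waitForCompletion(true) ? 0 : 1);
}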