(fileInfo, size));
558 | }
559 | });
560 |
561 | return files;
562 | }
563 |
564 | /**
565 | * Given a list of file paths and sizes, create roughly ngroups groups in as balanced a way as possible.
566 | * The groups created will contain similar amounts of bytes.
567 | *
568 | * The algorithm used is pretty straightforward: the file list is sorted by size,
569 | * then each group takes the biggest file still available, iterating through the groups
570 | * while alternating the direction.
571 | */
572 | static List<List<Pair<SnapshotFileInfo, Long>>> getBalancedSplits(
573 | final List<Pair<SnapshotFileInfo, Long>> files, final int ngroups) {
574 | // Sort files by size, from small to big
575 | Collections.sort(files, new Comparator<Pair<SnapshotFileInfo, Long>>() {
576 | public int compare(Pair<SnapshotFileInfo, Long> a, Pair<SnapshotFileInfo, Long> b) {
577 | long r = a.getSecond() - b.getSecond();
578 | return (r < 0) ? -1 : ((r > 0) ? 1 : 0);
579 | }
580 | });
581 |
582 | // create balanced groups
583 | List<List<Pair<SnapshotFileInfo, Long>>> fileGroups =
584 | new LinkedList<List<Pair<SnapshotFileInfo, Long>>>();
585 | long[] sizeGroups = new long[ngroups];
586 | int hi = files.size() - 1;
587 | int lo = 0;
588 |
589 | List<Pair<SnapshotFileInfo, Long>> group;
590 | int dir = 1;
591 | int g = 0;
592 |
593 | while (hi >= lo) {
594 | if (g == fileGroups.size()) {
595 | group = new LinkedList<Pair<SnapshotFileInfo, Long>>();
596 | fileGroups.add(group);
597 | } else {
598 | group = fileGroups.get(g);
599 | }
600 |
601 | Pair<SnapshotFileInfo, Long> fileInfo = files.get(hi--);
602 |
603 | // add the hi one
604 | sizeGroups[g] += fileInfo.getSecond();
605 | group.add(fileInfo);
606 |
607 | // change direction when at the end or the beginning
608 | g += dir;
609 | if (g == ngroups) {
610 | dir = -1;
611 | g = ngroups - 1;
612 | } else if (g < 0) {
613 | dir = 1;
614 | g = 0;
615 | }
616 | }
617 |
618 | if (LOG.isDebugEnabled()) {
619 | for (int i = 0; i < sizeGroups.length; ++i) {
620 | LOG.debug("export split=" + i + " size=" + StringUtils.humanReadableInt(sizeGroups[i]));
621 | }
622 | }
623 |
624 | return fileGroups;
625 | }
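The javadoc above describes the zig-zag assignment: after sorting ascending, the largest remaining file is handed out while sweeping across the groups left-to-right, then right-to-left, so the group that just received the biggest file is visited last (and the turnaround group twice in a row), keeping byte totals close. A standalone sketch of that loop, using plain `Long` sizes instead of `Pair<SnapshotFileInfo, Long>` (the class and method names here are hypothetical, for illustration only):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class BalancedSplits {
  /** Zig-zag balancing: sort ascending, deal the biggest remaining size
   *  to groups visited 0..n-1, then n-1..0, bouncing at the ends. */
  static List<List<Long>> balance(List<Long> sizes, int ngroups) {
    List<Long> sorted = new ArrayList<>(sizes);
    Collections.sort(sorted);                       // small to big
    List<List<Long>> groups = new ArrayList<>();
    for (int i = 0; i < ngroups; i++) groups.add(new ArrayList<>());
    int hi = sorted.size() - 1, g = 0, dir = 1;
    while (hi >= 0) {
      groups.get(g).add(sorted.get(hi--));          // biggest remaining file
      g += dir;
      if (g == ngroups) { dir = -1; g = ngroups - 1; }  // bounce: last group goes twice
      else if (g < 0)   { dir = 1;  g = 0; }            // bounce: first group goes twice
    }
    return groups;
  }

  public static void main(String[] args) {
    // Hypothetical file sizes; per-group byte totals end up close.
    for (List<Long> group : balance(Arrays.asList(10L, 8L, 6L, 5L, 3L, 1L), 2)) {
      long total = 0;
      for (long s : group) total += s;
      System.out.println(group + " total=" + total);  // prints [10, 5, 3] total=18 then [8, 6, 1] total=15
    }
  }
}
```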
626 |
627 | private static class ExportSnapshotInputFormat extends InputFormat<BytesWritable, NullWritable> {
628 | @Override
629 | public RecordReader<BytesWritable, NullWritable> createRecordReader(InputSplit split,
630 | TaskAttemptContext tac) throws IOException, InterruptedException {
631 | return new ExportSnapshotRecordReader(((ExportSnapshotInputSplit)split).getSplitKeys());
632 | }
633 |
634 | @Override
635 | public List<InputSplit> getSplits(JobContext context) throws IOException, InterruptedException {
636 | Configuration conf = context.getConfiguration();
637 | Path snapshotDir = new Path(conf.get(CONF_SNAPSHOT_DIR));
638 | FileSystem fs = FileSystem.get(snapshotDir.toUri(), conf);
639 |
640 | List<Pair<SnapshotFileInfo, Long>> snapshotFiles = new ArrayList<>();
641 | List<Pair<SnapshotFileInfo, Long>> origSnapshotFiles = getSnapshotFiles(conf, fs, snapshotDir);
642 |
643 | // TODO: add logic to detect incremental snapshot files
644 | String snapshotOldPath = conf.get(CONF_SNAPSHOT_OLD_DIR);
645 | if (snapshotOldPath != null && snapshotOldPath.length() > 0) {
646 | LOG.info("snapshotOldDir: " + conf.get(CONF_SNAPSHOT_OLD_DIR));
647 | Path snapshotOldDir = new Path(conf.get(CONF_SNAPSHOT_OLD_DIR));
648 | FileSystem oldFs = FileSystem.get(snapshotOldDir.toUri(), conf);
649 |
650 | List<Pair<SnapshotFileInfo, Long>> snapshotOldFiles = getSnapshotFiles(conf, oldFs, snapshotOldDir);
651 | Set<String> oldFiles = new HashSet<String>();
652 | for (Pair<SnapshotFileInfo, Long> pair : snapshotOldFiles) {
653 | LOG.info("Old snapshot file: " + pair.getFirst().getHfile());
654 | oldFiles.add(pair.getFirst().getHfile());
655 | }
656 |
657 | for (Pair<SnapshotFileInfo, Long> pair : origSnapshotFiles) {
658 | String hfile = pair.getFirst().getHfile();
659 | if (!oldFiles.contains(hfile)) {
660 | LOG.info("New snapshot file: " + hfile);
661 | snapshotFiles.add(pair);
662 | }
663 | }
664 | } else {
665 | snapshotFiles = origSnapshotFiles;
666 | }
667 |
668 | int mappers = conf.getInt(CONF_NUM_SPLITS, 0);
669 | if (mappers == 0 && snapshotFiles.size() > 0) {
670 | mappers = 1 + (snapshotFiles.size() / conf.getInt(CONF_MAP_GROUP, 10));
671 | mappers = Math.min(mappers, snapshotFiles.size());
672 | conf.setInt(CONF_NUM_SPLITS, mappers);
673 | conf.setInt(MR_NUM_MAPS, mappers);
674 | }
675 |
676 | List<List<Pair<SnapshotFileInfo, Long>>> groups = getBalancedSplits(snapshotFiles, mappers);
677 | List<InputSplit> splits = new ArrayList<InputSplit>(groups.size());
678 | for (List<Pair<SnapshotFileInfo, Long>> files: groups) {
679 | splits.add(new ExportSnapshotInputSplit(files));
680 | }
681 | return splits;
682 | }
683 |
684 | private static class ExportSnapshotInputSplit extends InputSplit implements Writable {
685 | private List<Pair<BytesWritable, Long>> files;
686 | private long length;
687 |
688 | public ExportSnapshotInputSplit() {
689 | this.files = null;
690 | }
691 |
692 | public ExportSnapshotInputSplit(final List<Pair<SnapshotFileInfo, Long>> snapshotFiles) {
693 | this.files = new ArrayList<Pair<BytesWritable, Long>>(snapshotFiles.size());
694 | for (Pair<SnapshotFileInfo, Long> fileInfo: snapshotFiles) {
695 | this.files.add(new Pair<BytesWritable, Long>(
696 | new BytesWritable(fileInfo.getFirst().toByteArray()), fileInfo.getSecond()));
697 | this.length += fileInfo.getSecond();
698 | }
699 | }
700 |
701 | private List<Pair<BytesWritable, Long>> getSplitKeys() {
702 | return files;
703 | }
704 |
705 | @Override
706 | public long getLength() throws IOException, InterruptedException {
707 | return length;
708 | }
709 |
710 | @Override
711 | public String[] getLocations() throws IOException, InterruptedException {
712 | return new String[] {};
713 | }
714 |
715 | @Override
716 | public void readFields(DataInput in) throws IOException {
717 | int count = in.readInt();
718 | files = new ArrayList<Pair<BytesWritable, Long>>(count);
719 | length = 0;
720 | for (int i = 0; i < count; ++i) {
721 | BytesWritable fileInfo = new BytesWritable();
722 | fileInfo.readFields(in);
723 | long size = in.readLong();
724 | files.add(new Pair<BytesWritable, Long>(fileInfo, size));
725 | length += size;
726 | }
727 | }
728 |
729 | @Override
730 | public void write(DataOutput out) throws IOException {
731 | out.writeInt(files.size());
732 | for (final Pair<BytesWritable, Long> fileInfo: files) {
733 |
734 | fileInfo.getFirst().write(out);
735 | out.writeLong(fileInfo.getSecond());
736 | }
737 | }
738 | }
739 |
740 | private static class ExportSnapshotRecordReader
741 | extends RecordReader<BytesWritable, NullWritable> {
742 | private final List<Pair<BytesWritable, Long>> files;
743 | private long totalSize = 0;
744 | private long procSize = 0;
745 | private int index = -1;
746 |
747 | ExportSnapshotRecordReader(final List<Pair<BytesWritable, Long>> files) {
748 | this.files = files;
749 | for (Pair<BytesWritable, Long> fileInfo: files) {
750 | totalSize += fileInfo.getSecond();
751 | }
752 | }
753 |
754 | @Override
755 | public void close() { }
756 |
757 | @Override
758 | public BytesWritable getCurrentKey() { return files.get(index).getFirst(); }
759 |
760 | @Override
761 | public NullWritable getCurrentValue() { return NullWritable.get(); }
762 |
763 | @Override
764 | public float getProgress() { return (float)procSize / totalSize; }
765 |
766 | @Override
767 | public void initialize(InputSplit split, TaskAttemptContext tac) { }
768 |
769 | @Override
770 | public boolean nextKeyValue() {
771 | if (index >= 0) {
772 | procSize += files.get(index).getSecond();
773 | }
774 | return(++index < files.size());
775 | }
776 | }
777 | }
778 |
779 | // ==========================================================================
780 | // Tool
781 | // ==========================================================================
782 |
783 | /**
784 | * Run Map-Reduce Job to perform the files copy.
785 | */
786 | private void runCopyJob(final Path inputRoot, final Path outputRoot,
787 | final String snapshotName, final Path snapshotDir, final boolean verifyChecksum,
788 | final String filesUser, final String filesGroup, final int filesMode,
789 | final int mappers, final int bandwidthMB, final String snapshotOldName, final Path snapShotOldDir)
790 | throws IOException, InterruptedException, ClassNotFoundException {
791 | Configuration conf = getConf();
792 | if (filesGroup != null) conf.set(CONF_FILES_GROUP, filesGroup);
793 | if (filesUser != null) conf.set(CONF_FILES_USER, filesUser);
794 | if (mappers > 0) {
795 | conf.setInt(CONF_NUM_SPLITS, mappers);
796 | conf.setInt(MR_NUM_MAPS, mappers);
797 | }
798 | conf.setInt(CONF_FILES_MODE, filesMode);
799 | conf.setBoolean(CONF_CHECKSUM_VERIFY, verifyChecksum);
800 | conf.set(CONF_OUTPUT_ROOT, outputRoot.toString());
801 | conf.set(CONF_INPUT_ROOT, inputRoot.toString());
802 | conf.setInt(CONF_BANDWIDTH_MB, bandwidthMB);
803 | conf.set(CONF_SNAPSHOT_NAME, snapshotName);
804 | conf.set(CONF_SNAPSHOT_DIR, snapshotDir.toString());
805 |
806 | if(snapshotOldName != null) {
807 | conf.set(CONF_SNAPSHOT_OLD_NAME, snapshotOldName);
808 | }
809 | if(snapShotOldDir != null) {
810 | conf.set(CONF_SNAPSHOT_OLD_DIR, snapShotOldDir.toString());
811 | }
812 |
813 | Job job = new Job(conf);
814 | job.setJobName("ExportSnapshot-" + snapshotName);
815 | if(snapshotOldName != null) {
816 | job.setJobName("ExportSnapshotIncremental-" + snapshotName);
817 | }
818 |
819 | job.setJarByClass(ExportSnapshot.class);
820 | TableMapReduceUtil.addDependencyJars(job);
821 | job.setMapperClass(ExportMapper.class);
822 | job.setInputFormatClass(ExportSnapshotInputFormat.class);
823 | job.setOutputFormatClass(NullOutputFormat.class);
824 | job.setMapSpeculativeExecution(false);
825 | job.setNumReduceTasks(0);
826 |
827 | // Acquire the delegation Tokens
828 | Configuration srcConf = HBaseConfiguration.createClusterConf(conf, null, CONF_SOURCE_PREFIX);
829 | TokenCache.obtainTokensForNamenodes(job.getCredentials(), new Path[] { inputRoot }, srcConf);
830 |
831 | Configuration destConf = HBaseConfiguration.createClusterConf(conf, null, CONF_DEST_PREFIX);
832 | TokenCache.obtainTokensForNamenodes(job.getCredentials(), new Path[] { outputRoot }, destConf);
833 |
834 | // Run the MR Job
835 | if (!job.waitForCompletion(true)) {
836 | throw new ExportSnapshotException(job.getStatus().getFailureInfo());
837 | }
838 | }
839 |
840 | private void verifySnapshot(final Configuration baseConf,
841 | final FileSystem fs, final Path rootDir, final Path snapshotDir) throws IOException {
842 | // Update the conf with the current root dir, since may be a different cluster
843 | Configuration conf = new Configuration(baseConf);
844 | FSUtils.setRootDir(conf, rootDir);
845 | FSUtils.setFsDefault(conf, FSUtils.getRootDir(conf));
846 | SnapshotDescription snapshotDesc = SnapshotDescriptionUtils.readSnapshotInfo(fs, snapshotDir);
847 | SnapshotReferenceUtil.verifySnapshot(conf, fs, snapshotDir, snapshotDesc);
848 | }
849 |
850 | /**
851 | * Set path ownership.
852 | */
853 | private void setOwner(final FileSystem fs, final Path path, final String user,
854 | final String group, final boolean recursive) throws IOException {
855 | if (user != null || group != null) {
856 | if (recursive && fs.isDirectory(path)) {
857 | for (FileStatus child : fs.listStatus(path)) {
858 | setOwner(fs, child.getPath(), user, group, recursive);
859 | }
860 | }
861 | fs.setOwner(path, user, group);
862 | }
863 | }
864 |
865 | /**
866 | * Set path permission.
867 | */
868 | private void setPermission(final FileSystem fs, final Path path, final short filesMode,
869 | final boolean recursive) throws IOException {
870 | if (filesMode > 0) {
871 | FsPermission perm = new FsPermission(filesMode);
872 | if (recursive && fs.isDirectory(path)) {
873 | for (FileStatus child : fs.listStatus(path)) {
874 | setPermission(fs, child.getPath(), filesMode, recursive);
875 | }
876 | }
877 | fs.setPermission(path, perm);
878 | }
879 | }
880 |
881 | /**
882 | * Execute the export snapshot by copying the snapshot metadata, hfiles and wals.
883 | * @return 0 on success, and != 0 upon failure.
884 | */
885 | @Override
886 | public int run(String[] args) throws IOException {
887 | boolean verifyTarget = true;
888 | boolean verifyChecksum = true;
889 | String snapshotName = null;
890 | String snapshotOldName = null;
891 | String targetName = null;
892 | boolean overwrite = false;
893 | String filesGroup = null;
894 | String filesUser = null;
895 | Path outputRoot = null;
896 | int bandwidthMB = Integer.MAX_VALUE;
897 | int filesMode = 0;
898 | int mappers = 0;
899 |
900 | Configuration conf = getConf();
901 | Path inputRoot = FSUtils.getRootDir(conf);
902 |
903 | // Process command line args
904 | for (int i = 0; i < args.length; i++) {
905 | String cmd = args[i];
906 | if (cmd.equals("-snapshot")) {
907 | snapshotName = args[++i];
908 | } else if (cmd.equals("-snapshot-old")) {
909 | snapshotOldName = args[++i];
910 | } else if (cmd.equals("-target")) {
911 | targetName = args[++i];
912 | } else if (cmd.equals("-copy-to")) {
913 | outputRoot = new Path(args[++i]);
914 | } else if (cmd.equals("-copy-from")) {
915 | inputRoot = new Path(args[++i]);
916 | FSUtils.setRootDir(conf, inputRoot);
917 | } else if (cmd.equals("-no-checksum-verify")) {
918 | verifyChecksum = false;
919 | } else if (cmd.equals("-no-target-verify")) {
920 | verifyTarget = false;
921 | } else if (cmd.equals("-mappers")) {
922 | mappers = Integer.parseInt(args[++i]);
923 | } else if (cmd.equals("-chuser")) {
924 | filesUser = args[++i];
925 | } else if (cmd.equals("-chgroup")) {
926 | filesGroup = args[++i];
927 | } else if (cmd.equals("-bandwidth")) {
928 | bandwidthMB = Integer.parseInt(args[++i]);
929 | } else if (cmd.equals("-chmod")) {
930 | filesMode = Integer.parseInt(args[++i], 8);
931 | } else if (cmd.equals("-overwrite")) {
932 | overwrite = true;
933 | } else if (cmd.equals("-h") || cmd.equals("--help")) {
934 | printUsageAndExit();
935 | } else {
936 | System.err.println("UNEXPECTED: " + cmd);
937 | printUsageAndExit();
938 | }
939 | }
940 |
941 | // Check user options
942 | if (snapshotName == null) {
943 | System.err.println("Snapshot name not provided.");
944 | printUsageAndExit();
945 | }
946 |
947 | if (outputRoot == null) {
948 | System.err.println("Destination file-system not provided.");
949 | printUsageAndExit();
950 | }
951 |
952 | if (targetName == null) {
953 | targetName = snapshotName;
954 | }
955 |
956 | Configuration srcConf = HBaseConfiguration.createClusterConf(conf, null, CONF_SOURCE_PREFIX);
957 | srcConf.setBoolean("fs." + inputRoot.toUri().getScheme() + ".impl.disable.cache", true);
958 | FileSystem inputFs = FileSystem.get(inputRoot.toUri(), srcConf);
959 | LOG.debug("inputFs=" + inputFs.getUri().toString() + " inputRoot=" + inputRoot);
960 |
961 | Configuration destConf = HBaseConfiguration.createClusterConf(conf, null, CONF_DEST_PREFIX);
962 | destConf.setBoolean("fs." + outputRoot.toUri().getScheme() + ".impl.disable.cache", true);
963 | FileSystem outputFs = FileSystem.get(outputRoot.toUri(), destConf);
964 | LOG.debug("outputFs=" + outputFs.getUri().toString() + " outputRoot=" + outputRoot.toString());
965 |
966 | boolean skipTmp = conf.getBoolean(CONF_SKIP_TMP, false);
967 |
968 | Path snapshotDir = SnapshotDescriptionUtils.getCompletedSnapshotDir(snapshotName, inputRoot);
969 |
970 | Path snapShotOldDir = null;
971 | if(snapshotOldName != null) {
972 | snapShotOldDir = SnapshotDescriptionUtils.getCompletedSnapshotDir(snapshotOldName, inputRoot);
973 | }
974 |
975 | Path snapshotTmpDir = SnapshotDescriptionUtils.getWorkingSnapshotDir(targetName, outputRoot);
976 | Path outputSnapshotDir = SnapshotDescriptionUtils.getCompletedSnapshotDir(targetName, outputRoot);
977 | Path initialOutputSnapshotDir = skipTmp ? outputSnapshotDir : snapshotTmpDir;
978 |
979 | // Find the necessary directory which need to change owner and group
980 | Path needSetOwnerDir = SnapshotDescriptionUtils.getSnapshotRootDir(outputRoot);
981 | if (outputFs.exists(needSetOwnerDir)) {
982 | if (skipTmp) {
983 | needSetOwnerDir = outputSnapshotDir;
984 | } else {
985 | needSetOwnerDir = SnapshotDescriptionUtils.getWorkingSnapshotDir(outputRoot);
986 | if (outputFs.exists(needSetOwnerDir)) {
987 | needSetOwnerDir = snapshotTmpDir;
988 | }
989 | }
990 | }
991 |
992 | // Check if the snapshot already exists
993 | if (outputFs.exists(outputSnapshotDir)) {
994 | if (overwrite) {
995 | if (!outputFs.delete(outputSnapshotDir, true)) {
996 | System.err.println("Unable to remove existing snapshot directory: " + outputSnapshotDir);
997 | return 1;
998 | }
999 | } else {
1000 | System.err.println("The snapshot '" + targetName +
1001 | "' already exists in the destination: " + outputSnapshotDir);
1002 | return 1;
1003 | }
1004 | }
1005 |
1006 | if (!skipTmp) {
1007 | // Check if the snapshot already in-progress
1008 | if (outputFs.exists(snapshotTmpDir)) {
1009 | if (overwrite) {
1010 | if (!outputFs.delete(snapshotTmpDir, true)) {
1011 | System.err.println("Unable to remove existing snapshot tmp directory: "+snapshotTmpDir);
1012 | return 1;
1013 | }
1014 | } else {
1015 | System.err.println("A snapshot with the same name '"+ targetName +"' may be in-progress");
1016 | System.err.println("Please check "+snapshotTmpDir+". If the snapshot has completed, ");
1017 | System.err.println("consider removing "+snapshotTmpDir+" by using the -overwrite option");
1018 | return 1;
1019 | }
1020 | }
1021 | }
1022 |
1023 | // Step 1 - Copy fs1:/.snapshot/ to fs2:/.snapshot/.tmp/
1024 | // The snapshot references must be copied before the hfiles otherwise the cleaner
1025 | // will remove them because they are unreferenced.
1026 | try {
1027 | LOG.info("Copy Snapshot Manifest");
1028 | FileUtil.copy(inputFs, snapshotDir, outputFs, initialOutputSnapshotDir, false, false, conf);
1029 | } catch (IOException e) {
1030 | throw new ExportSnapshotException("Failed to copy the snapshot directory: from=" +
1031 | snapshotDir + " to=" + initialOutputSnapshotDir, e);
1032 | } finally {
1033 | if (filesUser != null || filesGroup != null) {
1034 | LOG.warn((filesUser == null ? "" : "Change the owner of " + needSetOwnerDir + " to "
1035 | + filesUser)
1036 | + (filesGroup == null ? "" : ", Change the group of " + needSetOwnerDir + " to "
1037 | + filesGroup));
1038 | setOwner(outputFs, needSetOwnerDir, filesUser, filesGroup, true);
1039 | }
1040 | if (filesMode > 0) {
1041 | LOG.warn("Change the permission of " + needSetOwnerDir + " to " + filesMode);
1042 | setPermission(outputFs, needSetOwnerDir, (short)filesMode, true);
1043 | }
1044 | }
1045 |
1046 | // Write a new .snapshotinfo if the target name is different from the source name
1047 | if (!targetName.equals(snapshotName)) {
1048 | SnapshotDescription snapshotDesc =
1049 | SnapshotDescriptionUtils.readSnapshotInfo(inputFs, snapshotDir)
1050 | .toBuilder()
1051 | .setName(targetName)
1052 | .build();
1053 | SnapshotDescriptionUtils.writeSnapshotInfo(snapshotDesc, snapshotTmpDir, outputFs);
1054 | }
1055 |
1056 | // Step 2 - Start MR Job to copy files
1057 | // The snapshot references must be copied before the files otherwise the files gets removed
1058 | // by the HFileArchiver, since they have no references.
1059 | try {
1060 | runCopyJob(inputRoot, outputRoot, snapshotName, snapshotDir, verifyChecksum,
1061 | filesUser, filesGroup, filesMode, mappers, bandwidthMB, snapshotOldName, snapShotOldDir);
1062 |
1063 | LOG.info("Finalize the Snapshot Export");
1064 | if (!skipTmp) {
1065 | // Step 3 - Rename fs2:/.snapshot/.tmp/ fs2:/.snapshot/
1066 | if (!outputFs.rename(snapshotTmpDir, outputSnapshotDir)) {
1067 | throw new ExportSnapshotException("Unable to rename snapshot directory from=" +
1068 | snapshotTmpDir + " to=" + outputSnapshotDir);
1069 | }
1070 | }
1071 |
1072 | // Step 4 - Verify snapshot integrity
1073 | if (verifyTarget) {
1074 | LOG.info("Verify snapshot integrity");
1075 | // TODO: verify only when this is a full (non-incremental) export
1076 | if (snapshotOldName == null || snapshotOldName.length() == 0) {
1077 | verifySnapshot(destConf, outputFs, outputRoot, outputSnapshotDir);
1078 | }
1079 | }
1080 |
1081 | LOG.info("Export Completed: " + targetName);
1082 | return 0;
1083 | } catch (Exception e) {
1084 | LOG.error("Snapshot export failed", e);
1085 | if (!skipTmp) {
1086 | outputFs.delete(snapshotTmpDir, true);
1087 | }
1088 | outputFs.delete(outputSnapshotDir, true);
1089 | return 1;
1090 | } finally {
1091 | IOUtils.closeStream(inputFs);
1092 | IOUtils.closeStream(outputFs);
1093 | }
1094 | }
1095 |
1096 | // ExportSnapshot
1097 | private void printUsageAndExit() {
1098 | System.err.printf("Usage: bin/hbase %s [options]%n", getClass().getName());
1099 | System.err.println(" where [options] are:");
1100 | System.err.println(" -h|-help Show this help and exit.");
1101 | System.err.println(" -snapshot NAME Snapshot to export.");
1102 | System.err.println(" -snapshot-old NAME Old snapshot to compare against for an incremental export.");
1103 | System.err.println(" -copy-to NAME Remote destination hdfs://");
1104 | System.err.println(" -copy-from NAME Input folder hdfs:// (default hbase.rootdir)");
1105 | System.err.println(" -no-checksum-verify Do not verify checksum, use name+length only.");
1106 | System.err.println(" -no-target-verify Do not verify the integrity of the " +
1107 | "exported snapshot.");
1108 | System.err.println(" -overwrite Rewrite the snapshot manifest if it already exists");
1109 | System.err.println(" -chuser USERNAME Change the owner of the files " +
1110 | "to the specified one.");
1111 | System.err.println(" -chgroup GROUP Change the group of the files to " +
1112 | "the specified one.");
1113 | System.err.println(" -chmod MODE Change the permission of the files " +
1114 | "to the specified one.");
1115 | System.err.println(" -mappers Number of mappers to use during the " +
1116 | "copy (mapreduce.job.maps).");
1117 | System.err.println(" -bandwidth Limit bandwidth to this value in MB/second.");
1118 | System.err.println();
1119 | System.err.println("Examples:");
1120 | System.err.println(" hbase " + getClass().getName() + " \\");
1121 | System.err.println(" -snapshot MySnapshot -copy-to hdfs://srv2:8082/hbase \\");
1122 | System.err.println(" -chuser MyUser -chgroup MyGroup -chmod 700 -mappers 16");
1123 | System.err.println();
1124 | System.err.println(" hbase " + getClass().getName() + " \\");
1125 | System.err.println(" -snapshot MySnapshot -copy-from hdfs://srv2:8082/hbase \\");
1126 | System.err.println(" -copy-to hdfs://srv1:50070/hbase");
1127 | System.exit(1);
1128 | }
1129 |
1130 | /**
1131 | * The guts of the {@link #main} method.
1132 | * Call this method to avoid the {@link #main(String[])} System.exit.
1133 | * @param args
1134 | * @return errCode
1135 | * @throws Exception
1136 | */
1137 | static int innerMain(final Configuration conf, final String [] args) throws Exception {
1138 | return ToolRunner.run(conf, new ExportSnapshot(), args);
1139 | }
1140 |
1141 | public static void main(String[] args) throws Exception {
1142 | System.exit(innerMain(HBaseConfiguration.create(), args));
1143 | }
1144 | }
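The incremental path in `getSplits` above boils down to a set difference: collect the HFile names of the old snapshot into a `HashSet`, then keep only the current snapshot's files whose names are not in that set. A minimal standalone sketch of that filter, using plain `String` HFile names instead of `Pair<SnapshotFileInfo, Long>` (the class and method names here are hypothetical):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class IncrementalFilter {
  /** Return the HFiles present in the current snapshot but not in the old one;
   *  only these need to be copied for an incremental export. */
  static List<String> newFiles(List<String> current, List<String> old) {
    Set<String> oldSet = new HashSet<>(old);     // O(1) membership checks
    List<String> result = new ArrayList<>();
    for (String hfile : current) {
      if (!oldSet.contains(hfile)) {
        result.add(hfile);                       // keep only additions
      }
    }
    return result;
  }

  public static void main(String[] args) {
    // Hypothetical HFile names: a and b were already exported with the old snapshot.
    System.out.println(newFiles(
        Arrays.asList("hfile-a", "hfile-b", "hfile-c"),
        Arrays.asList("hfile-a", "hfile-b")));   // prints [hfile-c]
  }
}
```

Note this relies on HBase HFile names being immutable once written: a name that appears in both snapshots refers to identical bytes, so skipping it is safe.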
--------------------------------------------------------------------------------
/target/balancer.jar:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/balancer.jar
--------------------------------------------------------------------------------
/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$1.class:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$1.class
--------------------------------------------------------------------------------
/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$2.class:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$2.class
--------------------------------------------------------------------------------
/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$3.class:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$3.class
--------------------------------------------------------------------------------
/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$Counter.class:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$Counter.class
--------------------------------------------------------------------------------
/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$ExportMapper.class:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$ExportMapper.class
--------------------------------------------------------------------------------
/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$ExportSnapshotInputFormat$ExportSnapshotInputSplit.class:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$ExportSnapshotInputFormat$ExportSnapshotInputSplit.class
--------------------------------------------------------------------------------
/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$ExportSnapshotInputFormat$ExportSnapshotRecordReader.class:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$ExportSnapshotInputFormat$ExportSnapshotRecordReader.class
--------------------------------------------------------------------------------
/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$ExportSnapshotInputFormat.class:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot$ExportSnapshotInputFormat.class
--------------------------------------------------------------------------------
/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot.class:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/classes/org/hadoop/hbase/dataExport/ExportSnapshot.class
--------------------------------------------------------------------------------
/target/hbase-export-1.0-SNAPSHOT.jar:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/hbase-export-1.0-SNAPSHOT.jar
--------------------------------------------------------------------------------
/target/maven-archiver/pom.properties:
--------------------------------------------------------------------------------
1 | #Generated by Maven
2 | #Thu Nov 02 17:36:22 CST 2017
3 | version=1.0-SNAPSHOT
4 | groupId=org.hadoop.hbase.dataExport
5 | artifactId=hbase-export
6 |
--------------------------------------------------------------------------------
/target/maven-status/maven-compiler-plugin/compile/default-compile/createdFiles.lst:
--------------------------------------------------------------------------------
1 | org/hadoop/hbase/dataExport/ExportSnapshot.class
2 | org/hadoop/hbase/dataExport/ExportSnapshot$ExportMapper.class
3 | org/hadoop/hbase/dataExport/ExportSnapshot$2.class
4 | org/hadoop/hbase/dataExport/ExportSnapshot$ExportSnapshotInputFormat$ExportSnapshotInputSplit.class
5 | org/hadoop/hbase/dataExport/ExportSnapshot$Counter.class
6 | org/hadoop/hbase/dataExport/ExportSnapshot$ExportSnapshotInputFormat$ExportSnapshotRecordReader.class
7 | org/hadoop/hbase/dataExport/ExportSnapshot$3.class
8 | org/hadoop/hbase/dataExport/ExportSnapshot$1.class
9 | org/hadoop/hbase/dataExport/ExportSnapshot$ExportSnapshotInputFormat.class
10 |
--------------------------------------------------------------------------------
/target/maven-status/maven-compiler-plugin/compile/default-compile/inputFiles.lst:
--------------------------------------------------------------------------------
1 | /Volumes/Transcend/work/hbaseexport/src/main/java/org/hadoop/hbase/dataExport/ExportSnapshot.java
2 |
--------------------------------------------------------------------------------
/target/maven-status/maven-compiler-plugin/testCompile/default-testCompile/inputFiles.lst:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fayson/hbaseexport/008d4898b6bd411e19fbbf27a2d9ea790ca7d76f/target/maven-status/maven-compiler-plugin/testCompile/default-testCompile/inputFiles.lst
--------------------------------------------------------------------------------