Register fragments

Once the fragments of the scene are created, the next step is to align them in a global space.

Input arguments

This script runs with python run_system.py [config] --register. In [config], ["path_dataset"] should have a subfolder fragments, which stores the fragments in .ply files and their pose graphs in .json files.
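
For reference, the configuration entries this stage reads include the following. This is only a hedged sketch with placeholder values; the actual .json config files in the reconstruction system define more keys and their own defaults.

# Hedged sketch of the relevant [config] entries; values are placeholders.
config = {
    "path_dataset": "dataset/my_scene/",  # must contain the fragments subfolder
    "folder_fragment": "fragments/",
    "folder_scene": "scene/",
    "voxel_size": 0.05,  # in meters
    "global_registration": "ransac",  # or "fgr"
    "debug_mode": False,
}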

The main function runs make_posegraph_for_scene and optimize_posegraph_for_scene. The first function performs pairwise registration. The second function performs multiway registration.

Preprocess point cloud

def preprocess_point_cloud(pcd, config):
    voxel_size = config["voxel_size"]
    # Downsample to a regular voxel grid, then estimate normals and FPFH
    # features on the downsampled cloud.
    pcd_down = pcd.voxel_down_sample(voxel_size)
    pcd_down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=voxel_size * 2.0,
                                             max_nn=30))
    pcd_fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        pcd_down,
        o3d.geometry.KDTreeSearchParamHybrid(radius=voxel_size * 5.0,
                                             max_nn=100))
    return (pcd_down, pcd_fpfh)

This function downsamples a point cloud to make it sparser and regularly distributed. Normals and FPFH features are precomputed. See /tutorial/geometry/pointcloud.ipynb#voxel-downsampling, /tutorial/geometry/pointcloud.ipynb#vertex-normal-estimation, and /tutorial/pipelines/global_registration.ipynb#extract-geometric-feature for more details.
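
As a quick illustration, preprocess_point_cloud can be exercised on a single fragment in isolation. This is a hypothetical usage example: it assumes open3d is imported as o3d and the function above is in scope; the file name and voxel size are placeholders.

# Hypothetical stand-alone call; file name and voxel size are placeholders.
config = {"voxel_size": 0.05}
pcd = o3d.io.read_point_cloud("fragments/fragment_000.ply")
(pcd_down, pcd_fpfh) = preprocess_point_cloud(pcd, config)
print(pcd_down)  # downsampled cloud with estimated normals
print(pcd_fpfh)  # 33-dimensional FPFH features of the downsampled cloud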

Compute initial registration

def compute_initial_registration(s, t, source_down, target_down, source_fpfh,
                                 target_fpfh, path_dataset, config):

    if t == s + 1:  # odometry case
        print("Using RGBD odometry")
        pose_graph_frag = o3d.io.read_pose_graph(
            join(path_dataset,
                 config["template_fragment_posegraph_optimized"] % s))
        n_nodes = len(pose_graph_frag.nodes)
        transformation_init = np.linalg.inv(pose_graph_frag.nodes[n_nodes -
                                                                  1].pose)
        (transformation, information) = \
                multiscale_icp(source_down, target_down,
                [config["voxel_size"]], [50], config, transformation_init)
    else:  # loop closure case
        (success, transformation,
         information) = register_point_cloud_fpfh(source_down, target_down,
                                                  source_fpfh, target_fpfh,
                                                  config)
        if not success:
            print("No reasonable solution. Skip this pair")
            return (False, np.identity(4), np.zeros((6, 6)))
    print(transformation)

    if config["debug_mode"]:
        draw_registration_result(source_down, target_down, transformation)
    return (True, transformation, information)

This function computes a rough alignment between two fragments. If the fragments are adjacent, the rough alignment is obtained by aggregating the RGBD odometry computed in Make fragments. Otherwise, register_point_cloud_fpfh is called to perform global registration. Note that global registration is less reliable according to [Choi2015].
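
The helper multiscale_icp is defined elsewhere in the reconstruction system and is not shown on this page. As a rough stand-in for what the odometry branch relies on, a single-scale point-to-plane ICP refinement could look like the sketch below; this is an assumption for illustration, not the actual implementation, and the distance multiplier and iteration count are made up.

# Illustrative single-scale stand-in for multiscale_icp (not the real code).
def single_scale_icp(source, target, voxel_size, max_iter, init_transformation):
    # Refine the initial transform with point-to-plane ICP ...
    result = o3d.pipelines.registration.registration_icp(
        source, target, voxel_size * 1.4, init_transformation,
        o3d.pipelines.registration.TransformationEstimationPointToPlane(),
        o3d.pipelines.registration.ICPConvergenceCriteria(
            max_iteration=max_iter))
    # ... and report the information matrix used to weight the pose graph edge.
    information = o3d.pipelines.registration.get_information_matrix_from_point_clouds(
        source, target, voxel_size * 1.4, result.transformation)
    return (result.transformation, information)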

Pairwise global registration

def register_point_cloud_fpfh(source, target, source_fpfh, target_fpfh,
                              config):
    distance_threshold = config["voxel_size"] * 1.4
    if config["global_registration"] == "fgr":
        result = o3d.pipelines.registration.registration_fgr_based_on_feature_matching(
            source, target, source_fpfh, target_fpfh,
            o3d.pipelines.registration.FastGlobalRegistrationOption(
                maximum_correspondence_distance=distance_threshold))
    if config["global_registration"] == "ransac":
        # Fallback to preset parameters that works better
        result = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
            source, target, source_fpfh, target_fpfh, False, distance_threshold,
            o3d.pipelines.registration.TransformationEstimationPointToPoint(
                False), 4,
            [
                o3d.pipelines.registration.
                CorrespondenceCheckerBasedOnEdgeLength(0.9),
                o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(
                    distance_threshold)
            ],
            o3d.pipelines.registration.RANSACConvergenceCriteria(
                1000000, 0.999))
    if (result.transformation.trace() == 4.0):
        # An identity result means the registration failed.
        return (False, np.identity(4), np.zeros((6, 6)))
    information = o3d.pipelines.registration.get_information_matrix_from_point_clouds(
        source, target, distance_threshold, result.transformation)
    if information[5, 5] / min(len(source.points), len(target.points)) < 0.3:
        # Reject alignments supported by too few correspondences.
        return (False, np.identity(4), np.zeros((6, 6)))
    return (True, result.transformation, information)

This function uses /tutorial/pipelines/global_registration.ipynb#RANSAC or /tutorial/pipelines/global_registration.ipynb#fast-global-registration for pairwise global registration.
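
For a quick experiment, both options can be run on the same pair of fragments by toggling config["global_registration"]. This hedged example assumes the functions above are in scope; the file names and voxel size are placeholders.

# Hypothetical comparison of RANSAC and FGR on one fragment pair.
config = {"voxel_size": 0.05, "global_registration": "ransac"}
source = o3d.io.read_point_cloud("fragments/fragment_000.ply")
target = o3d.io.read_point_cloud("fragments/fragment_003.ply")
(source_down, source_fpfh) = preprocess_point_cloud(source, config)
(target_down, target_fpfh) = preprocess_point_cloud(target, config)
for method in ("ransac", "fgr"):
    config["global_registration"] = method
    (success, transformation, information) = register_point_cloud_fpfh(
        source_down, target_down, source_fpfh, target_fpfh, config)
    print(method, "succeeded:", success)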

Multiway registration

def update_posegraph_for_scene(s, t, transformation, information, odometry,
                               pose_graph):
    if t == s + 1:  # odometry case
        odometry = np.dot(transformation, odometry)
        odometry_inv = np.linalg.inv(odometry)
        pose_graph.nodes.append(
            o3d.pipelines.registration.PoseGraphNode(odometry_inv))
        pose_graph.edges.append(
            o3d.pipelines.registration.PoseGraphEdge(s,
                                                     t,
                                                     transformation,
                                                     information,
                                                     uncertain=False))
    else:  # loop closure case
        pose_graph.edges.append(
            o3d.pipelines.registration.PoseGraphEdge(s,
                                                     t,
                                                     transformation,
                                                     information,
                                                     uncertain=True))
    return (odometry, pose_graph)


def register_point_cloud_pair(ply_file_names, s, t, config):
    print("reading %s ..." % ply_file_names[s])
    source = o3d.io.read_point_cloud(ply_file_names[s])
    print("reading %s ..." % ply_file_names[t])
    target = o3d.io.read_point_cloud(ply_file_names[t])
    (source_down, source_fpfh) = preprocess_point_cloud(source, config)
    (target_down, target_fpfh) = preprocess_point_cloud(target, config)
    (success, transformation, information) = \
            compute_initial_registration(
            s, t, source_down, target_down,
            source_fpfh, target_fpfh, config["path_dataset"], config)
    if t != s + 1 and not success:
        return (False, np.identity(4), np.identity(6))
    if config["debug_mode"]:
        print(transformation)
        print(information)
    return (True, transformation, information)

This script uses the technique demonstrated in /tutorial/pipelines/multiway_registration.ipynb. The function update_posegraph_for_scene builds a pose graph for multiway registration of all fragments. Each graph node represents a fragment and its pose, which transforms its geometry into the global space.
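
Each node pose therefore maps its fragment from the fragment's local frame into the global frame. As an illustration only (this snippet is not part of register_fragments.py), the fragments can be placed in the global space by transforming each one with its node pose; ply_file_names and config are the same names used in the code above, and the alignment only becomes tight once the optimization described next has been run.

# Illustration: apply each node's pose to its fragment to view the scene.
pose_graph = o3d.io.read_pose_graph(
    join(config["path_dataset"], config["template_global_posegraph"]))
pcds = []
for i, ply_file_name in enumerate(ply_file_names):
    pcd = o3d.io.read_point_cloud(ply_file_name)
    pcd.transform(pose_graph.nodes[i].pose)
    pcds.append(pcd)
o3d.visualization.draw_geometries(pcds)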

Once a pose graph is built, the function optimize_posegraph_for_scene is called for multiway registration.

pose_graph = o3d.io.read_pose_graph(pose_graph_name)
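
optimize_posegraph_for_scene is defined in a separate helper script and wraps Open3D's pose graph optimizer. Continuing from the pose_graph read above, a hedged sketch of the core call looks like the following; the numeric parameters and the output file name are illustrative assumptions, not the script's actual values.

# Sketch of the core of pose graph optimization (parameter values assumed).
method = o3d.pipelines.registration.GlobalOptimizationLevenbergMarquardt()
criteria = o3d.pipelines.registration.GlobalOptimizationConvergenceCriteria()
option = o3d.pipelines.registration.GlobalOptimizationOption(
    max_correspondence_distance=config["voxel_size"] * 1.4,
    edge_prune_threshold=0.25,
    preference_loop_closure=2.0,
    reference_node=0)
o3d.pipelines.registration.global_optimization(pose_graph, method, criteria,
                                               option)
o3d.io.write_pose_graph(pose_graph_optimized_name, pose_graph)  # assumed name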

Main registration loop

The function make_posegraph_for_scene below calls all the functions introduced above. The main workflow is: pairwise global registration -> multiway registration.

def make_posegraph_for_scene(ply_file_names, config):
    pose_graph = o3d.pipelines.registration.PoseGraph()
    odometry = np.identity(4)
    pose_graph.nodes.append(
        o3d.pipelines.registration.PoseGraphNode(odometry))

    # matching_results maps each fragment pair (s, t) to a small record with
    # fields s, t, success, transformation, and information; its construction
    # is omitted here.
    ...

    if config["python_multi_threading"]:
        # Register all pairs in parallel with joblib.
        from joblib import Parallel, delayed
        import multiprocessing
        MAX_THREAD = min(multiprocessing.cpu_count(),
                         max(len(matching_results), 1))
        results = Parallel(n_jobs=MAX_THREAD)(delayed(
            register_point_cloud_pair)(ply_file_names, matching_results[r].s,
                                       matching_results[r].t, config)
                                              for r in matching_results)
        for i, r in enumerate(matching_results):
            matching_results[r].success = results[i][0]
            matching_results[r].transformation = results[i][1]
            matching_results[r].information = results[i][2]
    else:
        for r in matching_results:
            (matching_results[r].success, matching_results[r].transformation,
                    matching_results[r].information) = \
                    register_point_cloud_pair(ply_file_names,
                    matching_results[r].s, matching_results[r].t, config)

    # Collect every successful pairwise result into the global pose graph.
    for r in matching_results:
        if matching_results[r].success:
            (odometry, pose_graph) = update_posegraph_for_scene(
                matching_results[r].s, matching_results[r].t,
                matching_results[r].transformation,
                matching_results[r].information, odometry, pose_graph)
    o3d.io.write_pose_graph(
        join(config["path_dataset"], config["template_global_posegraph"]),
        pose_graph)


def run(config):
    print("register fragments.")
    o3d.utility.set_verbosity_level(o3d.utility.VerbosityLevel.Debug)
    ply_file_names = get_file_list(
        join(config["path_dataset"], config["folder_fragment"]), ".ply")
    make_clean_folder(join(config["path_dataset"], config["folder_scene"]))
    make_posegraph_for_scene(ply_file_names, config)
    optimize_posegraph_for_scene(config["path_dataset"], config)

Results

The pose graph optimization outputs the following messages:

[GlobalOptimizationLM] Optimizing PoseGraph having 14 nodes and 42 edges.
Line process weight : 55.885667
[Initial     ] residual : 7.791139e+04, lambda : 1.205976e+00
[Iteration 00] residual : 6.094275e+02, valid edges : 22, time : 0.001 sec.
[Iteration 01] residual : 4.526879e+02, valid edges : 22, time : 0.000 sec.
[Iteration 02] residual : 4.515039e+02, valid edges : 22, time : 0.000 sec.
[Iteration 03] residual : 4.514832e+02, valid edges : 22, time : 0.000 sec.
[Iteration 04] residual : 4.514825e+02, valid edges : 22, time : 0.000 sec.
Current_residual - new_residual < 1.000000e-06 * current_residual
[GlobalOptimizationLM] total time : 0.003 sec.
[GlobalOptimizationLM] Optimizing PoseGraph having 14 nodes and 35 edges.
Line process weight : 60.762800
[Initial     ] residual : 6.336097e+01, lambda : 1.324043e+00
[Iteration 00] residual : 6.334147e+01, valid edges : 22, time : 0.000 sec.
[Iteration 01] residual : 6.334138e+01, valid edges : 22, time : 0.000 sec.
Current_residual - new_residual < 1.000000e-06 * current_residual
[GlobalOptimizationLM] total time : 0.001 sec.
CompensateReferencePoseGraphNode : reference : 0

There are 14 fragments, and the initial pose graph has 42 edges (matching pairs) among them. The first pass of optimization detects 7 of these edges as false positives. After they are pruned, pose graph optimization runs again on the remaining 35 edges to achieve tight alignment.