%% Probe Object Detection Model Creation
% Associate user-provided measurements of the probe with points marked on
% the image of the probe, such that the image colours can be used to detect
% the probe in other images.
%
% ## Usage
% Modify parameters and paths to input data in the first code section
% below, then run.
%
% ## Probe Design Assumptions
% The probe is a cylindrically-symmetric object composed of conical or
% cylindrical segments. Ideally, it is perfectly straight. (Some thin
% objects, such as wooden rods, may appear straight, but actually be
% curved, and therefore not cylindrically symmetric.) It may taper to a
% point at one or both ends, although only one end will be used as the
% active tip.
%
% The length of the probe should be divided into different-coloured
% bands, and the pattern of bands should be asymmetrical, such that it is
% possible to uniquely determine the orientation of the probe, even if
% some of the probe is occluded. Specifically, asymmetry of band lengths
% is assumed, as opposed to colour pattern asymmetry. Band edges should
% be perpendicular to the probe's axis of cylindrical symmetry. It should
% not be possible for the probe to self-occlude any of the edges between
% bands.
%
% At each junction between two coloured bands, the two bands should have
% the same width.
%
% Note: All bands should have strong colours. Those with unsaturated
% colours can be given colour labels of zero, such that they will be
% ignored. Unsaturated colours cannot be reliably detected, and will also
% interfere with the correct detection of the other colours.
%
% ## Input
%
% ### Probe measurements
% A '.mat' file containing a structure called 'probe' with the following
% fields:
% - lengths: A vector of distances of edges of bands from the active end of
% the probe, including a distance of 0.0 for the active end of the
% probe, and a distance for the other end of the probe (i.e. the length
% of the entire probe). Distances are measured along the probe's axis
% of cylindrical symmetry, and are listed in order starting from the
% active end of the probe.
% - colors: A vector with `length(lengths) - 1` elements specifying colour
% indices for the bands of the probe. Colour indices for colours that
% are to be detected should be consecutive integers starting at 1.
% Colours given indices of zero will be ignored. Indices indicate how
% the bands of the probe are grouped based on mutually-distinguishable
% colours, allowing bands to have non-unique colours. The specific
% index assigned to a given band is unimportant. For instance, the
% first band need not have a colour index of 1. The elements of
% `colors` should correspond to adjacent pairs of elements in `lengths`
% (i.e. the colour indices should be in order starting from the active
% end of the probe).
% - widths: Width (diameter) of the probe at the edges of bands. Widths
% must include the ends of the probe, with values of zero or
% approximate tip diameters for ends that taper to points. (Probe tip
% widths are currently unused, so their values do not matter.) The
% elements of 'widths' should correspond to elements of 'lengths'.
%
% Units are arbitrary, but should be consistent with the units of any
% partial reconstruction of an object that the probe is used to refine, and
% with the units used for camera calibration.
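%
% For example, a hypothetical probe with three coloured bands, a pointed
% active tip, a total length of 120 units, and band edges at 40 and 85
% units from the tip could be described as follows (all values are
% illustrative, and 'probeMeasurements.mat' is a made-up filename):
%
% ```
% probe.lengths = [0, 40, 85, 120];
% probe.colors = [1, 2, 1]; % the first and last bands share a colour
% probe.widths = [0, 8, 8, 8]; % zero width at the tapered active tip
% save('probeMeasurements.mat', 'probe');
% ```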
%
% ### Image of probe
% An RGB image in any format that can be loaded by `imread` showing the
% entire portion of the probe that the user has provided measurements for.
% The image should depict the probe under lighting conditions that are
% similar to the detection scenario, to facilitate colour-based detection.
%
% The image should not contain significant distortion, or it will be
% difficult to automatically determine the orientation (forwards vs.
% backwards) of the probe in the image.
%
% ### Annotations for probe image
% An image in any format that can be loaded by `imread` with an alpha
% channel that is nonzero at user-marked interest points, and zero
% everywhere else. The image should be based on the same image of the probe
% provided above.
%
% Interest points can be marked by nonzero alpha channel regions that are
% more than one pixel in size; a morphological shrink operation will be
% used to extract single-pixel locations from them. Optionally (depending
% on the parameter variable `annotation_corner_search_width` below), a
% corner feature detector will then be used to refine the locations of
% interest points within a search window.
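%
% As a rough sketch of the shrink step (a simplifying assumption about
% the behaviour of `extractInterestPoints`, not its actual code):
%
% ```
% annotation_mask = Ialpha > 0;
% point_mask = bwmorph(annotation_mask, 'shrink', Inf);
% [rows, cols] = find(point_mask);
% points = [cols, rows]; % one (x, y) location per marked region
% ```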
%
% A single interest point should be marked for each end of the probe that
% tapers to a point. Two interest points should be marked for the edges of
% coloured bands on the probe, corresponding to their intersections with
% the probe's contour in the image.
%
% ### Colour noise parameters
% A '.mat' file containing a 'rgb_sigma_polyfit' variable, as output by the
% script '.\EstimateRGBStandardDeviations.m'. 'rgb_sigma_polyfit' describes
% the variation in RGB channel standard deviations with RGB values in the
% image. This information should be computed from images taken under the
% same conditions and with the same camera parameters as the image used for
% probe modeling, if not computed from the same image used for probe
% modeling.
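%
% As a hedged sketch only: if 'rgb_sigma_polyfit' stores one
% polyval-compatible coefficient row per RGB channel (an assumption here;
% the actual layout is defined by 'EstimateRGBStandardDeviations.m'), the
% noise model could be queried as:
%
% ```
% red_value = 0.5; % a normalized red channel value
% sigma_red = polyval(rgb_sigma_polyfit(1, :), red_value);
% ```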
%
% ## Output
%
% ### Probe detection model
% A '.mat' file containing the following variables:
% - 'probe': A copy of the 'probe' variable loaded from the probe
% measurements file, for convenience.
% - 'probe_color_distributions_kernel': Discretized variable kernel density
% estimators of image hue values corresponding to the different coloured
% bands on the probe, in the same order (starting from the active tip of
% the probe). The i-th column of this 2D array stores the estimator for
% the i-th colour class of probe segments.
% - 'probe_color_classifier_kernel': A classifier for image hue values,
% created from 'probe_color_distributions_kernel' using
% `mlDiscreteClassifier`. A uniform hue distribution is assumed for the
% image background colours.
% - 'probe_color_distributions_gaussian': The equivalent of
% 'probe_color_distributions_kernel', using Gaussian density estimators
% instead of variable kernel density estimators.
% - 'probe_color_classifier_gaussian': A classifier for image hue values,
% created from 'probe_color_distributions_gaussian' using
% `mlDiscreteClassifier`. A uniform hue distribution is assumed for the
% image background colours.
%
% Additionally, the output file contains the values of all parameters in
% the first section of the script below, for reference. (Specifically,
% those listed in `parameters_list`, which should be updated if the set of
% parameters is changed.)
%
% ## Notes
% - To find the approximate value of the i-th hue estimator at a query
% value 'x', where 'x' must be in the range of hues ([0, 1]), use:
%
% ```
% queryDiscretized1DFunction(...
% x, probe_color_distributions_*(:, i),...
% hueSamplingParams(probe_color_distributions_*(:, i))...
% )
% ```
%
% ## References
% - T. Gevers and H. Stokman. "Robust Histogram Construction from Color
% Invariants for Object Recognition". IEEE Transactions on Pattern
% Analysis and Machine Intelligence, vol. 26, no. 1, pp. 113-118, Jan.
% 2004.
%
% Bernard Llanos
% Spring 2016 research assistantship supervised by Dr. Y.H. Yang
% University of Alberta, Department of Computing Science
% File created July 26, 2016
%% Input data and parameters
% List of parameters to save with results
parameters_list = {
'model_filename',...
'I_filename',...
'I_annotations_filename',...
'rgb_sigma_filename',...
'annotation_corner_search_width',...
'point_alignment_outlier_threshold',...
'subject_gap_cost',...
'query_gap_cost',...
'affine_weight',...
'saturation_threshold',...
'probe_color_distribution_resolution'
};
% Probe measurements
model_filename = '';
% Image of probe
I_filename = '';
% Annotations for image of probe
I_annotations_filename = '';
% RGB noise parameters
rgb_sigma_filename = '';
% Annotation extraction parameters
annotation_corner_search_width = 0; % Set to zero to use centers of user-marked annotations as opposed to nearby corner features
% Parameters for interpreting annotated points
point_alignment_outlier_threshold = 5;
% Parameters for matching annotated points with the probe measurements
subject_gap_cost = -0.1;
query_gap_cost = 0;
affine_weight = 0;
% Saturation threshold. Pixels below this saturation will be excluded from
% colour estimator creation.
saturation_threshold = 0.25;
% Number of points at which to evaluate the colour estimators over the
% hue range [0, 1]
probe_color_distribution_resolution = 180;
% Debugging tools
display_original_image = false;
display_annotations_image = false;
display_extracted_annotations = false;
display_model_from_image = true;
verbose_point_sequence_matching = false;
display_probe_band_masks = false;
display_probe_color_masks = true;
display_hue_image = false;
display_saturation_image = true;
plot_hue_estimators = true;
plot_hue_classifiers = true;
%% Load images and obtain adjusted centers of user-marked annotations
I = imread(I_filename);
image_width = size(I, 2);
image_height = size(I, 1);
image_n_channels = size(I, 3);
if image_n_channels ~= 3
error('A 3-channel RGB image is required.')
end
if display_original_image
figure; %#ok<UNRCH>
imshow(I);
title('Base image');
end
[~, ~, Ialpha] = imread(I_annotations_filename);
if display_annotations_image
figure; %#ok<UNRCH>
imshow(Ialpha);
title('User-marked annotations');
end
if annotation_corner_search_width
[ interest_points, feature_search_windows, corners ] = extractInterestPoints( Ialpha, I, annotation_corner_search_width );
else
interest_points = extractInterestPoints( Ialpha ); %#ok<UNRCH>
end
if display_extracted_annotations
figure; %#ok<UNRCH>
I_grey = rgb2gray(I);
imshow(max(I_grey, Ialpha));
hold on
plot(interest_points(:, 1), interest_points(:, 2), 'ro')
if annotation_corner_search_width
title('Final marked interest points (red o) and other corner features');
for rect = feature_search_windows'
rectangle('Position', rect, 'EdgeColor', 'b');
end
for i = 1:length(corners)
if ~isempty(corners{i})
plot(corners{i});
end
end
else
title('Extracted interest points (red o)');
end
hold off
end
%% Model the interest points as a series of bands on a linear probe object
[...
~, image_lengths, model_from_image_axes, model_from_image...
] = bilateralModel(...
interest_points, point_alignment_outlier_threshold, true...
);
if display_model_from_image
fg = figure; %#ok<UNRCH>
imshow(I);
plotBilateralModel( model_from_image, model_from_image_axes, [image_height, image_width], [], fg);
title('Classified interest points (blue, black = tips; green, red = above/below first PCA component; yellow = unmatched)');
end
% The probe tips are marked with single points. All other probe segments
% must be marked with a point along each side of the probe.
%
% Since we are performing calibration, failure to satisfy this condition is
% an error. In a detection scenario, we would just eliminate the outlier
% points and continue.
if ~isempty(model_from_image.unmatched)
error(sprintf(['Not all probe colour band junctions are marked with two points - one on each edge of the probe.\n',...
'Consider increasing the outlier detection threshold used when pairing points.'])); %#ok<SPERR>
end
%% Match model extracted from the image to the user-supplied measurements of the probe
load(model_filename, 'probe');
if ~exist('probe', 'var')
error('No variable called ''probe'' is loaded (which would contain probe measurements).')
end
image_to_measured_matches = matchProbeLengths(...
probe.lengths, image_lengths, subject_gap_cost, query_gap_cost,...
affine_weight, verbose_point_sequence_matching...
);
% Validate the matching, and flip the model extracted from the image if necessary
if any(~image_to_measured_matches)
error('Some interest points in the image were not matched to known probe measurements.')
else
image_to_measured_matches_expected = (1:length(image_to_measured_matches)).';
aligned_forward = all(image_to_measured_matches == image_to_measured_matches_expected);
aligned_reverse = all(image_to_measured_matches == flipud(image_to_measured_matches_expected));
if ~xor(aligned_forward, aligned_reverse)
error('Failed to find an ordered one-to-one mapping between interest points in the image and known probe measurements.')
elseif aligned_reverse
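% The image model is reversed relative to the measured probe: swap the
% 'head' and 'tail' fields (when present) and reverse the order of the
% side points, so that the model starts at the active tip.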
model_from_image_new = model_from_image;
if isfield(model_from_image, 'head')
model_from_image_new.tail = model_from_image.head;
elseif isfield(model_from_image_new, 'tail')
model_from_image_new = rmfield(model_from_image_new, 'tail');
end
if isfield(model_from_image, 'tail')
model_from_image_new.head = model_from_image.tail;
elseif isfield(model_from_image_new, 'head')
model_from_image_new = rmfield(model_from_image_new, 'head');
end
model_from_image_new.above = flipud(model_from_image_new.above);
model_from_image_new.below = flipud(model_from_image_new.below);
model_from_image = model_from_image_new;
end
end
%% Locate regions corresponding to probe bands and probe colours
% Express the model in pixel coordinates as polygonal sections
n_bands = length(probe.lengths) - 1;
above = model_from_image.above;
below = model_from_image.below;
model_from_image_polygons = cell(n_bands, 1);
start_offset = 0;
if isfield(model_from_image, 'head')
head = model_from_image.head;
model_from_image_polygons{1} = [
head(1:2);
above(1, 1:2);
below(1, 1:2)
];
start_offset = 1;
end
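% Each interior band is a quadrilateral bounded by consecutive pairs of
% points on the two sides of the probe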
for i = 1:(size(above, 1) - 1)
model_from_image_polygons{i + start_offset} = [
above(i:(i+1), 1:2);
below((i+1):-1:i, 1:2)
];
end
if isfield(model_from_image, 'tail')
tail = model_from_image.tail;
model_from_image_polygons{end} = [
above(end, 1:2);
tail(1:2);
below(end, 1:2);
];
end
% Obtain a mask for each band
probe_band_masks = false(image_height, image_width, n_bands);
for i = 1:n_bands
probe_band_masks(:, :, i) = roipoly(...
I, model_from_image_polygons{i}(:, 1), model_from_image_polygons{i}(:, 2)...
);
end
if display_probe_band_masks
for i = 1:n_bands %#ok<UNRCH>
figure
imshow(probe_band_masks(:, :, i));
title(sprintf('Mask for probe band %d', i))
end
end
% Obtain hue and saturation values
[H, S] = rgb2hs(I);
S_mask = (S >= saturation_threshold);
% Group bands by colour
colors_filter = (probe.colors ~= 0);
probe_colors = unique(probe.colors(colors_filter));
if probe_colors(1) ~= 1 || any(diff(probe_colors) ~= 1)
error('Probe bands must be given colour labels that are consecutive integers starting from 1.')
end
n_colors = length(probe_colors);
probe_color_masks = false(image_height, image_width, n_colors);
for i = 1:n_bands
if colors_filter(i)
probe_color_masks(:, :, probe.colors(i)) =...
probe_color_masks(:, :, probe.colors(i)) | probe_band_masks(:, :, i);
end
end
if display_probe_color_masks
for i = 1:n_colors %#ok<UNRCH>
figure
imshowpair(probe_color_masks(:, :, i), probe_color_masks(:, :, i) & S_mask, 'montage');
title(sprintf('Mask for probe colour %d (left), intersected with saturation threshold (right)', i))
end
end
for i = 1:n_colors
probe_color_masks(:, :, i) = probe_color_masks(:, :, i) & S_mask;
end
%% Create photometric invariant representations of the probe colours
if display_hue_image
figure %#ok<UNRCH>
H_color = ones(image_height, image_width, image_n_channels);
H_color(:, :, 1) = H;
H_color = hsv2rgb(H_color);
imshowpair(H, H_color, 'montage');
title('Hue channel of original image')
end
if display_saturation_image
figure %#ok<UNRCH>
imshowpair(S, S_mask, 'montage');
title(sprintf([
'Saturation channel of original image (left)',...
', thresholded (>= %g) (right)'
], saturation_threshold...
));
end
% Compute hue density estimators from probe colors
load(rgb_sigma_filename, 'rgb_sigma_polyfit');
if ~exist('rgb_sigma_polyfit', 'var')
error('No variable called ''rgb_sigma_polyfit'' is loaded (which would contain the camera RGB noise model).')
end
probe_color_distributions_kernel = zeros(...
probe_color_distribution_resolution, n_colors...
);
probe_color_distributions_gaussian = zeros(...
probe_color_distribution_resolution, n_colors...
);
I_double = im2double(I);
R = I_double(:, :, 1);
G = I_double(:, :, 2);
B = I_double(:, :, 3);
for i = 1:n_colors
probe_color_distributions_kernel(:, i) = hueVariableKernelDensityEstimator(...
H, R, G, B, probe_color_masks(:, :, i),...
rgb_sigma_polyfit, probe_color_distribution_resolution...
);
probe_color_distributions_gaussian(:, i) = hueGaussianDensityEstimator(...
H, probe_color_masks(:, :, i), probe_color_distribution_resolution...
);
end
if plot_hue_estimators
legend_names = cell(n_colors, 1); %#ok<UNRCH>
for i = 1:n_colors
legend_names{i} = sprintf('Probe color %d', i);
end
plotHueDensityEstimator(...
probe_color_distributions_kernel, legend_names...
);
title('Hue variable kernel density estimators for colors on the probe')
plotHueDensityEstimator(...
probe_color_distributions_gaussian, legend_names...
);
title('Hue Gaussian density estimators for colors on the probe')
end
%% Create a hue classifier
h_inc = hueSamplingParams( probe_color_distribution_resolution );
probe_color_classifier_kernel = mlDiscreteClassifier(...
probe_color_distributions_kernel, h_inc, 'periodic'...
);
probe_color_classifier_gaussian = mlDiscreteClassifier(...
probe_color_distributions_gaussian, h_inc, 'periodic'...
);
if plot_hue_classifiers
plotHueClassifier(...
probe_color_classifier_kernel,...
n_colors...
);
title('Hue variable kernel distribution classifier for colors on the probe')
plotHueClassifier(...
probe_color_classifier_gaussian,...
n_colors...
);
title('Hue Gaussian distribution classifier for colors on the probe')
end
%% Save results to a file
save_variables_list = [ parameters_list, {...
'probe',...
'probe_color_distributions_kernel',...
'probe_color_classifier_kernel',...
'probe_color_distributions_gaussian',...
'probe_color_classifier_gaussian'...
} ];
uisave(save_variables_list, 'probeDetectionModel')
disp('Reminder: The output model is specific to the probe, camera, and camera parameters.')
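%% Usage sketch (hypothetical)
% Downstream code might load the saved model and evaluate the kernel
% density estimator for the first colour class at a query hue of 0.5,
% following the Notes above ('probeDetectionModel.mat' is the default
% filename suggested to `uisave`):
%
% model = load('probeDetectionModel.mat');
% p = queryDiscretized1DFunction(...
%     0.5, model.probe_color_distributions_kernel(:, 1),...
%     hueSamplingParams(model.probe_color_distributions_kernel(:, 1))...
% );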