Fall 2024 CS543/ECE549
Assignment 2: Scale-Space Blob Detection
Due date: Wednesday, October 9, 11:59:59 PM
The goal of this assignment is to implement a Laplacian blob detector as discussed in this lecture.
Data and starter code
This zip file contains starter code, four test images, and sample output images for reference. Keep in mind that your output may look different depending on your
threshold, range of scales, and other implementation details. In addition
to the images provided, also run your code on at least four images of your own
choosing.
Algorithm outline
- Generate a Laplacian of Gaussian filter.
- Build a Laplacian scale space, starting with some initial scale and going for n iterations:
  - Filter the image with the scale-normalized Laplacian at the current scale.
  - Save the square of the Laplacian response for the current level of the scale space.
  - Increase the scale by a factor k.
- Perform non-maximum suppression in scale space.
- Display the resulting circles at their characteristic scales.
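The loop in the outline above can be sketched in a few lines of NumPy/SciPy. This is a minimal sketch, not the required implementation; the function name and the default values of sigma0, k, and n are placeholders you should tune:

```python
import numpy as np
from scipy import ndimage

def laplacian_scale_space(img, sigma0=2.0, k=1.25, n=10):
    """Build a scale space of squared, scale-normalized Laplacian responses."""
    h, w = img.shape
    scale_space = np.empty((h, w, n))
    sigma = sigma0
    for i in range(n):
        # Multiplying by sigma**2 gives the scale-normalized Laplacian.
        response = (sigma ** 2) * ndimage.gaussian_laplace(img, sigma)
        scale_space[:, :, i] = response ** 2  # save the squared response
        sigma *= k                            # increase scale by a factor k
    return scale_space
```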
Detailed instructions
- As always, don't forget to convert images to grayscale and rescale the intensities between 0 and 1.
- To create the Laplacian filter, use the scipy.ndimage.gaussian_laplace function (in older SciPy releases, scipy.ndimage.filters.gaussian_laplace). Pay careful attention to setting the right filter mask size.
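If you need an explicit filter mask (for example, to visualize it or to convolve it yourself), one common trick is to apply gaussian_laplace to an impulse image. This is a sketch, and the mask half-width of 3*sigma is a rule of thumb, not a requirement of the assignment:

```python
import numpy as np
from scipy import ndimage

def log_kernel(sigma):
    # Rule of thumb: cover +/- 3 sigma so the mask captures essentially
    # all of the filter's support.
    half = int(np.ceil(3 * sigma))
    size = 2 * half + 1
    impulse = np.zeros((size, size))
    impulse[half, half] = 1.0
    # Filtering an impulse recovers the (scale-normalized) LoG kernel.
    return (sigma ** 2) * ndimage.gaussian_laplace(impulse, sigma)
```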
- It is relatively inefficient to repeatedly filter the image with a kernel of increasing size. Instead of increasing the kernel size by a factor of k, you should downsample the image by a factor 1/k. In that case, you will have to upsample the result or do some interpolation in order to find maxima in scale space. For full credit, you need to produce both implementations: one that increases the filter size, and one that downsamples the image. In your report, list the running times for both versions of the algorithm and discuss any differences in the detector output. For timing, use time.time().
  Hint 1: Consider whether you still need to scale-normalize the filter when you downsample the image.
  Hint 2: Use skimage.transform.resize to help preserve the intensity values of the array.
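One possible shape for the downsampling variant is sketched below. It assumes that filtering a 1/k^i-size image with a fixed sigma0 approximates filtering the original with sigma0 * k^i, and that (per Hint 1) no per-level sigma^2 normalization is needed because the filter never changes; the function name and defaults are illustrative:

```python
import numpy as np
from scipy import ndimage
from skimage.transform import resize

def scale_space_downsample(img, sigma0=2.0, k=1.25, n=10):
    """Keep the filter fixed and shrink the image instead of growing the kernel."""
    h, w = img.shape
    scale_space = np.empty((h, w, n))
    current = img
    for i in range(n):
        # Fixed filter: the normalization factor is the same at every level.
        response = ndimage.gaussian_laplace(current, sigma0) ** 2
        # Upsample the response back to the original resolution so that
        # maxima can be compared across scales.
        scale_space[:, :, i] = resize(response, (h, w))
        # Shrink the image by 1/k for the next level.
        current = resize(current, (int(h / k ** (i + 1)), int(w / k ** (i + 1))))
    return scale_space
```

Wrapping each version in time.time() calls before and after gives the running times the report asks for.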
- You have to choose the initial scale, the factor k by which the scale is multiplied each time, and the number of levels in the scale space. I typically set the initial scale to 2 and use 10 to 15 levels in the scale pyramid. The multiplication factor should depend on the largest scale at which you want regions to be detected.
- You may want to use a three-dimensional array to represent your scale space. It would be declared as follows:
  scale_space = numpy.empty((h, w, n))  # h, w: dimensions of the image; n: number of levels in the scale space
  Then scale_space[:, :, i] would give you the i-th level of the scale space. Alternatively, if you are storing different levels of the scale pyramid at different resolutions, you may want to use a NumPy object array, where each "slot" can accommodate a different data type or a matrix of different dimensions. Here is how you would use it:
  scale_space = numpy.empty(n, dtype=object)  # creates an object array with n "slots"
  scale_space[i] = my_matrix  # store a matrix at level i
- To perform non-maximum suppression in scale space, you should first do non-maximum suppression in each 2D slice separately. For this, you may find the functions scipy.ndimage.rank_filter or scipy.ndimage.generic_filter useful (in older SciPy releases they live under scipy.ndimage.filters). Play around with these functions, and try to find the one that works the fastest. To extract the final nonzero values (corresponding to detected regions), you may want to use the numpy.nonzero function.
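One simple way to combine the two suppression steps is sketched below: rank_filter with rank=-1 acts as a 2D max filter on each slice, and a point survives only if it equals the maximum of those filtered slices over all levels. This compares against all levels rather than just adjacent ones, which is one common variant, not the only valid one:

```python
import numpy as np
from scipy import ndimage

def nms_scale_space(scale_space, size=3):
    """Keep only points that are local maxima in both space and scale."""
    n = scale_space.shape[2]
    # rank=-1 selects the neighborhood maximum, i.e. a 2D max filter.
    max2d = np.stack([ndimage.rank_filter(scale_space[:, :, i], rank=-1,
                                          size=size) for i in range(n)], axis=2)
    # A point survives if its own response equals the largest 2D-filtered
    # response at that pixel across every level of the scale space.
    overall = max2d.max(axis=2, keepdims=True)
    return np.where(scale_space == overall, scale_space, 0)
```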
- You also have to set a threshold on the squared Laplacian response above which to report region detections. You should play around with different values and choose the one you like best. To extract values above the threshold, you could use the numpy.where function.
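A minimal thresholding helper along those lines (a sketch; the function name is illustrative, and sigmas is assumed to be the list of scales used to build the scale space):

```python
import numpy as np

def threshold_detections(scale_space, sigmas, threshold):
    """Return (rows, cols, scales) of responses above the threshold."""
    rows, cols, levels = np.where(scale_space > threshold)
    # Map each surviving level index back to its scale value.
    return rows, cols, np.asarray(sigmas)[levels]
```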
- To display the detected regions as circles, you can use this function (or feel free to search for a suitable Python function or write your own).
  Hint: Don't forget that there is a multiplication factor that relates the scale at which a region is detected to the radius of the circle that most closely "approximates" the region.
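If you end up writing your own drawing helper, a matplotlib sketch might look like the following. The function name and signature are illustrative, and the sqrt(2) factor in the comment is the standard relation between detection scale and circle radius for a 2D blob:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this sketch runs headless
import matplotlib.pyplot as plt

def show_all_circles(image, cx, cy, rad, color='r'):
    """Draw a circle of radius rad[i] centered at (cx[i], cy[i]) over image."""
    fig, ax = plt.subplots()
    ax.imshow(image, cmap='gray')
    for x, y, r in zip(cx, cy, rad):
        ax.add_patch(plt.Circle((x, y), r, color=color, fill=False))
    return fig

# For a region detected at scale sigma, the radius of the circle that
# best "approximates" it is r = sqrt(2) * sigma.
```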

- Implement the difference-of-Gaussian pyramid as mentioned in class and described in David Lowe's paper. Compare the results and the running time to the direct Laplacian implementation.
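A sketch of the difference-of-Gaussian variant is below. It relies on the standard approximation G(k*sigma) - G(sigma) ≈ (k - 1) * sigma^2 * LoG(sigma), so the differences of adjacent blurred images are already approximately scale-normalized; the function name and defaults are placeholders:

```python
import numpy as np
from scipy import ndimage

def dog_scale_space(img, sigma0=2.0, k=1.25, n=10):
    """Approximate the scale-normalized Laplacian with Gaussian differences."""
    sigmas = [sigma0 * k ** i for i in range(n + 1)]
    # One Gaussian blur per scale; each level reuses two of them.
    blurred = [ndimage.gaussian_filter(img, s) for s in sigmas]
    # Squared differences of adjacent blur levels form the scale space.
    return np.stack([(blurred[i + 1] - blurred[i]) ** 2
                     for i in range(n)], axis=2)
```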
- Implement the affine adaptation step to turn circular blobs into ellipses as shown in the lecture (just one iteration is sufficient). The selection of the correct window function is essential here. You should use a Gaussian window that is a factor of 1.5 or 2 larger than the characteristic scale of the blob. Note that the lecture slides show how to find the relative shape of the second moment ellipse, but not the absolute scale (i.e., the axis lengths are defined up to some arbitrary constant multiplier). A good choice for the absolute scale is to set the sum of the major and minor axis half-lengths to the diameter of the corresponding Laplacian circle. To display the resulting ellipses, you should modify the circle-drawing function or look for a better function in the matplotlib documentation or on the Internet.
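One possible shape for a single adaptation iteration is sketched below: build the second moment matrix in a Gaussian window 1.5x the blob scale, take its eigendecomposition for the relative axes, then rescale so the half-lengths sum to the diameter 2*sqrt(2)*sigma of the Laplacian circle. The function name is illustrative, and the code favors clarity over speed (it filters the whole image per blob):

```python
import numpy as np
from scipy import ndimage

def blob_ellipse(img, y, x, sigma, window_factor=1.5):
    """One affine-adaptation iteration for a blob at (y, x) with scale sigma."""
    Ix = ndimage.sobel(img, axis=1)
    Iy = ndimage.sobel(img, axis=0)
    # Second moment matrix entries, averaged with a Gaussian window that is
    # window_factor times larger than the blob's characteristic scale.
    w = window_factor * sigma
    Sxx = ndimage.gaussian_filter(Ix * Ix, w)[y, x]
    Syy = ndimage.gaussian_filter(Iy * Iy, w)[y, x]
    Sxy = ndimage.gaussian_filter(Ix * Iy, w)[y, x]
    M = np.array([[Sxx, Sxy], [Sxy, Syy]])
    evals, evecs = np.linalg.eigh(M)
    # Relative axis lengths go as 1/sqrt(eigenvalue); fix the absolute scale
    # so the two half-lengths sum to the diameter of the Laplacian circle.
    rel = 1.0 / np.sqrt(np.maximum(evals, 1e-12))
    half_lengths = rel * (2 * np.sqrt(2) * sigma) / rel.sum()
    return half_lengths, evecs
```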
- The Laplacian has a strong response not only at blobs, but also along edges. However, recall from the lecture that edge points are not "repeatable". So, implement an additional thresholding step that computes the Harris response at each detected Laplacian region and rejects the regions that have only one dominant gradient orientation (i.e., regions along edges). If you have implemented the affine adaptation step, these would be the regions whose characteristic ellipses are close to being degenerate (i.e., one of the eigenvalues is close to zero). Show both "before" and "after" detection results.
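A sketch of the Harris score computed over the whole image (threshold it at your detected locations); the sigma and alpha defaults are conventional placeholders, not values from the handout:

```python
import numpy as np
from scipy import ndimage

def harris_response(img, sigma=1.0, alpha=0.05):
    """Harris cornerness det(M) - alpha * trace(M)^2 from the second moment
    matrix M. Points with one dominant gradient orientation (edges) get a
    low or negative score and can be rejected by thresholding."""
    Ix = ndimage.sobel(img, axis=1)
    Iy = ndimage.sobel(img, axis=0)
    Sxx = ndimage.gaussian_filter(Ix * Ix, sigma)
    Syy = ndimage.gaussian_filter(Iy * Iy, sigma)
    Sxy = ndimage.gaussian_filter(Ix * Iy, sigma)
    det = Sxx * Syy - Sxy ** 2
    trace = Sxx + Syy
    return det - alpha * trace ** 2
```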
Submission Instructions
As in Assignment 1, you must upload the following files to Canvas:
- Your code and output images in a single zip file. The filename should be netid_a2.zip.
- Your report with all your results and discussion, following this template. The filename should be netid_a2.pdf.
Note that the images in the zip file are for backup documentation only, in case we cannot see the images in your PDF report clearly enough. You will not receive
credit for any output images that are part of the zip file but are not shown directly in the report PDF.
Please refer to course policies on academic honesty, collaboration, late days, etc.