(Look at image on right from very close, then from far away.)
Project 1: Image Filtering and Hybrid Images
CS 143: Introduction to Computer Vision
Brief
- Due: 11:59pm on Friday, September 20th, 2013
- Stencil code: /course/cs143/asgn/proj1/code/
- Data: /course/cs143/asgn/proj1/data/
- HTML writeup template: /course/cs143/asgn/proj1/html/
- Project materials are also available in proj1.zip (1.9 MB).
- Handin: cs143_handin proj1
- Required files: README, code/, html/, html/index.html
Overview
The goal of this assignment is to write an image filtering function and use it to create hybrid images using a simplified version of the SIGGRAPH 2006 paper by Oliva, Torralba, and Schyns. Hybrid images are static images that change in interpretation as a function of the viewing distance. The basic idea is that high frequency tends to dominate perception when it is available, but, at a distance, only the low frequency (smooth) part of the signal can be seen. By blending the high frequency portion of one image with the low-frequency portion of another, you get a hybrid image that leads to different interpretations at different distances.
Details
This project is intended to familiarize you with MATLAB and image filtering. Once you have created an image filtering function, it is relatively straightforward to construct hybrid images. If you don’t already know MATLAB, you will find this tutorial on MATLAB helpful.
Image Filtering. Image filtering (or convolution) is a fundamental image processing tool. See chapter 3.2 of Szeliski and the lecture materials to learn about image filtering (specifically linear filtering). MATLAB has numerous efficient built-in functions to perform image filtering, but you will be writing your own such function from scratch for this assignment. More specifically, you will implement my_imfilter(), which imitates the default behavior of the built-in imfilter() function. As specified in my_imfilter.m, your filtering algorithm must (1) support grayscale and color images, (2) support arbitrarily shaped filters, as long as both dimensions are odd (e.g., 7×9 filters but not 4×5 filters), (3) pad the input image with zeros or reflected image content, and (4) return a filtered image which is the same resolution as the input image. We have provided a script, proj1_test_filtering.m, to help you debug your image filtering algorithm.
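As a rough illustration of the required behavior (not the provided stencil itself), a zero-padded filtering loop might look like the sketch below; the function name my_imfilter_sketch, the zero-padding choice, and the loop structure are assumptions for illustration, and your my_imfilter.m should follow the interface defined in the stencil.

% Minimal sketch of a zero-padded linear filter (illustrative, not the stencil).
% Assumes image is a grayscale or RGB double array and filter has odd dimensions.
function output = my_imfilter_sketch(image, filter)
    [fh, fw] = size(filter);
    ph = (fh - 1) / 2;                     % padding rows on each side
    pw = (fw - 1) / 2;                     % padding columns on each side
    output = zeros(size(image));
    for c = 1:size(image, 3)               % treat each color channel independently
        padded = padarray(image(:, :, c), [ph pw], 0);   % zero padding
        for i = 1:size(image, 1)
            for j = 1:size(image, 2)
                window = padded(i:i+fh-1, j:j+fw-1);
                output(i, j, c) = sum(sum(window .* filter));   % dot product of window and filter
            end
        end
    end
end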
Hybrid Images. A hybrid image is the sum of a low-pass filtered version of one image and a high-pass filtered version of a second image. There is a free parameter, which can be tuned for each image pair, which controls how much high frequency to remove from the first image and how much low frequency to leave in the second image. This is called the "cutoff-frequency". In the paper it is suggested to use two cutoff frequencies (one tuned for each image), and you are free to try that as well. In the starter code, the cutoff frequency is controlled by changing the standard deviation of the Gaussian filter used in constructing the hybrid images.
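Once my_imfilter() works, the hybrid construction itself is only a few lines. The sketch below is a hedged illustration rather than the starter code: the cutoff_frequency value, the filter size, and the variable names image1/image2 are assumptions.

% Hedged sketch of hybrid image construction (values and names are illustrative).
cutoff_frequency = 7;                                     % std. dev. of the Gaussian, in pixels (assumed value)
filter = fspecial('Gaussian', cutoff_frequency*4+1, cutoff_frequency);

low_frequencies  = my_imfilter(image1, filter);           % keep only the smooth content of image1
high_frequencies = image2 - my_imfilter(image2, filter);  % remove the smooth content of image2
hybrid_image     = low_frequencies + high_frequencies;

figure; imshow(high_frequencies + 0.5);                   % high-pass image is roughly zero-mean; shift for display
figure; imshow(hybrid_image);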
We provide you with 5 pairs of aligned images which can be merged reasonably well into hybrid images. The alignment is important because it affects the perceptual grouping (read the paper for details). We encourage you to create additional examples (e.g., change of expression, morph between different objects, change over time, etc.). See the hybrid images project page for some inspiration.
For the example shown at the top of the page, the two original images look like this:
The low-pass (blurred) and high-pass versions of these images look like this:
The high frequency image is actually zero-mean with negative values so it is visualized by adding 0.5. In the resulting visualization, bright values are positive and dark values are negative.
Adding the high and low frequencies together gives you the image at the top of this page. If you're having trouble seeing the multiple interpretations of the image, a useful way to visualize the effect is by progressively downsampling the hybrid image as is done below:
The starter code provides a function vis_hybrid_image.m to save and display such visualizations.
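The provided vis_hybrid_image.m is what you should actually use; purely as an illustration of the idea, a progressive-downsampling visualization could be sketched roughly as follows (the number of scales, the 0.5 scale factor, and the white padding are assumptions).

% Rough sketch of a progressive-downsampling visualization (the provided
% vis_hybrid_image.m may differ in details).
scales = 5;
cur = hybrid_image;
output = cur;
for s = 2:scales
    cur = imresize(cur, 0.5, 'bilinear');                               % halve the resolution each step
    pad = ones(size(output,1) - size(cur,1), size(cur,2), size(cur,3)); % white padding above the small copy
    output = cat(2, output, cat(1, pad, cur));                          % append the smaller copy to the right
end
figure; imshow(output);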
Potentially useful MATLAB functions: fspecial(), padarray(), and the operators in the MATLAB tutorial which make it efficient to cut out image subwindows and do the convolution (dot product) between them.
Forbidden functions you can use for testing, but not in your final code: imfilter(), filter2(), conv2(), nlfilter(), colfilt().
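Although the functions above are forbidden in your final my_imfilter code, they are handy as a reference while testing. A quick sanity check might look like the sketch below; the image path and filter parameters are assumptions for illustration.

% Sketch of a correctness check against the built-in imfilter()
% (allowed for testing only, not inside my_imfilter itself).
test_image  = im2double(imread('../data/cat.bmp'));      % assumed file name in the data directory
blur_filter = fspecial('Gaussian', [7 7], 2);            % arbitrary odd-sized test filter

mine   = my_imfilter(test_image, blur_filter);
theirs = imfilter(test_image, blur_filter, 0);           % zero padding, imfilter's default correlation

fprintf('max abs difference: %g\n', max(abs(mine(:) - theirs(:))));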
Bells & Whistles (Extra Points)
For later projects there will be more concrete extra credit suggestions. It is possible to get extra credit for this project as well, if you come up with some clever extensions which impress the TAs.
Writeup
For this project, and all other projects, you must do a project report in HTML. We provide you with a placeholder .html document which you can edit. In the report you will describe your algorithm and any decisions you made to write your algorithm a particular way. Then you will show and discuss the results of your algorithm. In the case of this project, show the results of your filtering algorithm (the test script saves such images already) and show some of the intermediate images in the hybrid image pipeline (e.g., the low and high frequency images, which the starter code already saves for you). Also, discuss anything extra you did. Feel free to add any other information you feel is relevant.
Rubric
- +50 pts: Working implementation of image filtering in my_imfilter.m
- +30 pts: Working hybrid image generation
- +20 pts: Writeup with several examples of hybrid images
- +10 pts: Extra credit (up to ten points)
- -5*n pts: Lose 5 points for every time (after the first) you do not follow the instructions for the hand-in format
Web-Publishing Results
All the results for each project will be put on the course website so that the students can see each other's results. The professor and TA will select "winning" projects that impress us, and there will be in-class presentations for these projects. If you do not want your results published to the web, you can choose to opt out. If you want to opt out, email cs143tas[at]cs.brown.edu saying so.
Handing in
This is very important, as you will lose points if you do not follow instructions. Every time after the first that you do not follow instructions, you will lose 5 points. The folder you hand in must contain the following:
- README - text file containing anything about the project that you want to tell the TAs
- code/ - directory containing all your code for this assignment
- html/ - directory containing your HTML report for this assignment (including images). Only this folder will be published to the course web page, so your webpage cannot contain pointers to images in other folders of your handin.
- html/index.html - home page for your results
Then run: cs143_handin proj1
If it is not in your path, you can run it directly: /course/cs143/bin/cs143_handin proj1
Credits
Assignment developed by James Hays based on a similar project by Derek Hoiem.
from: http://cs.brown.edu/courses/cs143/proj1/