Image Compression Performance Using Total Variation Minimization and Noiselet Transform

Final Report Presentation

Final Report PDF

I worked on this project for a class at Missouri S&T called Machine Vision. For the project I recreated some of the work done in [1], comparing image compression techniques to illustrate the benefits of compressive sampling. The technique demonstrated in [1] showed that random noiselet measurements can be used for compression with l1-minimization recovery, and that this approach can actually yield better visual results than the typical discrete cosine transform alone. Oddly enough, I found that the benefits of this technique did not hold when tested on images other than the demonstration image used in the article. I was certainly surprised by this result!
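The core idea of compressive sampling can be sketched in a few lines: take far fewer random linear measurements than signal samples, then recover the signal by l1 minimization, which prefers sparse solutions. Below is a minimal illustration using ISTA (iterative soft-thresholding), a standard proximal-gradient solver for the l1 problem. A random Gaussian matrix stands in for the noiselet measurements used in the project, and all names and parameters here are illustrative, not the article's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse test signal: n samples, only k nonzero.
n, m, k = 200, 80, 10
x_true = np.zeros(n)
support = rng.choice(n, k, replace=False)
x_true[support] = rng.standard_normal(k)

# Random measurement matrix (stand-in for a noiselet ensemble)
# and the compressed measurements: m < n, so A x = y is underdetermined.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

def ista(A, y, lam=0.01, iters=5000):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    t = 1.0 / np.linalg.norm(A, 2) ** 2          # step size 1/L, L = Lipschitz const.
    for _ in range(iters):
        x = x - t * A.T @ (A @ x - y)            # gradient step on the quadratic term
        x = np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)  # soft-threshold (l1 prox)
    return x

x_hat = ista(A, y)
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative recovery error: {err:.3f}")
```

Even with only 80 measurements of a length-200 signal, the l1 solver recovers the sparse signal almost exactly, which is the phenomenon the article exploits for imaging.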

[1] J. Romberg, "Imaging via Compressive Sampling," IEEE Signal Processing Magazine, vol. 25, no. 2, pp. 14-20, 2008, doi:10.1109/MSP.2007.914729.


Performance Comparison of K-Means and Expectation Maximization with Gaussian Mixture Models for Clustering

Final Report Presentation

Final Report PDF

I created this as a final project for a course at Missouri S&T called Statistical Decision Theory. I implemented the EM-GMM algorithm [1] in MATLAB and compared the results with the built-in k-means function. EM-GMM is cool because it builds a generative model of the data, so you can use it for clustering if you make certain assumptions, or you can inspect the fitted model to learn something about the data without examining the raw samples directly.

It was a challenging course because I don’t have a background in communications besides the introductory courses every EE takes, but we got some freedom for the final project that I really enjoyed!

[1] A. P. Dempster, N. M. Laird, and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," Journal of the Royal Statistical Society, Series B (Methodological), pp. 1-38, 1977.