Dimensionality reduction with Multidimensional Scaling (MDS)

Hi everybody,
This past summer, I joined the IU Computer Vision Lab as a Research Assistant. It was my first research-based position, and I was pretty excited after reading the paper my professor gave me. We went straight into drawing data visualizations after doing analysis and dimensionality reduction with Multidimensional Scaling (MDS). I couldn't find the link to the actual research paper he gave me, but here is another one that talks about a similar proposal.

The Research Paper 👇:
http://vision.soic.indiana.edu/papers/activeviewing2016cogsci.pdf

I'll keep it quick here and expand later on what MDS really is. I was using MDS, a dimensionality reduction technique, to study the relationships within an image dataset. In my case, the dataset contained over 2,000 random landscape shots and was obtained from Kaggle.

The initial approach was to extract raw pixel values from the images and make MDS plots from those. Later on, we moved to a pre-trained ResNet18 model to extract deeply learned features and studied those with an MDS plot as well. The journey went pretty well, and some of the resulting images are below:

MDS was performed on the dataset by extracting pixel values.
MDS was performed on the dataset by extracting deeply learned feature values.
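The pixel-value version of that pipeline can be sketched roughly like this with scikit-learn. The random array is a stand-in for the Kaggle landscape images (in the real pipeline each row would be an image resized to a fixed size and flattened), and the 64×64 size is just an assumption for illustration:

```python
import numpy as np
from sklearn.manifold import MDS

# Stand-in for the landscape dataset: in practice each row would be a
# Kaggle image resized to a fixed size (e.g. 64x64 RGB) and flattened.
rng = np.random.default_rng(0)
n_images = 50                      # the real dataset had 2,000+ images
pixels = rng.random((n_images, 64 * 64 * 3))

# Metric MDS: find 2-D coordinates whose pairwise distances match the
# pairwise Euclidean distances between the flattened pixel vectors.
mds = MDS(n_components=2, dissimilarity="euclidean", random_state=0)
embedding = mds.fit_transform(pixels)

print(embedding.shape)  # (50, 2): one (x, y) point to plot per image
```

Each row of `embedding` becomes one point in the scatter plots above, so images with similar pixel content land near each other.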

Both visualizations look pretty similar, probably because the main clustering is driven by the colors of the images. I'm not sure yet, and we're working further to identify better patterns in our dataset. We did try another technique called t-SNE, which displayed somewhat better results, but more on that in another blog post.

For now, coming back to what MDS really is, from a less math-intensive POV. MDS is closely related to PCA (Principal Component Analysis); for those who don't know about PCA, I'll post a link to a helpful YT video for it. PCA also performs dimensionality reduction, but it does so by finding the directions that capture the most variance (the strongest linear correlations) in the dataset. MDS, on the other hand, works by placing points in the low-dimensional plot so that their pairwise distances match the pairwise distances in the original data as closely as possible. When MDS uses plain Euclidean distances, the result comes out essentially the same as PCA, which is why the two plots can look alike. The real strength of MDS is that it isn't tied to the Euclidean distance formula: you can feed it any dissimilarity measure, for example the log fold change used for gene-expression data in the video below, and that can produce a very different graph than PCA. If you want to learn more about MDS, I'll drop another link to a helpful YT video.
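A quick way to see the relationship is to run both techniques on the same data: metric MDS with Euclidean distances produces a layout very close to PCA's (up to rotation and reflection), while a non-Euclidean dissimilarity gives MDS a picture PCA can't produce. A rough sketch with scikit-learn, using made-up toy data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS
from sklearn.metrics import pairwise_distances

rng = np.random.default_rng(0)
X = rng.random((40, 10))            # toy data: 40 samples, 10 features

# PCA: project onto the directions of maximum variance.
pca_coords = PCA(n_components=2).fit_transform(X)

# Metric MDS on the raw features uses Euclidean distances by default,
# so its layout closely matches PCA's up to rotation/reflection.
mds_coords = MDS(n_components=2, random_state=0).fit_transform(X)

# MDS also accepts any precomputed dissimilarity matrix (a Manhattan
# distance here; log fold change works the same way), which PCA cannot.
D = pairwise_distances(X, metric="manhattan")
mds_custom = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(D)

print(pca_coords.shape, mds_coords.shape, mds_custom.shape)
```

Swapping the dissimilarity matrix is the whole trick: the embedding step stays identical, only the notion of "distance" between samples changes.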

Link to the PCA video: https://www.youtube.com/watch?v=FgakZw6K1QQ
Link to the MDS video: https://www.youtube.com/watch?v=GEn-_dAyYME

Hopefully I'll write again soon :)

Danishjeet Singh

CS student at IUB, loves graphic design and web dev. Currently doing some research in Computer Vision. Check out more at singhdan.me