MA500 Geometric Foundations of Data Analysis
Geometry is concerned with questions of shape, size, relative position of figures, isometry, and the properties of space. It underlies much of data analysis, as can be seen from textbooks such as those by [Kendall], [Le Roux], [Kirby], [Tossdorff], [Hartmann], [Outot], [Tierny], [Edelsbrunner], [Patrangenaru], [Biau], [Wichura], [Dryden]. The recent online textbook Mathematical Foundations for Data Analysis by Jeff M. Phillips again emphasizes the importance of a geometric understanding of techniques when applying them to data analysis.
This module focuses on some geometric methods used in data analysis. It covers the geometric and algorithmic aspects of these methods, as well as their implementation as Python code on Linux computers and their application to a range of different types of data. The first half of the course emphasizes geometric aspects of classical techniques such as least squares fitting, principal component analysis, hierarchical clustering, nearest neighbour searching, and the Johnson-Lindenstrauss Theorem for dimensionality reduction. The second half of the course covers more recent techniques that have been developed over the last two or three decades, and emphasizes topological aspects as well as geometric aspects.
Part I: Classical Techniques (5 ECTS)
- Least Squares Fitting
- Principal Component Analysis
- Hierarchical Clustering and Persistence
- Nearest Neighbours and the Johnson–Lindenstrauss Theorem
Part II: Recent Techniques (5 ECTS)
- Topological Preliminaries
- Mapper Clustering
- Persistent Homology
- Fundamental Group
- Lecturers: Graham Ellis & Emil Sköldberg
Mon 10.00am, GE, ADB1020
Tue 12.00pm, GE, ADB1020
Wed 10.00am, ES, ADB1019 (or ADB1020)
Fri 2.00pm, ES, ADB1019 (or IT206)
- Tutorials: Wednesday and Friday lectures will often take the form of a tutorial, so no formal tutorials are scheduled.
- Recommended text: Part I is based on chapters from the textbook Multivariate Analysis by Sir Maurice Kendall and chapters from the online textbook Mathematical Foundations for Data Analysis by Jeff M. Phillips. Part II is based on the survey An introduction to Topological Data Analysis: fundamental and practical aspects for data scientists by Frédéric Chazal and Bertrand Michel.
- Problem sheet: available here. (A list of exam-type problems for self-study is available here.)
- Module Website: Information and module documents will be posted to this site, which is linked from the Blackboard MA500 Geometric Foundations of Data Analysis pages. Blackboard will also be used for announcements and for posting grades.
Part I will be assessed by a 2-hour written exam (52%) and three continuous assessment assignments (16% each).
Part II will be assessed by a 2-hour written exam (50%) and two continuous assessment assignments (25% each).
Each exam will consist of four questions, with full marks for four correct answers.
Each assignment will consist of a data analysis problem that needs to be tackled using the Python programming language, and submitted (by email to both lecturers) as a PDF document.
None so far.
Emil's Lecture Notes: These will be posted here.
Exam Details: A guide to what to expect on the MA500 exams can be found here.
(Click number to download notes for the lecture.)
Lecture 1: Began by explaining the terms "geometry", "data analysis", "statistics", and "probability". Then explained how to find the line y = b0 + b1x that "best fits" a collection of data points (xi, yi) for i = 1, 2, ..., n. We took "best fit" to mean the line that minimizes the sum of the squares of the residuals
ei = yi - b0 - b1xi.
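As a quick illustration (a minimal sketch, not from the posted notes; the data values are made up), this computation in Python:

```python
# Minimal sketch: fitting y = b0 + b1*x by least squares using the
# closed-form solution of the normal equations.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # made-up sample data
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)
b0 = y.mean() - b1 * x.mean()

e = y - b0 - b1 * x                        # residuals ei
print(b0, b1, np.sum(e**2))                # intercept, slope, SSE
```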
Lecture 2: Explained how to determine the best (in the least squares sense) hyperplane y = b0 + b1x1 + ... + bp-1xp-1 fitting data points (yi, xi,1, ..., xi,p-1) in Rp for i = 1, 2, ..., n.
Also explained how to determine the best (in the least squares sense) polynomial of degree at most d that fits data points in R2.
Introduced the coefficient of determination R2 = 1 - SSE/SST, the proportion of the total variation explained by the fit.
Next lecture we'll show that 0 <= R2 <= 1. We'll also give the formula for R2 in matrix notation.
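For illustration, here is a minimal Python sketch (made-up data, using numpy's polynomial fitting for the least squares step) of the degree-d fit and the coefficient of determination:

```python
# Minimal sketch: least squares fit of a degree-d polynomial to points
# in R2, followed by the coefficient of determination R2.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # made-up data
y = np.array([1.2, 1.9, 4.8, 9.6, 17.1])

d = 2
coeffs = np.polyfit(x, y, d)               # least squares polynomial fit
yhat = np.polyval(coeffs, x)               # fitted values

sse = np.sum((y - yhat)**2)                # residual sum of squares
sst = np.sum((y - y.mean())**2)            # total sum of squares
r2 = 1 - sse / sst                         # coefficient of determination
print(coeffs, r2)
```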
Lecture 3: Proved that the coefficient of determination R2 satisfies 0 <= R2 <= 1 for p=2.
Then gave the matrix notation for computing R2 for p >= 2 and stated, without proof, that for p >= 2 again 0 <= R2 <= 1. One often says that "100R2 percent of the variation is explained by considering the p-1 independent variables".
Gave a method, involving the F-distribution, for choosing between the two hypotheses
C1: βi = 0 for all i = 1, ..., p-1.
C2: βi is non-zero for at least one i.
In Lecture 3:
on page 4 I give the correct formula SSR = BtXtY - n ybar2.
on page 5 I give the correct formula MSR = SSR/(p-1).
But on page 6 I make a silly slip and write MSR = (YtY - BtXtY)/(p-1) instead of MSR = (BtXtY - n ybar2)/(p-1).
However, on page 6 the value MSR = 26922 is copied from the book and so should be correct.
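For reference, a minimal Python sketch (made-up data; B, X, Y as in the lecture notation) of these matrix formulas and the resulting F statistic:

```python
# Minimal sketch of the F-test quantities in matrix notation.
import numpy as np

X = np.array([[1, 2.0], [1, 3.0], [1, 5.0], [1, 7.0]])  # made-up design matrix
Y = np.array([3.1, 4.0, 6.2, 8.1])                       # made-up responses
n, p = X.shape

B = np.linalg.lstsq(X, Y, rcond=None)[0]   # least squares coefficients
ybar = Y.mean()

SSR = B @ X.T @ Y - n * ybar**2            # regression sum of squares
SSE = Y @ Y - B @ X.T @ Y                  # error sum of squares
MSR = SSR / (p - 1)
MSE = SSE / (n - p)
F = MSR / MSE                              # compare with an F(p-1, n-p) quantile
print(F)
```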
Lecture 4: Explained how to obtain Bonferroni simultaneous confidence intervals for q of the coefficients βk in the regression model.
Then started to talk about Principal Component Analysis. (The second homework will use this technique to analyse vectors v1, ..., vn representing n digital images of faces.)
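A minimal sketch of the Bonferroni idea (the estimate and standard error below are hypothetical): each of the q intervals is widened by using the t-quantile at level alpha/(2q), so that all q hold simultaneously with confidence at least 1 - alpha.

```python
# Minimal sketch: Bonferroni simultaneous confidence intervals.
from scipy import stats

alpha, q = 0.05, 3        # overall level, number of coefficients
n, p = 30, 4              # made-up sample size and number of parameters
t_crit = stats.t.ppf(1 - alpha / (2 * q), df=n - p)

# Each interval is  b_k +/- t_crit * se(b_k)  for the q chosen coefficients.
b_k, se_k = 1.8, 0.4      # hypothetical estimate and standard error
print(b_k - t_crit * se_k, b_k + t_crit * se_k)
```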
Lecture 5: Explained that the aim of Principal Component Analysis is to find an orthogonal matrix A for which ACAt is diagonal, where C is the covariance matrix of your data points v1, ..., vn in Rp.
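A minimal Python sketch of this (made-up data): compute C, diagonalize it with numpy's symmetric eigensolver, and check that ACAt is diagonal.

```python
# Minimal PCA sketch: diagonalize the covariance matrix C of data points
# v1, ..., vn in R^p with an orthogonal matrix A.
import numpy as np

rng = np.random.default_rng(0)
V = rng.normal(size=(100, 3))              # 100 made-up points in R^3
V[:, 1] += 2 * V[:, 0]                     # introduce some correlation

C = np.cov(V, rowvar=False)                # p x p covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)       # C is symmetric, so eigh applies
A = eigvecs.T                              # rows of A = principal directions

D = A @ C @ A.T                            # diagonal up to rounding error
scores = (V - V.mean(axis=0)) @ A.T        # coordinates w.r.t. the components
print(np.round(D, 6))
```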
Lecture 6: Gave an example of PCA in gait analysis.
Then started towards a proof of the Spectral Theorem.
Lecture 7: Proved the Spectral Theorem: any real symmetric nxn matrix has n linearly independent eigenvectors, and these can be chosen to be orthonormal.
This is the basis of Principal Component Analysis.
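In symbols, the statement being used is the following standard formulation (added here for reference):

```latex
% Spectral Theorem (standard statement, for reference)
\textbf{Theorem.} Let $C$ be a real symmetric $n \times n$ matrix. Then there
exists an orthogonal matrix $A$ (i.e. $A A^{t} = I$) such that
\[
  A\,C\,A^{t} = \operatorname{diag}(\lambda_1, \dots, \lambda_n),
\]
where $\lambda_1, \dots, \lambda_n$ are the (real) eigenvalues of $C$ and the
rows of $A$ form an orthonormal basis of eigenvectors of $C$.
```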
Lecture 8: Gave an introduction to hierarchical cluster analysis, dendrograms, and barcodes. Illustrated these notions through two examples: 1) clustered five objects for which pairwise distances were given; 2) counted the number of objects in a digital photo (shown in the notes), and counted the number of these objects that have holes in them.
During the lecture I illustrated dendrograms and barcodes using the GAP software system for computational algebra. The code for reproducing the examples is here.
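For those working in Python rather than GAP, a rough analogue of the first example (made-up distance matrix) using scipy:

```python
# Illustrative sketch: single-linkage hierarchical clustering of five
# objects from a matrix of pairwise distances, drawn as a dendrogram.
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform
import matplotlib.pyplot as plt

# Made-up symmetric distance matrix for five objects.
D = np.array([[0, 2, 6, 10, 9],
              [2, 0, 5, 9, 8],
              [6, 5, 0, 4, 5],
              [10, 9, 4, 0, 3],
              [9, 8, 5, 3, 0]], dtype=float)

Z = linkage(squareform(D), method='single')  # single-linkage merges
dendrogram(Z, labels=['a', 'b', 'c', 'd', 'e'])
plt.show()
```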
Lecture 9: Described the Smith-Waterman algorithm for measuring the similarity s(V,W) between two sequences V, W of letters. It finds a maximal-scoring local alignment and returns the score.
By scaling, we can assume that 0 <= s(V,W) <= 1 on any finite set of data. We can then define the dissimilarity measure d(V,W) = 1 - s(V,W).
In general we do not suppose that d(V,W) satisfies all three axioms of a metric, but when it does we should expect it to be more easily interpreted.
Showed the Clustal Omega online resource for determining the (dis)similarity of genetic sequences, and for returning the output in the form of a dendrogram (or phylogenetic tree).
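A minimal Python sketch of Smith-Waterman (the scoring scheme below, match +2, mismatch -1, gap -1, is an assumption for illustration):

```python
# Minimal Smith-Waterman sketch: dynamic programming table H, where
# H[i][j] is the best score of a local alignment ending at V[i-1], W[j-1].
def smith_waterman(V, W, match=2, mismatch=-1, gap=-1):
    n, m = len(V), len(W)
    H = [[0] * (m + 1) for _ in range(n + 1)]
    best = 0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if V[i - 1] == W[j - 1] else mismatch
            H[i][j] = max(0,                      # restart: empty alignment
                          H[i - 1][j - 1] + sub,  # align V[i-1] with W[j-1]
                          H[i - 1][j] + gap,      # gap in W
                          H[i][j - 1] + gap)      # gap in V
            best = max(best, H[i][j])
    return best                                   # maximal local alignment score

print(smith_waterman("ACACACTA", "AGCACACA"))
```

Once the scores are scaled into [0, 1], the dissimilarities d(V,W) = 1 - s(V,W) can be fed into the clustering methods above.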
Lecture 10: Began by explaining how cluster analysis and barcodes can be used to investigate the geometric shape of a data set S in Rn. A computer demonstration was given for a subset of points S in R2 corresponding to a digital image of a starfish; the image and its corresponding barcode appear in the lecture notes.
Then described a single-linkage hierarchical clustering algorithm for producing a barcode from the matrix of distances/dissimilarities between n objects.
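A minimal Python sketch of the degree-0 barcode obtained from single-linkage clustering (same made-up distance matrix as above): each merge in the dendrogram kills one cluster, so n objects yield n-1 bars [0, merge height) plus one bar [0, infinity).

```python
# Illustrative sketch: H0 barcode from single-linkage clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

D = np.array([[0, 2, 6, 10, 9],
              [2, 0, 5, 9, 8],
              [6, 5, 0, 4, 5],
              [10, 9, 4, 0, 3],
              [9, 8, 5, 3, 0]], dtype=float)

Z = linkage(squareform(D), method='single')
# Column 2 of Z holds the merge heights; one cluster never dies.
bars = [(0.0, float(h)) for h in Z[:, 2]] + [(0.0, float('inf'))]
print(bars)   # the intervals of the degree-0 barcode
```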
Lecture 11: Introduced the nearest neighbour problem through two examples, then considered a two-step strategy for an efficient solution. Part of this strategy involves Voronoi tessellations of Euclidean space, so the lecture ended with a discussion of Voronoi tessellations and Voronoi regions (with 2-dimensional illustrations).
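A minimal Python sketch (made-up points; scipy's cKDTree and Voronoi are used here as stand-ins for the structures discussed in the lecture):

```python
# Illustrative sketch: nearest neighbour queries in R^2 with a k-d tree,
# plus the Voronoi diagram of the same points.
import numpy as np
from scipy.spatial import cKDTree, Voronoi

rng = np.random.default_rng(1)
points = rng.random((20, 2))           # 20 made-up sites in the unit square

tree = cKDTree(points)                 # preprocess the sites once ...
dist, idx = tree.query([0.5, 0.5])     # ... then answer queries quickly
print(idx, dist)                       # nearest site and its distance

vor = Voronoi(points)                  # each site's Voronoi region is the set
print(len(vor.regions))                # of queries whose nearest site it is
```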