About Me
Motivated by research that makes a difference.
I am currently a graduate mathematical sciences student at the University of Oxford. I am particularly interested in how a combination of mathematical, statistical and computational techniques can be employed to solve the many novel problems arising in biological systems.
My research and career ambition is to employ such methods to draw insightful inferences and enable informed decisions on meaningful issues. Such applications range from developments in the understanding and treatment of disease to explaining how collective complexity can arise from relatively simple interactions between individual biological entities.
For more information please consult my CV or contact me directly.
Education
University of Oxford
MSc Mathematical Sciences
2020 - 2022
Information on the course can be found here.
- I am a graduate student at Exeter College.
- I am currently studying towards a master's in mathematical sciences, specialising in modules on biological systems and models of continuous media.
- Modules: Mathematical Physiology, Stochastic Models In Mathematical Genetics, Mathematical Mechanical Biology, Topics in Computational Biology, Solid Mechanics, Elasticity & Plasticity, Mathematical Geoscience.
- I am currently writing my dissertation, titled 'The Geometry & Mechanics of Seashells', under the supervision of Prof. Derek Moulton.
- I have been awarded the Roche UNIQ+ Graduate Scholarship; Roche is a world leader in the pharmaceutical industry.
University of Bath
BSc Mathematics
First-Class Honours
2017 - 2020
My official transcript can be found here.
- I studied for my undergraduate degree in mathematics at the University of Bath, specialising in applied mathematics, and graduated with First-Class Honours.
- During my time at Bath I was a part of the Gold Scholarship Programme.
Work Experience
Digitally Transforming Farming
The farming industry has recently transitioned into the era of precision agriculture. With the availability of data skyrocketing in recent years,
the industry has been hit by a wave of interest in the use cases that machine learning and computer vision techniques can be applied to.
Seasonal variation has historically made farming an incredibly volatile market - machine learning now allows predictions to be made that help farmers maximise their potential yields. By feeding in a constant stream of data giving insight on features ranging from soil pH to precipitation volumes, increases of as much as 300% in a farmer’s profits have been reported.
Project Outline:
Our team was tasked with the problem of extracting the crop type contained within a specific field. We were given georeferenced satellite imagery of whole farms and access to
the company database containing vital information such as field boundary coordinates, crop types by field ID and planting dates. The problem was two-fold: a large part of the
process was to develop a data pre-processing pipeline that would mould the data into a suitable format for training our ML models. Once the data was ready for use, much of the work
went into model selection, building neural network architectures and fine-tuning various hyperparameters based on evaluations of model performance. Two methods were used:
a convolutional neural network was employed in an attempt to extract the most ‘natural’ features, and a time-series approach was taken with the idea of mapping spectral footprints
to unique crop types.
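As a rough illustration of the CNN route (a minimal sketch with assumed dimensions and class count, not the project's actual architecture), a small network classifying fixed-size four-band field images by crop type might look like:

```python
# Minimal sketch: a small CNN over 4-band (R, G, B, NIR) field images.
# NUM_CLASSES and IMG_SIZE are hypothetical; the real architecture and
# hyperparameters were tuned as described above.
from tensorflow.keras import layers, models

NUM_CLASSES = 5    # assumed number of crop types
IMG_SIZE = 64      # images padded/cropped to a standard size in pre-processing

model = models.Sequential([
    layers.Input(shape=(IMG_SIZE, IMG_SIZE, 4)),  # 4 spectral bands
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```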
Main Challenges:
Dealing with Big Data:
It was necessary to optimise the infrastructure of the data pre-processing pipeline using multi-processing/multi-threading, etc., due to the sheer volume of data that needed handling.
A single raw data instance contained pixel values for a whole image across 4 spectral bands, along with geospatial and semantic labelling. A substantial number of data points
(~1,000 per class) was required for the models to perform well, hence this was a key step in accelerating the flow of data.
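To give a flavour of what this parallelism looked like (a sketch under assumed names; `preprocess_field` stands in for the real pipeline stages):

```python
# Sketch: fan field IDs out across worker processes so that pre-processing
# keeps pace with the incoming data stream.
from multiprocessing import Pool

def preprocess_field(field_id):
    """Hypothetical worker: load the raster, clip to the field boundary,
    downsample and normalise, then write the result to disk."""
    ...  # the real stages are described in the pre-processing section below
    return field_id

if __name__ == "__main__":
    field_ids = range(1000)   # ~1,000 instances per class were needed
    with Pool(processes=8) as pool:
        for done in pool.imap_unordered(preprocess_field, field_ids):
            pass  # e.g. log progress
```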
Pre-Processing Challenges:
A lot of pre-processing was needed to ensure that the data was in a format compatible with our model frameworks. The data needed to be labelled, filtered and linked
accordingly throughout the company database. We then needed to clip out specific fields from whole-farm images using the georeferencing data. The images needed to be downsampled -
model performance was largely unaffected even by a relatively large amount of downsampling. New features, such as the Normalised Difference Vegetation Index (NDVI), a metric characterising
the amount of biomass present in an image, were calculated and found to be important. It was necessary to standardise the image arrays to aid model performance: images were padded/cropped
to standard dimensions and pixel intensities were normalised. A data augmentation process was implemented to produce artificial data points to bulk up classes with low data counts, and a principal
component analysis was conducted to reduce the dimensionality of the problem.
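Two of these steps are simple enough to sketch directly. The NDVI formula below is the standard one; the standardisation routine is an illustrative stand-in with an assumed target size:

```python
import numpy as np

def ndvi(nir, red, eps=1e-8):
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red),
    computed per pixel; higher values indicate more biomass."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

def standardise(img, size=64):
    """Crop/pad a 2-D band to `size` x `size` pixels and normalise
    intensities to [0, 1] (illustrative; `size` is an assumption)."""
    img = img[:size, :size]                       # crop if too large
    pad = ((0, size - img.shape[0]), (0, size - img.shape[1]))
    img = np.pad(img, pad, mode="constant")       # pad if too small
    return (img - img.min()) / (img.max() - img.min() + 1e-8)
```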
Project Presentations:
A presentation on the data processing pipeline can be found here (GIF animations require viewing in Adobe Acrobat Reader).
A presentation on our trained model’s performance (Optimised for viewing in Microsoft PowerPoint) can be found here.
Research Experience
An overview of the project and its current state can be found here.
Project Outline:
At a non-specialist level, lab-based microscopy involves the tedious task of focusing on the
prepared sample. This process often needs to be repeated many times and can become incredibly time-consuming.
Moreover, the human error involved in this task often results in a sub-optimal perception of absolute focus.
It is therefore a process that can be optimised with the aid of machine input.
Under the supervision of the Micron research group at the University of Oxford,
I worked on early-stage development of the MicroscoPi imaging platform.
The aim of the project was to produce a fully automated, robust, 3D-printed
microscopy platform. Specifically, my task was to develop and implement an autonomous
autofocus module that the user could run on any pre-prepared
sample slide, under any light source, to readily obtain a focused view of the sample.
The crux of the problem was the reformulation of focus into a computer vision context.
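In outline, the module reduces to a search over stage positions that maximises a focus score. The sketch below is hypothetical: `stage`, `camera` and their methods stand in for the platform's hardware interface.

```python
# Sketch of a coarse autofocus sweep: score each stage position with a
# focus metric and settle on the best one. All hardware names are assumed.
def autofocus(stage, camera, focus_metric, z_min, z_max, steps=20):
    best_z, best_score = z_min, float("-inf")
    step = (z_max - z_min) / steps
    for i in range(steps + 1):
        z = z_min + i * step
        stage.move_to(z)                        # hypothetical stage API
        score = focus_metric(camera.capture())  # hypothetical camera API
        if score > best_score:
            best_z, best_score = z, score
    stage.move_to(best_z)                       # return to the sharpest plane
    return best_z
```

In practice, a coarse sweep like this would be followed by finer refinement passes around the best position found.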
Main Challenges:
No Reference Image:
One thing that made this project unique was having no ‘reference image’ to compare
against, since the module needed to work on any given prepared sample, starting
from a completely unfocused image. Hence, a “no reference, pixel-based” (NRP) method,
as commonly described in the relevant literature, was the logical approach.
Quantifying Focus:
The main problem we needed to address was devising a metric to
quantify how far in or out of focus the sample is at each stage of the automated process. This
corresponded to characterising the image artefact of ‘blur’ at each stage of the focal refinement process. Two metrics
were developed, one using edge-feature analysis and the other using Fourier spectrum analysis.
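Illustrative versions of the two kinds of metric (standard no-reference focus measures of the type described, not the exact implementations developed in the project) might look like:

```python
import numpy as np
import cv2

def edge_focus(gray):
    """Edge-based metric: variance of the Laplacian of a grayscale image.
    In-focus images have strong edges, giving a large variance."""
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def fourier_focus(gray, radius=30):
    """Fourier-based metric: fraction of spectral energy outside a
    low-frequency disc. Blur suppresses high spatial frequencies,
    so this fraction falls as the image goes out of focus."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2
    cy, cx = power.shape[0] // 2, power.shape[1] // 2
    y, x = np.ogrid[:power.shape[0], :power.shape[1]]
    low = (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2
    return power[~low].sum() / power.sum()
```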
Project Outcomes:
Project Report: A report detailing the progress that I made can be found here.
Project Presentation: A presentation of my research (Optimised for viewing in Microsoft PowerPoint) can be found here.
Structural Stress Analysis & Turbulent Flow in OpenFOAM
https://www.bath.ac.uk/research-institutes/institute-for-mathematical-innovation/
June 2020 - present
Project Outline:
Turbulence is one of the most abundant, yet most poorly understood, phenomena in the physical world. Huge difficulty arises from the inherently chaotic fluctuations that
can be readily observed in even relatively simple flow scenarios. The most widely used analytical turbulence models are of Reynolds-averaged Navier-Stokes (RANS) type and decompose the flow
into a mean flow component and a stochastically fluctuating flow component. Averaging the resulting equations leads to a tractable model for the mean-field flow. However,
there is a problem of closure for the Reynolds stress term. Most commonly, the Boussinesq hypothesis is used to close the mean flow equations. This hypothesis rests on an energy
model that accounts for energy dissipating in a cascade down the length scales of eddies. It allows us to relate the Reynolds stress solely to the mean flow and
obtain a fully closed set of equations.
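In symbols, and quoting only the standard textbook form rather than any project-specific variant:

```latex
% Reynolds decomposition: split the velocity into mean and fluctuating parts.
% Averaging leaves the unclosed Reynolds stress -\overline{u_i' u_j'}; the
% Boussinesq hypothesis closes it with an eddy viscosity \nu_t and the
% turbulent kinetic energy k:
\[
  u_i = \bar{u}_i + u_i', \qquad
  -\,\overline{u_i' u_j'}
    = \nu_t \left( \frac{\partial \bar{u}_i}{\partial x_j}
                 + \frac{\partial \bar{u}_j}{\partial x_i} \right)
    - \frac{2}{3}\, k\, \delta_{ij}.
\]
```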
In most urban settings, models for the mean flow are usually sufficient - the effect of stochastic fluctuations is relatively mild. However, in more highly turbulent scenarios
(e.g. oceanic models), these models become insufficient.
In this project, I explore these techniques and apply them in a computational fluid dynamics (CFD) framework.
The project proposal involved working alongside a structural engineering company, using OpenFOAM CFD turbulence modelling to conduct stress analysis on the boundaries of a given house structure, subject to a widely accepted model for the flow profile in the atmospheric boundary layer. I also decided to conduct some pedestrian-level analysis, looking at how the wind flow is distorted by the presence of the house structure.
Main Challenges:
Computational Demand:
One of the biggest challenges in CFD is the large computational demand associated with running a simulation on a sufficiently refined computational mesh. Whilst a finely
refined mesh is desirable to allow for adequate ‘blending’ to occur through the viscous sublayer that forms close to solid boundaries, a trade-off can be made by reducing the
mesh resolution and using interpolating ‘wall functions’, which are widely studied and proposed in the literature and are based on idealised experiments. This was a particularly
necessary approach, since I didn’t have access to a powerful machine.
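The classic example underlying such wall functions is the logarithmic law of the wall, quoted here in its generic form (OpenFOAM's implementations add refinements on top of this):

```latex
% Log law of the wall: u^+ = u / u_\tau and y^+ = y u_\tau / \nu are the
% velocity and wall distance in viscous units, \kappa \approx 0.41 is the
% von Karman constant and B \approx 5.2 for smooth walls.
\[
  u^+ = \frac{1}{\kappa} \ln y^+ + B
\]
```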
Numerical Discretisation:
It was important to develop an understanding of how to translate the theory of fluid dynamics into a numerical framework. This included making decisions on the
most suitable finite volume schemes (FVS) to use for each operator and on how to discretise the physical quantities: space, time, energy & angle. It was also necessary
to understand exactly how the OpenFOAM software operates - for example, how each field is mapped through the geometry of successive cells in the mesh.
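The core idea of the finite volume method, in its generic form, is to integrate each operator over a cell and convert divergences into sums of face fluxes; the choice of interpolation for the face values is precisely the 'scheme' decision mentioned above:

```latex
% Finite volume discretisation of a convection term over a cell V with
% faces f: the divergence theorem turns the volume integral into a sum of
% face fluxes, where \phi_f is the face-interpolated field value.
\[
  \int_V \nabla \cdot (\mathbf{u}\,\phi)\, \mathrm{d}V
  = \oint_{\partial V} \phi\, \mathbf{u} \cdot \mathrm{d}\mathbf{S}
  \approx \sum_f \phi_f\, F_f,
  \qquad F_f = \mathbf{u}_f \cdot \mathbf{S}_f.
\]
```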
Project Outcomes:
Project Presentation: A presentation of my project can be found here.
The Magneto-Active Elastica - Coupling Classical Theories of Elasticity & Magnetism
December 2021 - April 2022
Project Outline:
Traditionally, the classical theories of elasticity and magnetism are studied in isolation. However, a recent emergence of exciting engineering applications has prompted the need
for a unified theory of the magneto-active elastica. An example application is non-invasive surgery and drug-delivery systems: in this context, microscale, hard-magnetic
‘robots’, which cannot feasibly be motorised at the associated length scale, can be locomoted deterministically via an externally applied magnetic field.
The aim of this dissertation is two-fold. The first is to unify the theory of elastic rods with magnetism. This is done by considering suitable conservation/balance laws and energy-minimisation principles which, upon applying constitutive relationships specific to the material, result in a closed, non-linear differential system. The second, once the governing equations have been set out, is to solve for the steady-state deflections under various classes of applied magnetic field, which in principle can be mapped onto a target location in space when viewed as an inverse problem.
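To give a concrete flavour of the kind of system that results, here is a minimal planar example under strong assumptions (uniform magnetisation M along the rod axis, cross-sectional area A, bending stiffness EI, and a constant applied field of magnitude B at angle \psi); this is the standard pendulum-type form found in the magneto-elastica literature, not the full model developed in the dissertation:

```latex
% Moment balance for a planar, hard-magnetic elastica clamped at s = 0 and
% free at s = L, written for the tangent angle \theta(s):
\[
  EI\,\theta''(s) + A M B \sin\!\big(\psi - \theta(s)\big) = 0,
  \qquad \theta(0) = 0, \quad \theta'(L) = 0.
\]
```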