Begoña García Malaxechebarría

/beh-goh-nyah/

I am a fourth-year Ph.D. student in the Department of Mathematics at the University of Washington, co-advised by Dmitriy Drusvyatskiy and Courtney Paquette, and a Research Assistant at the Institute for Foundations of Data Science (IFDS). I completed my undergraduate degree in Mathematics at Universidad de Murcia in 2021, supervised by Ángel Ferrández.

My research focuses on the mathematical foundations and optimization of machine learning, with particular emphasis on the analysis and scaling limits of stochastic algorithms for large-scale problems in data science. I apply techniques from convex and non-smooth analysis, high-dimensional probability, and statistics.


CONFERENCE PROCEEDINGS


The High Line: Exact Risk and Learning Rate Curves of Stochastic Adaptive Learning Rate Algorithms

Proceedings of NeurIPS, 2024    pdf    code   Abstract

Elizabeth Collins-Woodfin, Inbar Seroussi, Begoña García Malaxechebarría,
Andrew W. Mackenzie, Elliot Paquette, Courtney Paquette

We develop a framework for analyzing the training and learning rate dynamics on a large class of high-dimensional optimization problems, which we call the high line, trained using one-pass stochastic gradient descent (SGD) with adaptive learning rates. We give exact expressions for the risk and learning rate curves in terms of a deterministic solution to a system of ODEs. We then investigate in detail two adaptive learning rates, an idealized exact line search and AdaGrad-Norm, on the least squares problem. When the data covariance matrix has strictly positive eigenvalues, this idealized exact line search strategy can exhibit arbitrarily slower convergence when compared to the optimal fixed learning rate with SGD. Moreover, we exactly characterize the limiting learning rate (as time goes to infinity) for line search in the setting where the data covariance has only two distinct eigenvalues. For noiseless targets, we further demonstrate that the AdaGrad-Norm learning rate converges to a deterministic constant inversely proportional to the average eigenvalue of the data covariance matrix, and identify a phase transition when the covariance density of eigenvalues follows a power law distribution. We provide our code for evaluation at https://github.com/amackenzie1/highline2024.
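As a rough illustration of the setting the abstract describes (not the paper's implementation, which lives in the repository above), here is a minimal numpy sketch of one-pass SGD with an AdaGrad-Norm learning rate on a noiseless least-squares problem. The dimension, spectrum, and hyperparameters below are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not from the paper)
d, n_steps = 50, 2000
eigs = np.linspace(0.5, 2.0, d)               # strictly positive data covariance spectrum
x_star = rng.standard_normal(d) / np.sqrt(d)  # ground-truth signal (noiseless target)

x = np.zeros(d)
b, eta = 1.0, 1.0                             # AdaGrad-Norm accumulator and scale (hypothetical)
risks, lrs = [], []

for _ in range(n_steps):
    a = rng.standard_normal(d) * np.sqrt(eigs)  # fresh sample each step: one-pass SGD
    grad = (a @ x - a @ x_star) * a             # stochastic gradient of 0.5*(a.x - a.x*)^2
    b += grad @ grad                            # accumulate squared gradient norms
    lr = eta / np.sqrt(b)                       # AdaGrad-Norm learning rate
    x -= lr * grad
    risks.append(0.5 * np.sum(eigs * (x - x_star) ** 2))  # population risk
    lrs.append(lr)
```

In the noiseless setting the gradients shrink as the risk decays, so the accumulator `b` stabilizes and the learning rate curve flattens toward a constant, consistent with the limiting behavior the abstract describes.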

JOURNAL PUBLICATIONS

Fully-Diverse Lattices from Ramified Cyclic Extensions of Prime Degree

Int. J. Appl. Math. 33(6): 1009-1015, 2020    pdf   Abstract

J. Carmelo Interlando, Antonio A. Andrade, Begoña García Malaxechebarría, Agnaldo J. Ferrari, Robson R. de Araújo

Let \(p\) be an odd prime. Algebraic lattices of full diversity in dimension \(p\) are obtained from ramified cyclic extensions of degree \(p\). The \(3\), \(5\), and \(7\)-dimensional lattices are optimal with respect to sphere packing density and therefore are isometric to laminated lattices in those dimensions.

THESES

A separable Banach lattice that contains isometric copies of all others

Undergraduate Thesis, Universidad de Murcia, 2021    pdf

Begoña García Malaxechebarría, supervised by Antonio Avilés López and José David Rodríguez Abellán; awarded an honors distinction


TALKS

Ph.D. General Examination

University of Washington (UW)
November 2024

IFDS Weekly Seminar Series

University of Washington (UW)
November 2024

Graduate Student Analysis Seminar

University of Washington (UW)
October 2024

West Coast Optimization Meeting

University of British Columbia (UBC)
September 2024

CRM-PIMS Summer School in Probability

Université de Montréal (UdeM)
July 2024

PROJECT PORTFOLIO


MonetGAN. Painting with Generative Adversarial Networks: Generating Monet-Style Images Using Novel Techniques

2023    pdf    code   Abstract

Garrett Devereux, Deekshita S. Doli, Begoña García Malaxechebarría

This project was developed as part of the graduate deep learning class at
University of Washington, instructed by Ranjay Krishna and Aditya Kusupati.

Generative Adversarial Networks (GANs) have emerged as a powerful and versatile tool for generative modeling. In recent years, they have gained significant popularity in various research domains, particularly in image generation, enabling researchers to address challenging problems by generating realistic samples from complex data distributions. In the art industry, where tasks often require significant investments of time, labor, and creativity, there is a growing need for more efficient approaches that can streamline subsequent project phases. To this end, we present MonetGAN, a pioneering framework based on the Least Squares Deep Convolutional CycleGAN, specifically tailored to generate images in the distinctive style of the renowned 19th-century painter Claude Monet. After achieving non-trivial results with our baseline model, we explore the integration of advanced techniques and architectures such as ResNet Generators, a Progressive Growth Mechanism, Differential Augmentation, and Dual-Objective Discriminators. Through our research, we find that gradually building up a successful model through small, incremental changes yields better results than introducing many varying complexities all at once. These outcomes underscore the intricacies involved in the training of GANs, while also opening up promising avenues for further advancements at the intersection of art and artificial intelligence.
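For context on the "Least Squares CycleGAN" framework the abstract names, here is a minimal numpy sketch of its two core loss terms: the least-squares adversarial objectives (LSGAN) and the CycleGAN cycle-consistency penalty. The toy scores and the weighting `lam` are illustrative assumptions; the project's actual models are convolutional networks.

```python
import numpy as np

def lsgan_losses(d_real, d_fake):
    # Least-squares GAN objective: the discriminator pushes real scores
    # toward 1 and fake scores toward 0; the generator pushes fake scores toward 1.
    d_loss = 0.5 * np.mean((d_real - 1.0) ** 2) + 0.5 * np.mean(d_fake ** 2)
    g_loss = 0.5 * np.mean((d_fake - 1.0) ** 2)
    return d_loss, g_loss

def cycle_consistency(x, x_reconstructed, lam=10.0):
    # CycleGAN L1 cycle loss: translating photo -> Monet -> photo should
    # approximately recover the input; lam = 10 is a typical weight (assumption).
    return lam * np.mean(np.abs(x - x_reconstructed))

# Toy discriminator scores standing in for network outputs
d_loss, g_loss = lsgan_losses(np.array([0.9, 0.8]), np.array([0.2, 0.1]))
```

Replacing the usual cross-entropy GAN loss with this least-squares form penalizes samples by their distance from the decision boundary, which tends to stabilize training, one reason the project adopts it as a baseline.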

TEACHING

Teaching Assistant, University of Washington

MATH 112: Application of Calculus to Business and Economics
MATH 120: Precalculus
MATH 124: Calculus with Analytic Geometry I
Fall 2021 - Spring 2023

Washington Directed Reading Program Mentor, University of Washington

Directed an undergraduate’s independent research addressing physical problems involving ODEs
Winter 2023

Math Tutor, Self-Employed

Tutored students on projects, assignments, and university entrance exam preparation
Summer 2019 - Spring 2021

Socio-Educational Volunteer, Fundación FADE

Implemented educational support services in English, Spanish, and math aimed at at-risk youth
Fall 2015 - Spring 2018

AWARDS

Top Scholar Award, University of Washington

Awarded for a strong academic record and close fit to the department; exempts the recipient from teaching responsibilities for one quarter
Fall 2021

Graduated with 21 honor distinctions, Universidad de Murcia

Academic award granted to the top 5% of students in each course, entailing a tuition waiver for the following year
Spring 2021

Collaborative Research Fellowship, Ministry of Education (Spain)

Supported my undergraduate research on "Geometric Lattices" with the Differential and Convex Geometry research group of Universidad de Murcia, deemed an "Excellency Research Group" by Murcia's regional government
Fall 2020

Dean's List, San Diego State University

Academic recognition for good academic standing and a term grade point average of at least 3.5 in a minimum of 12 graded units
Fall 2019 - Spring 2020

Fellowship for Talented Students, Ministry of Education (Spain)

Awarded to high-achieving students in recognition of outstanding academic accomplishments
Fall 2017

Institute for Foundations of Data Science

The Institute for Foundations of Data Science (IFDS) addresses critical challenges in data science through collaboration among researchers from the Universities of Washington, Wisconsin-Madison, California-Santa Cruz, and Chicago. By engaging in research and community activities, the IFDS aims to enhance data science methodologies while promoting equity and inclusion.

Machine Learning and Optimization (ML-OPT): IFDS Weekly Seminar Series at University of Washington

The IFDS Weekly Seminar is currently co-organized by Begoña García Malaxechebarría and Avinandan Bose, and coordinated by Maryam Fazel. It provides an engaging venue for students and postdocs to present their recent research and work in progress. The seminar takes place weekly, both in person and remotely. Below, you can check the Google Calendar for the schedule; you can also join our mailing list for updates (UW login required).