MSc Thesis presentation of Mr. Kostas Mparmparousis, Wednesday, October 22, 2025

On Wednesday, October 22, 2025, at 14:00, Mr. Kostas Mparmparousis of the
postgraduate program “Data Science and Information Technologies”, track
“Big Data and Artificial Intelligence”, will present his MSc thesis,
titled “What Drives Learned Optimizer Performance? A Systematic
Evaluation”.

Abstract
Classic query optimization is a well-defined and remarkably sophisticated
process, a cornerstone of database management systems, infused with
decades of empirical knowledge and ingenuity. As traditional optimizers
inevitably begin to pay the toll of the few but impactful limitations in
their design in the modern age of data processing, a new breed of
Learned Query Optimizers (LQOs) promises to overcome those shortcomings and
revolutionize the query execution landscape. While research in this field
continues to flourish, a framework that places such systems under the
microscope and dissects them becomes a top priority, so that the
research community knows which factors drive their performance
and how to realize their immense potential.

In this thesis, we propose a systematic evaluation framework centered on five
core dimensions: performance, robustness, learning procedure,
internal decision-making, and generalization. We then apply this
framework to compare the classic optimizer against
five of the most prominent LQO implementations. Our findings show that a)
in terms of outright performance, LQOs find the largest windows of
opportunity in settings where the classic optimizer is destined to fail,
b) regression-based LQOs are extremely sensitive to workload
ordering and maximize their effectiveness when exposed to an ascending-complexity
curriculum, c) how training progress is tracked has an
immediate effect on model convergence speed and stability, d) LQO success
stems directly from the model’s architecture and how its embedding space
is structured, and finally e) the classic optimizer remains the go-to
solution for almost anything generalization-related.

EXAMINATION COMMITTEE:
Dr. Georgia Koutrika, Research Director, Information Management Systems
Institute, Athena Research Center
Prof. Nikos Mamoulis, Department of Computer Science and Engineering,
University of Ioannina
Assistant Prof. Vasilis Efthymiou, Department of Informatics and
Telematics, Harokopio University of Athens

22 October 2025
14:00 – 15:30 (GTB)
Join Microsoft Teams Meeting
https://teams.microsoft.com/l/meetup-join/19%3ameeting_OWI5NDQzODEtODNmYS00ZTkyLWI2NzktYjE0Y2U2NDRhNzI5%40thread.v2/0?context=%7b%22Tid%22%3a%226ae07702-c5f7-4f38-9b87-acad62a75d93%22%2c%22Oid%22%3a%22a532b5cb-e712-4935-a736-a15e0a5058ef%22%7d
Meeting ID: 314 577 906 602 1
Passcode: 3ae7HZ37
