Volume 18, No. 5
E2ETune: End-to-End Knob Tuning via Fine-tuned Generative Language Model
Abstract
Database knob tuning is a significant challenge for database administrators, as it involves tuning a large number of configuration knobs with continuous or discrete values to achieve optimal database performance. Traditional methods, such as manual tuning or learning-based approaches, typically require numerous workload replays and are both time-consuming and resource-intensive. To address this challenge, we introduce E2ETune, an end-to-end knob tuner powered by a fine-tuned generative language model. The key idea is to leverage the exceptional sequence-to-sequence modeling capabilities of generative language models to capture the complex mapping between workloads (inputs) and their corresponding promising configurations (outputs). To achieve this goal, we propose a novel data generation framework to efficiently produce a large amount of training data, where each data sample consists of a workload and its promising configuration. Then, these data are used to fine-tune a generative language model, yielding an end-to-end knob tuner. This tuner offers out-of-the-box configuration recommendations for new workloads. We conduct extensive experiments to evaluate E2ETune's efficiency and effectiveness using 10 representative and 3 real-world benchmarks. Compared to state-of-the-art methods, E2ETune can identify competitive configurations in significantly less time.
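To make the idea concrete, the sketch below (not the authors' released code) shows how knob tuning can be framed as sequence-to-sequence fine-tuning: a workload is serialized into input text and its promising configuration into target text, and a language model is fine-tuned on such pairs. It assumes a T5-style model from HuggingFace Transformers for brevity; the model name, the serialize_workload and serialize_config helpers, and the workload features and knob names are illustrative placeholders, not the paper's actual format.

# Minimal sketch: knob tuning as seq2seq fine-tuning (assumptions noted above).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
import torch

MODEL_NAME = "t5-small"  # assumption: any seq2seq LM suffices for the sketch

def serialize_workload(workload: dict) -> str:
    """Flatten workload statistics into the model's input text."""
    parts = [f"{k}={v}" for k, v in sorted(workload.items())]
    return "recommend knobs: " + " ".join(parts)

def serialize_config(config: dict) -> str:
    """Flatten a knob configuration into the model's target text."""
    return " ".join(f"{k}={v}" for k, v in sorted(config.items()))

# One hypothetical (workload, promising configuration) training pair.
workload = {"read_ratio": 0.8, "table_size_mb": 4096, "threads": 32}
config = {"shared_buffers_mb": 2048, "work_mem_mb": 64, "max_parallel_workers": 8}

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

inputs = tokenizer(serialize_workload(workload), return_tensors="pt")
labels = tokenizer(serialize_config(config), return_tensors="pt").input_ids

# A single fine-tuning step; a real run would iterate over the generated corpus.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()

# Inference: the fine-tuned model emits a configuration for an unseen workload.
new_workload = {"read_ratio": 0.2, "table_size_mb": 1024, "threads": 8}
new_inputs = tokenizer(serialize_workload(new_workload), return_tensors="pt")
generated = model.generate(**new_inputs, max_new_tokens=64)
print(tokenizer.decode(generated[0], skip_special_tokens=True))

At inference time no workload replay is needed: the trained model maps the serialized workload directly to a candidate configuration, which is what makes the tuner end-to-end.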