Workshop (International) ChimeraTL: Transfer Learning in DBMS with Fewer Samples

Tatsuhiro Nakamori (Keio University), Shohei Matsuura, Takashi Miyazaki, Sho Nakazono, Taiki Sato, Takashi Hoshino (Cybozu Labs), Hideyuki Kawashima (Keio University)

3rd International Workshop on Databases and Machine Learning in Conjunction with ICDE 2024 (DBML 2024)


In the field of database management systems (DBMS), it is essential to build a performance prediction model with little data from the target environment, which motivates the application of transfer learning. While some parameters in a DBMS have similar effects on performance across different hardware environments, others can have varying effects depending on underlying hardware limitations. Previous studies do not leverage this distinction to improve transfer learning. We propose ChimeraTL, a novel method that accounts for these different parameter types to enhance transfer learning. Our experiments demonstrate that ChimeraTL needs only 50% of the samples that state-of-the-art methods require to reduce the prediction error to under 10%.

Paper: ChimeraTL: Transfer Learning in DBMS with Fewer Samples (external site)