DiffTune Revisited: A Simple Baseline for Evaluating Learned llvm-mca Parameters

Published: 30 May 2022, Last Modified: 05 May 2023 · MLArchSys 2022
Abstract: Recent work introduced DiffTune, a neural network-based technique to automatically learn the microarchitecture-specific parameters of basic block CPU simulators. The authors apply their approach to the llvm-mca simulator. They show that the learned parameter values achieve an accuracy on the BHive benchmark suite that is comparable to, and in some cases even better than, that of the original, expert-provided llvm-mca parameter values. In this paper, we show that an accuracy in this range is actually trivial to achieve: We propose a simple set of parameter values that outperforms the values learned by DiffTune. In fact, our set of parameter values is so simple that it can be fully described within this abstract: We set the dispatch width to 4, the reorder buffer size to 100, the latencies and μop counts of all instructions to 1, and all other parameters to 0. These parameter values lead to more accurate predictions than DiffTune's values on all four microarchitectures considered in the DiffTune paper. We then develop a simple learning algorithm for the llvm-mca parameters. We show that the parameter values learned by our algorithm lead to an average error on the BHive benchmark suite that is between 29% and 47% lower than the error obtained with DiffTune's values.
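For concreteness, the following is a minimal sketch (in Python, and not code from the paper) of the trivial baseline parameter table the abstract describes. The names dispatch_width, reorder_buffer_size, latency, and num_uops are illustrative placeholders, not llvm-mca's actual internal parameter identifiers.

```python
def baseline_parameters(instructions):
    """Build the trivial baseline parameter table from the abstract.

    Hypothetical parameter names; only the values (4, 100, 1, 0)
    come from the paper's abstract.
    """
    params = {
        "dispatch_width": 4,          # global dispatch width
        "reorder_buffer_size": 100,   # global reorder buffer size
    }
    for inst in instructions:
        params[inst] = {
            "latency": 1,    # every instruction latency set to 1
            "num_uops": 1,   # every instruction's μop count set to 1
            # all remaining per-instruction parameters are set to 0
        }
    return params


# Example usage with a toy instruction list:
print(baseline_parameters(["ADD64rr", "IMUL64rr", "MOV64rm"]))
```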