Performance matters, but achieving reproducible performance measurements is harder than it first appears, especially in cloud environments where hardware variability is common. Modern systems introduce unpredictable factors that make traditional benchmarking noisy and unreliable. In this talk, you’ll learn about Paired Benchmarking: a practical method for measuring two algorithms side by side, under identical conditions, to get clearer, more trustworthy results.
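To give a flavor of the idea, here is a minimal illustrative sketch (not the speaker's exact method or tooling): two candidates are timed back-to-back within each iteration, in alternating order, and the per-pair runtime differences are summarized. Noise that affects both runs in a pair, such as frequency scaling or a noisy neighbor, largely cancels in the difference.

```python
import time
import statistics


def _time_once(f):
    """Time a single call to f in seconds."""
    start = time.perf_counter()
    f()
    return time.perf_counter() - start


def paired_benchmark(a, b, iterations=100):
    """Return the median per-pair runtime difference (a minus b), in seconds.

    Each iteration runs both candidates back-to-back, alternating which
    goes first to avoid systematic ordering bias. A positive result
    suggests `a` is slower than `b`.
    """
    diffs = []
    for i in range(iterations):
        if i % 2 == 0:
            ta = _time_once(a)
            tb = _time_once(b)
        else:
            tb = _time_once(b)
            ta = _time_once(a)
        diffs.append(ta - tb)
    return statistics.median(diffs)
```

Because each pair is measured under nearly the same momentary conditions, the difference is a far more stable statistic than two independently measured absolute runtimes.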
Speaker

Denis Bazhenov
Denis Bazhenov has more than 15 years of experience designing and building distributed systems in multiple languages, including Rust. He developed and scaled an e-commerce search platform that grew by a factor of 500 over ten years, balancing user experience demands with hardware cost constraints. His work emphasizes system design, mechanical sympathy, performance optimization, and software testability, with a deep interest in benchmarking techniques. Outside of work, Denis enjoys skiing, snowboarding, kitesurfing, and reading.