Revisiting Neural Program Smoothing for Fuzzing

Dec 5, 2023

About

Testing with randomly generated inputs (fuzzing) has gained significant traction due to its capacity to expose program vulnerabilities automatically. Fuzz testing campaigns generate large amounts of data, making them ideal for the application of machine learning (ML). Neural program smoothing, a specific family of ML-guided fuzzers, aims to use a neural network as a smooth approximation of the software under test for new test case generation. In this paper, we conduct the most extensive benchmark of neural program smoothing (NPS) fuzzers against standard gray-box fuzzers (11 CPU years and >5.5 GPU years), and make the following contributions: (1) We find that the original performance claims for NPS fuzzers do not hold, and proceed to investigate the reasons why; we uncover and elucidate fundamental, implementation, and experimental limitations of prior works. (2) We contribute the first in-depth analysis of the contribution of machine learning and gradient-based mutations in NPS. (3) As we demonstrate in a prototype called Neuzz++, addressing the practical limitations of NPS fuzzers improves performance, but standard gray-box fuzzers almost always surpass NPS-based fuzzers. (4) As a consequence, we propose new guidelines targeted at benchmarking fuzzing based on machine learning, and present a platform, MLFuzz, with GPU access for easy and reproducible evaluation of ML-based fuzzers. Neuzz++, MLFuzz, and all our data are available as open source.
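To make the core idea concrete: NPS fuzzers train a neural network to predict which coverage edges an input exercises, then differentiate that surrogate with respect to the input bytes and mutate the most influential bytes. The sketch below is a toy illustration of this gradient-guided mutation loop, not the paper's implementation; the "program", the one-layer surrogate, and all names (`program_coverage`, `mutate_toward_edge`, sizes, learning rate) are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

N_BYTES = 16   # length of fuzzing inputs (bytes scaled to [0, 1])
N_EDGES = 4    # number of coverage edges tracked

def program_coverage(x):
    """Toy target program: edge i is covered iff input byte i exceeds 0.5."""
    return (x[:N_EDGES] > 0.5).astype(np.float64)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Train a one-layer sigmoid surrogate (the "smooth approximation" of
# the program) on randomly generated inputs and their observed coverage.
W = rng.normal(0.0, 0.01, size=(N_EDGES, N_BYTES))
b = np.zeros(N_EDGES)
for _ in range(3000):
    x = rng.random(N_BYTES)
    y = program_coverage(x)
    p = sigmoid(W @ x + b)
    err = p - y                      # gradient of the cross-entropy loss
    W -= 0.05 * np.outer(err, x)
    b -= 0.05 * err

def mutate_toward_edge(x, edge, k=1):
    """Gradient-guided mutation: push the k bytes with the largest
    gradient magnitude in the direction that raises the surrogate's
    predicted probability of covering `edge`."""
    p = sigmoid(W @ x + b)[edge]
    g = (p * (1.0 - p)) * W[edge]    # d p_edge / d x for this model
    hot = np.argsort(-np.abs(g))[:k]
    x2 = x.copy()
    x2[hot] = np.where(g[hot] > 0, 1.0, 0.0)
    return x2

seed = np.zeros(N_BYTES)             # a seed input that covers no edges
mutant = mutate_toward_edge(seed, edge=0)
```

In a real NPS fuzzer the surrogate is a deeper network, coverage comes from instrumented executions, and mutants are fed back into the corpus when they reach new edges; the benchmark in this work measures how much that gradient guidance actually contributes compared to standard gray-box mutation.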

Presented at ESEC/FSE 2023.