Conference Paper: Evaluating and Improving Neural Program-Smoothing-based Fuzzing

Title: Evaluating and Improving Neural Program-Smoothing-based Fuzzing
Authors: Wu, Mingyuan; Jiang, Ling; Xiang, Jiahong; Zhang, Yuqun; Yang, Guowei; Ma, Huixin; Nie, Sen; Wu, Shi; Cui, Heming; Zhang, Lingming
Issue Date: 20-Sep-2022
Abstract:

Fuzzing is nowadays commonly modeled as an optimization problem, e.g., maximizing code coverage under a given time budget via typical search-based solutions such as evolutionary algorithms. However, such solutions are widely argued to use computing resources inefficiently, i.e., to perform many ineffective mutations. To address this issue, two neural program-smoothing-based fuzzers, Neuzz and MTFuzz, have recently been proposed to approximate program branching behaviors via neural network models that take the byte sequence of a seed as input and output vectors representing program branching behaviors. Assuming that mutating the bytes with larger gradients better explores branching behaviors, both fuzzers develop strategies to mutate such bytes to generate new seeds as test cases. However, although they were shown to be effective in the original papers, they were evaluated on only a limited dataset. In addition, it remains unclear how their key technical components, as well as other factors, impact fuzzing performance. To further investigate neural program-smoothing-based fuzzing, we first construct a large-scale benchmark suite with a total of 28 popular open-source projects. We then extensively evaluate Neuzz and MTFuzz on these benchmarks. The results suggest that their edge coverage performance can be unstable. Moreover, neither their neural network models nor their mutation strategies are consistently effective, and the power of their gradient-guidance mechanisms is compromised. Motivated by these findings, we propose a simple technique, PreFuzz, which improves neural program-smoothing-based fuzzers with a resource-efficient edge selection mechanism that enhances gradient guidance and a probabilistic byte selection mechanism that further boosts mutation effectiveness. Our evaluation results indicate that PreFuzz can significantly increase the edge coverage of Neuzz/MTFuzz; they also reveal multiple practical guidelines for future research on neural program-smoothing-based fuzzing.
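To make the gradient-guidance idea in the abstract concrete, below is a minimal, hypothetical sketch in Python/PyTorch. Everything in it is an illustrative assumption rather than the authors' implementation: the surrogate network shape, the fixed seed length, and the helper names (byte_gradients, neuzz_style_mutants, prefuzz_style_mutants) are invented for this sketch, and the real fuzzers train the surrogate model on coverage observed from instrumented executions before using its gradients.

```python
# Hypothetical sketch of neural program-smoothing-based mutation, loosely in
# the style of Neuzz/MTFuzz and PreFuzz. All names, sizes, and the untrained
# surrogate below are illustrative assumptions, not the authors' code.
import random
import torch
import torch.nn as nn

SEED_LEN = 512    # assumed fixed seed length in bytes
NUM_EDGES = 1024  # assumed number of tracked program edges

# Surrogate "smooth" model: seed bytes -> predicted edge-coverage vector.
# In the real fuzzers this model is trained on observed coverage; it is
# left untrained here purely for illustration.
model = nn.Sequential(
    nn.Linear(SEED_LEN, 2048), nn.ReLU(),
    nn.Linear(2048, NUM_EDGES), nn.Sigmoid(),
)

def byte_gradients(seed: bytes, target_edge: int) -> torch.Tensor:
    """Gradient magnitude of one edge's predicted coverage w.r.t. each byte."""
    assert len(seed) == SEED_LEN  # sketch assumes fixed-length seeds
    x = torch.tensor(list(seed), dtype=torch.float32) / 255.0
    x.requires_grad_(True)
    model(x)[target_edge].backward()
    return x.grad.abs()

def neuzz_style_mutants(seed: bytes, target_edge: int,
                        k: int = 16, n: int = 8) -> list[bytes]:
    """Deterministically mutate the k highest-gradient bytes (Neuzz-style)."""
    top = byte_gradients(seed, target_edge).argsort(descending=True)[:k]
    mutants = []
    for _ in range(n):
        m = bytearray(seed)
        for i in top.tolist():
            m[i] = random.randrange(256)  # random replacement value
        mutants.append(bytes(m))
    return mutants

def prefuzz_style_mutants(seed: bytes, target_edge: int,
                          k: int = 16, n: int = 8) -> list[bytes]:
    """Probabilistic byte selection: sample bytes with probability
    proportional to gradient magnitude instead of always taking the top-k,
    so lower-ranked but still promising bytes can also be mutated."""
    g = byte_gradients(seed, target_edge) + 1e-9  # avoid all-zero weights
    probs = (g / g.sum()).tolist()
    mutants = []
    for _ in range(n):
        m = bytearray(seed)
        for i in random.choices(range(len(seed)), weights=probs, k=k):
            m[i] = random.randrange(256)
        mutants.append(bytes(m))
    return mutants
```

The contrast between the two mutation helpers mirrors the abstract's high-level distinction: Neuzz/MTFuzz deterministically favor the top-gradient bytes, while PreFuzz's probabilistic byte selection spreads mutations across bytes in proportion to gradient magnitude to boost mutation effectiveness.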


Persistent Identifier: http://hdl.handle.net/10722/333859
ISI Accession Number ID: WOS:000832185400069

 

DC Field: Value
dc.contributor.author: Wu, Mingyuan
dc.contributor.author: Jiang, Ling
dc.contributor.author: Xiang, Jiahong
dc.contributor.author: Zhang, Yuqun
dc.contributor.author: Yang, Guowei
dc.contributor.author: Ma, Huixin
dc.contributor.author: Nie, Sen
dc.contributor.author: Wu, Shi
dc.contributor.author: Cui, Heming
dc.contributor.author: Zhang, Lingming
dc.date.accessioned: 2023-10-06T08:39:40Z
dc.date.available: 2023-10-06T08:39:40Z
dc.date.issued: 2022-09-20
dc.identifier.uri: http://hdl.handle.net/10722/333859
dc.description.abstract: Fuzzing is nowadays commonly modeled as an optimization problem, e.g., maximizing code coverage under a given time budget via typical search-based solutions such as evolutionary algorithms. However, such solutions are widely argued to use computing resources inefficiently, i.e., to perform many ineffective mutations. To address this issue, two neural program-smoothing-based fuzzers, Neuzz and MTFuzz, have recently been proposed to approximate program branching behaviors via neural network models that take the byte sequence of a seed as input and output vectors representing program branching behaviors. Assuming that mutating the bytes with larger gradients better explores branching behaviors, both fuzzers develop strategies to mutate such bytes to generate new seeds as test cases. However, although they were shown to be effective in the original papers, they were evaluated on only a limited dataset. In addition, it remains unclear how their key technical components, as well as other factors, impact fuzzing performance. To further investigate neural program-smoothing-based fuzzing, we first construct a large-scale benchmark suite with a total of 28 popular open-source projects. We then extensively evaluate Neuzz and MTFuzz on these benchmarks. The results suggest that their edge coverage performance can be unstable. Moreover, neither their neural network models nor their mutation strategies are consistently effective, and the power of their gradient-guidance mechanisms is compromised. Motivated by these findings, we propose a simple technique, PreFuzz, which improves neural program-smoothing-based fuzzers with a resource-efficient edge selection mechanism that enhances gradient guidance and a probabilistic byte selection mechanism that further boosts mutation effectiveness. Our evaluation results indicate that PreFuzz can significantly increase the edge coverage of Neuzz/MTFuzz; they also reveal multiple practical guidelines for future research on neural program-smoothing-based fuzzing.
dc.language: eng
dc.relation.ispartof: 44th International Conference on Software Engineering (ICSE 2022), 22-27 May 2022, Pittsburgh
dc.title: Evaluating and Improving Neural Program-Smoothing-based Fuzzing
dc.type: Conference_Paper
dc.identifier.doi: 10.1145/3510003.3510089
dc.identifier.scopus: eid_2-s2.0-85133542481
dc.identifier.volume: 2022-May
dc.identifier.spage: 847
dc.identifier.epage: 858
dc.identifier.isi: WOS:000832185400069
