KendyZ
Record of a Fuzzing tcpdump + DFSan Problem Solving Process

Cause#

Here's what happened.

I wanted to reproduce the results of the paper NestFuzz, so I found the GitHub repository provided by the authors: fdu-sec/NestFuzz. Among the potentially vulnerable programs evaluated in the paper's experiments, I was most interested in tcpdump, while the README in the repository uses tiff as its example. The configuration and build process should be broadly similar, but some subtle differences were to be expected.

Following the instructions in the README, I first compiled the fuzzer for NestFuzz:

cd NestFuzz
make

At this point, the compiled afl-fuzz binary has been generated in the NestFuzz directory. Next, I needed to compile the modeling component of NestFuzz, located in the ipl-modeling directory. Following another README inside that directory, I set up llvm-10 and the Rust toolchain for the build.

  1. Build llvm-10
apt-get install -y xz-utils cmake ninja-build gcc g++ python3 doxygen python3-distutils
wget https://github.com/llvm/llvm-project/releases/download/llvmorg-10.0.0/llvm-project-10.0.0.tar.xz
tar xf llvm-project-10.0.0.tar.xz
mkdir llvm-10.0.0-install
cd llvm-project-10.0.0
mkdir build
cd build
CC=gcc CXX=g++ cmake -G Ninja -DCMAKE_BUILD_TYPE=Release \
  -DLLVM_TARGETS_TO_BUILD=X86 \
  -DLLVM_ENABLE_PROJECTS="clang;clang-tools-extra;libcxx;libcxxabi;lldb;compiler-rt" \
  -DCMAKE_INSTALL_PREFIX=/path/to/llvm-10.0.0-install \
  -DCMAKE_EXE_LINKER_FLAGS="-lstdc++" ../llvm
ninja install
# install rust
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"

# other dependencies
apt install git zlib1g-dev python-is-python3 -y

# llvm
export LLVM_HOME=/path/to/llvm-10.0.0-install
export PATH=$LLVM_HOME/bin:$PATH
export LD_LIBRARY_PATH=$LLVM_HOME/lib:$LD_LIBRARY_PATH

./build.sh

Then I needed to compile the target. I cloned the latest version from the-tcpdump-group/tcpdump on the master branch and built the executable programs for fuzzing and modeling.

  1. Build the target for fuzzing
cp -r tcpdump tcpdump-fuzzer
cd tcpdump-fuzzer
./autogen.sh
CC=/path/to/NestFuzz/afl-gcc CXX=/path/to/NestFuzz/afl-g++ ./configure
make -j$(nproc)
  2. Next, build the target for modeling. Unlike the previous step, CC and CXX here must point to the test-clang and test-clang++ programs generated in the modeling directory. These two programs wrap clang and inject extra arguments that load DFSan's pass, enabling dynamic data flow analysis.

[screenshot: the test-clang wrapper]

Problem Occurs#

From here, things began to develop in a direction different from the steps provided in the README.

cp -r tcpdump tcpdump-model
cd tcpdump-model
./autogen.sh
CC=/path/to/NestFuzz/ipl-modeling/install/test-clang CXX=/path/to/NestFuzz/ipl-modeling/install/test-clang++ ./configure
make -j$(nproc)

During the configure step, an error occurred:

[screenshot: configure error output]

The last error in config.log is as follows:

[screenshot: config.log excerpt showing an undefined reference to dfs$pcap_loop]

During linking, ld reported that the symbol dfs$pcap_loop does not exist. Initially I thought dfs might stand for depth-first search and carry no special meaning, so that the missing symbol was effectively just the pcap_loop variable or function. In fact, pcap_loop is a function provided by the-tcpdump-group/libpcap, so it seemed likely that the linker simply could not find libpcap on the machine.

I installed libpcap-dev with apt; afterwards pcap-config could generate compilation options, indicating that libpcap was installed correctly. However, configure still reported the same error.

I struggled for an entire day, even switching the system environment from Debian to Ubuntu as used in the paper, but it still didn't work. Only then did I realize that dfs might be an abbreviation of DFSan, or at least related to it.

The two passes loaded by clang, libLoopHandlingPass and libDFSanPass, correspond to LoopHandlingPass.cpp and DataFlowSanitizer.cpp in the source code under the llvm-10 directory. A quick search reveals that it is the latter that prepends dfs$ to the pcap_loop function name.

[screenshot: DataFlowSanitizer.cpp code that prepends dfs$ to function names]

DFSan Principle#

DFSan implements dynamic data flow analysis as an LLVM pass, rewriting code at function calls, operators, and other locations to perform taint tracking. Taint tracking tags specified variables with taint labels, so that whenever a tagged variable's value is used by another instruction during execution (passed as a function argument, assigned to another variable, added to a number, and so on), the function's return value or the affected variable is tagged as well.

The principle behind DFSan's taint tracking is actually quite simple (see LLVM's documentation on DFSan). For function parameters and return values, DFSan rewrites the function's signature, appending one label parameter per original parameter to the end of the parameter list and additionally returning a label for the return value.

DFSan allows developers to supply an ABI list that specifies the rewriting behavior for particular functions:

[screenshots: DFSan documentation on the ABI list]
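For reference, an ABI list uses the sanitizer special-case-list format; the entries below mirror the examples in the DFSan documentation:

```text
# fun:<name>=<category>
fun:main=uninstrumented    # called by the C runtime using the native ABI
fun:main=discard           # and its return value carries no label
fun:tolower=uninstrumented
fun:tolower=functional     # pure function: return label = union of argument labels
fun:memcpy=uninstrumented
fun:memcpy=custom          # DFSan instead calls a hand-written wrapper, __dfsw_memcpy
```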

Regarding the reason for adding the dfs$ prefix, DFSan's documentation also explains:

Note: LLVM-10/11 adds the dfs$ prefix, while recent LLVM releases (e.g. LLVM-19) have changed to appending the suffix .dfsan.

[screenshot: DFSan documentation explaining the dfs$ prefix]

The linker matches only symbol names during linking; it does not "carefully" select among function definitions the way the compiler does with overloading (although some compilers encode the parameter list and return type into the symbol name in an abbreviated form). For a given function name, then, there is no way to tell DFSan-processed definitions apart from unprocessed ones. If DFSan did not rewrite symbol names, instrumented code calling the apt-installed pcap_loop as if it had been processed by DFSan would inevitably pass the wrong number of parameters.

In other words, when clang compiles the tcpdump source into LLVM IR and DataFlowSanitizer processes it, every reference to pcap_loop is rewritten to dfs$pcap_loop, while the apt-installed libpcap exports only the pcap_loop symbol, not dfs$pcap_loop, because that libpcap was never processed by DFSan when it was compiled.

It is also possible to simulate the behavior of the configure script and replicate this process. The configure script generates a small source file named conftest.c, which it compiles and links to check whether a given function exists in a given library. When checking for the pcap_loop function in libpcap, the content of conftest.c is as follows:

/* confdefs.h */
#define PACKAGE_NAME "tcpdump"
#define PACKAGE_TARNAME "tcpdump"
#define PACKAGE_VERSION "5.0.0-PRE-GIT"
#define PACKAGE_STRING "tcpdump 5.0.0-PRE-GIT"
#define PACKAGE_BUGREPORT "https://github.com/the-tcpdump-group/tcpdump/issues"
#define PACKAGE_URL ""
#define STDC_HEADERS 1
#define HAVE_SYS_TYPES_H 1
#define HAVE_SYS_STAT_H 1
#define HAVE_STDLIB_H 1
#define HAVE_STRING_H 1
#define HAVE_MEMORY_H 1
#define HAVE_STRINGS_H 1
#define HAVE_INTTYPES_H 1
#define HAVE_STDINT_H 1
#define HAVE_UNISTD_H 1
#define HAVE_FCNTL_H 1
#define HAVE_NET_IF_H 1

/* end confdefs.h.  */
/* Define $2 to an innocuous variant, in case <limits.h> declares $2.
   For example, HP-UX 11i <limits.h> declares gettimeofday.  */
#define pcap_loop innocuous_pcap_loop

/* System header to define __stub macros and hopefully few prototypes,
    which can conflict with char $2 (); below.
    Prefer <limits.h> to <assert.h> if __STDC__ is defined, since
    <limits.h> exists even on freestanding compilers.  */

#ifdef __STDC__
# include <limits.h>
#else
# include <assert.h>
#endif

#undef pcap_loop

/* Override any GCC internal prototype to avoid an error.
   Use char because int might match the return type of a GCC
   builtin and then its argument prototype would still apply.  */
#ifdef __cplusplus
extern "C"
#endif
char pcap_loop();
/* The GNU C library defines this for functions which it implements
    to always fail with ENOSYS.  Some functions are actually named
    something starting with __ and the normal name is an alias.  */
#if defined __stub_pcap_loop || defined __stub___pcap_loop
choke me
#endif

int
main ()
{
return pcap_loop();
  ;
  return 0;
}

First, compile it into LLVM textual IR, then run the passes with opt to produce the DFSan-processed IR, and open the result in an editor to see that the reference to pcap_loop has been rewritten:

clang -S -emit-llvm conftest.c -o conftest.ll
opt -load ../ipl-modeling/install/pass/libLoopHandlingPass.so \
    -load ../ipl-modeling/install/pass/libDFSanPass.so \
    -chunk-dfsan-abilist=../ipl-modeling/install/rules/dfsan_abilist.txt \
    -chunk-dfsan-abilist=../ipl-modeling/install/rules/angora_abilist.txt \
    -dfsan_pass -S conftest.ll -o conftest_dfsan.ll

[screenshot: conftest_dfsan.ll with the pcap_loop reference rewritten to dfs$pcap_loop]

Solution#

Download the libpcap source code and use libDFSanPass to rewrite the definition of pcap_loop during compilation. Here I could directly reuse the modeling module's test-clang, and since the libpcap repository ships a build.sh, there is no need to run configure and make manually.

CC=/home/ubuntu/NestFuzz/ipl-modeling/install/test-clang CXX=/home/ubuntu/NestFuzz/ipl-modeling/install/test-clang++ ./build.sh

After compiling libpcap, I can use nm to confirm whether the full name of the pcap_loop symbol has changed.

[screenshot: nm output showing the renamed pcap_loop symbol]

Since this libpcap build exists only for the subsequent tcpdump compilation, there is no need to run make install: tcpdump's configure script looks for a libpcap directory in the parent directory. However, the --disable-local-libpcap flag in the build.sh under the tcpdump directory must be removed so that configure can find and use the libpcap from the parent directory. Recompiling tcpdump with test-clang then works!
