ReF Decompile

This model is a LoRA adapter fine-tuned from LLM4Binary/llm4decompile-6.7b-v1.5.
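
Because the checkpoint is a LoRA adapter rather than a full set of weights, it is loaded on top of the base model, for example with `peft`. The snippet below is a minimal sketch under that assumption; the LLM4Decompile-style prompt template is also an assumption, since this card does not document the exact input format used during training.

```python
# Minimal sketch: load the ReF-Decompile LoRA adapter on top of the base model.
# The prompt template mirrors the LLM4Decompile convention and is an assumption;
# check the project repository for the exact format used during training.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "LLM4Binary/llm4decompile-6.7b-v1.5"
adapter_id = "ylfeng/ReF-Decompile-lora"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# Assumed LLM4Decompile-style prompt: disassembled function in, C source out.
asm_func = "..."  # disassembly of the target function (e.g. from objdump)
prompt = (
    "# This is the assembly code:\n"
    f"{asm_func}\n"
    "# What is the source code?\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```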

Results

Re-exec. = Re-executability Rate (%); Read. = Readability score (#); O0-O3 are compiler optimization levels.

| Model / Metric | Re-exec. O0 | Re-exec. O1 | Re-exec. O2 | Re-exec. O3 | Re-exec. AVG | Read. O0 | Read. O1 | Read. O2 | Read. O3 | Read. AVG |
|---|---|---|---|---|---|---|---|---|---|---|
| **Rule-Based Decompiler** | | | | | | | | | | |
| Ghidra | 34.76 | 16.46 | 15.24 | 14.02 | 20.12 | 2.98 | 2.41 | 2.52 | 2.38 | 2.57 |
| **Refine-Based Method** | | | | | | | | | | |
| GPT-4o | 46.95 | 34.15 | 28.66 | 31.10 | 35.22 | 2.82 | 2.35 | 2.29 | 2.31 | 2.44 |
| LLM4Decompile-Ref | 74.39 | 46.95 | 47.56 | 42.07 | 52.74 | 4.08 | 3.38 | 3.34 | 3.19 | 3.50 |
| **End-to-End Method** | | | | | | | | | | |
| LLM4Decompile-End | 69.51 | 44.51 | 39.63 | 38.41 | 48.02 | 4.07 | 3.46 | 3.40 | 3.23 | 3.54 |
| FAE Decompile | 67.68 | 48.78 | 45.73 | 42.07 | 51.07 | 3.94 | 3.46 | 3.40 | 3.25 | 3.51 |
| FAE Decompile+SCC | 70.24 | 48.54 | 47.56 | 43.29 | 52.41 | 3.97 | 3.48 | 3.41 | 3.23 | 3.52 |
| ReF Decompile | 85.37 | 56.10 | 51.83 | 52.43 | 61.43 | 4.13 | 3.60 | 3.54 | 3.49 | 3.69 |
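
ReF Decompile tops both metrics at every optimization level. In LLM4Decompile-style benchmarks, re-executability is typically checked by re-compiling the decompiled function together with the benchmark's test harness and running the resulting binary. The sketch below illustrates that idea only; the file names, compiler flags, and timeout are assumptions, not the actual evaluation harness.

```python
# Rough illustration of a re-executability check: compile the decompiled
# function together with a test harness and see if the binary runs cleanly.
# Paths, flags, and the timeout are assumptions for illustration only.
import os
import subprocess
import tempfile


def is_reexecutable(decompiled_c: str, test_harness_c: str, timeout_s: int = 10) -> bool:
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "candidate.c")
        exe = os.path.join(tmp, "candidate")
        # Concatenate the decompiled function with the harness that calls it
        # and asserts on its outputs.
        with open(src, "w") as f:
            f.write(decompiled_c + "\n" + test_harness_c)
        # Compilation failure counts as non-re-executable.
        compile_proc = subprocess.run(["gcc", src, "-o", exe, "-lm"], capture_output=True)
        if compile_proc.returncode != 0:
            return False
        try:
            run_proc = subprocess.run([exe], capture_output=True, timeout=timeout_s)
        except subprocess.TimeoutExpired:
            return False
        # A zero exit code means every assertion in the harness passed.
        return run_proc.returncode == 0
```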

Resources
