Upload Glaurung Large 001 - RoBERTa large model for binary analysis
# Glaurung Large 001

A RoBERTa-based masked language model trained on binary executable files for security research and binary analysis. Part of the [Glaurung](https://github.com/mjbommar/glaurung) project: a modern reverse engineering framework with first-class AI integration.

## Overview

**Glaurung Large 001** is a transformer model specifically designed for understanding binary executable files. It uses a custom BPE (Byte Pair Encoding) tokenizer trained on multi-byte patterns from various binary formats across multiple architectures (x86-64, ARM64, etc.) and operating systems (Linux, Alpine, Ubuntu, Debian, Rocky).

This is the **large variant** (371M parameters, 24 layers), offering enhanced understanding of binary patterns. For faster inference, see [glaurung-small-001](https://huggingface.co/mjbommar/glaurung-small-001) (160M parameters).
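To make the tokenizer idea concrete, here is a toy byte-level BPE sketch (illustrative only, not the actual binary-tokenizer-005 training code) showing how a recurring multi-byte pattern, such as the common x86-64 function prologue bytes `55 48 89 e5` (`push rbp; mov rbp, rsp`), collapses into a single token. The sample data and merge count are assumptions for illustration.

```python
# Toy byte-level BPE: repeatedly merge the most frequent adjacent pair.
# Illustrative only -- not the actual binary-tokenizer-005 implementation.
from collections import Counter

def train_bpe(data: bytes, num_merges: int):
    """Learn `num_merges` pair-merge rules over a byte sequence."""
    seq = [bytes([b]) for b in data]          # start from single-byte tokens
    merges = []
    for _ in range(num_merges):
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        (a, b), _count = pairs.most_common(1)[0]
        merges.append((a, b))
        merged, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and seq[i] == a and seq[i + 1] == b:
                merged.append(a + b)          # fuse the pair into one token
                i += 2
            else:
                merged.append(seq[i])
                i += 1
        seq = merged
    return merges, seq

# A buffer dominated by the x86-64 prologue `push rbp; mov rbp, rsp`.
prologue = bytes.fromhex("554889e5")
data = prologue * 8 + b"\x90\xc3"             # padded with nop; ret

merges, tokens = train_bpe(data, num_merges=3)
print(tokens[:3])  # after three merges the prologue is a single 4-byte token
```

After three merges, each repetition of the prologue is represented by one token instead of four, which is the efficiency gain the multi-byte tokenizer is after.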
### Key Features

- **Custom Binary Tokenizer**: BPE tokenizer that creates efficient multi-byte tokens from binary data
- **Binary-Aware**: Trained on actual executable files, not hex strings

- **Attention Heads**: 16
- **Intermediate Size**: 4096
- **Vocabulary Size**: 65,536 tokens
- **Tokenizer**: [binary-tokenizer-005](https://huggingface.co/mjbommar/binary-tokenizer-005)
- **Max Position Embeddings**: 520
- **Parameters**: ~371M
- **Special Tokens**:
  - `<|mask|>` (5): Mask token for MLM
  - `<|unk|>` (6): Unknown token
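As a sanity check, the architecture bullets above are enough to roughly reproduce the quoted ~371M parameter figure. Note the hidden size of 1024 is an inferred assumption (16 heads × 64 per head, and intermediate size 4096 = 4 × 1024); it is not stated in this card.

```python
# Rough RoBERTa parameter count from the architecture bullets above.
# hidden = 1024 is an inferred assumption (16 heads x 64; intermediate = 4 x hidden).
vocab, max_pos, layers, hidden, inter = 65_536, 520, 24, 1024, 4096

embeddings = vocab * hidden + max_pos * hidden   # token + position embeddings
attention  = 4 * (hidden * hidden + hidden)      # Q, K, V, output projections
ffn        = hidden * inter + inter + inter * hidden + hidden
per_layer  = attention + ffn                     # layer norms etc. omitted
total      = embeddings + layers * per_layer

print(f"~{total / 1e6:.0f}M parameters")         # close to the quoted ~371M
```

The estimate lands within about 1% of the stated figure; the remainder is small terms such as layer norms and the LM head bias.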
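Only the mask token's ID (5) is documented above, but that is enough to sketch how RoBERTa-style masked-language-model inputs are prepared. The 15% masking rate is the standard RoBERTa default, and the input IDs below are placeholders, not real binary-tokenizer-005 output.

```python
# RoBERTa-style MLM masking sketch, using the documented mask id (5).
# Input ids below are placeholders, not real binary-tokenizer-005 tokens.
import random

MASK_ID = 5  # `<|mask|>` per the special-token list above

def mask_for_mlm(ids, p=0.15, seed=0):
    """Replace ~p of positions with MASK_ID. Labels keep the original id
    at masked positions and -100 (the usual ignore index) elsewhere."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in ids:
        if rng.random() < p:
            masked.append(MASK_ID)
            labels.append(tok)     # model is trained to recover this id
        else:
            masked.append(tok)
            labels.append(-100)    # position excluded from the MLM loss
    return masked, labels

ids = [1000, 1001, 1002, 1003, 1004, 1005, 1006, 1007]
masked, labels = mask_for_mlm(ids)
print(masked)
print(labels)
```

At inference time the same idea applies in reverse: place `<|mask|>` over the bytes you want the model to predict and read off the highest-scoring vocabulary entries.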

## Glaurung Ecosystem

This model is part of the **Glaurung** project ecosystem:

### 🔧 Main Project

- **[Glaurung](https://github.com/mjbommar/glaurung)** - A modern reverse engineering framework designed to replace Ghidra, with first-class AI integration throughout the analysis pipeline. Built with Rust's performance and Python's accessibility, it integrates AI agents at every level, from format detection to decompilation.

### 🤖 Model Family

- **[glaurung-large-001](https://huggingface.co/mjbommar/glaurung-large-001)** (this model) - 371M parameters, 24 layers
- **[glaurung-small-001](https://huggingface.co/mjbommar/glaurung-small-001)** - 160M parameters, 12 layers, faster inference

### 🔤 Tokenizer

- **[binary-tokenizer-005](https://huggingface.co/mjbommar/binary-tokenizer-005)** - 65K-vocabulary BPE tokenizer trained on multi-byte patterns

## Performance Comparison vs Glaurung Small 001

| Metric | Glaurung Small 001 | Glaurung Large 001 | Improvement |