bhogan committed on
Commit 4e9e487 · verified · 1 Parent(s): d5e22a5

Update README.md

Files changed (1): README.md (+12 −2)
README.md CHANGED
@@ -12,7 +12,7 @@ base_model:
 **qqWen-32B-RL** is a 32-billion parameter language model specifically designed for advanced reasoning and code generation in the Q programming language. Built upon the robust Qwen 2.5 architecture, this model has undergone a comprehensive three-stage training process: pretraining, supervised fine-tuning (SFT), and reinforcement learning (RL) for the Q programming language.
 **qqWen-32B-RL** is a reasoning model.
 
-**Associated Technical Report**: [Link to paper will be added here]
+**Associated Technical Report**: [Report](https://arxiv.org/abs/2508.06813)
 
 ## 🔤 About Q Programming Language
 
@@ -35,4 +35,14 @@ Q is a high-performance, vector-oriented programming language developed by Kx Sy
 
 If you use this model in your research or applications, please cite our technical report.
 
-```
+```
+@misc{hogan2025technicalreportfullstackfinetuning,
+      title={Technical Report: Full-Stack Fine-Tuning for the Q Programming Language},
+      author={Brendan R. Hogan and Will Brown and Adel Boyarsky and Anderson Schneider and Yuriy Nevmyvaka},
+      year={2025},
+      eprint={2508.06813},
+      archivePrefix={arXiv},
+      primaryClass={cs.LG},
+      url={https://arxiv.org/abs/2508.06813},
+}
+```