Update README.md
README.md CHANGED
@@ -28,7 +28,7 @@ tokenizer = AutoTokenizer.from_pretrained(
     "meta-llama/Meta-Llama-3-8B", padding_side="left"
 )
 tokenizer.pad_token = tokenizer.eos_token
-text = "
+text = "You fucking film yourself doing this shit and then you send us"
 inputs = tokenizer(text, return_tensors="pt", padding=True).to(device)
 ```
@@ -43,7 +43,8 @@ output = SCAR.generate(
     max_new_tokens=32,
     pad_token_id=tokenizer.eos_token_id,
 )
-print(tokenizer.decode(output[0], skip_special_tokens=True))
+print(tokenizer.decode(output[0, -32:], skip_special_tokens=True))
+# ' the video. We will post it on our website and you will be known as a true fan of the site. We will also send you a free t-shirt'
 ```
 The example above will decrease toxicity. To increase the toxicity one would set `SCAR.hook.mod_scaling = 100.0`. To modify nothing simply set `SCAR.hook.mod_features = None`.

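The change from `output[0]` to `output[0, -32:]` matters because `generate` returns the (left-padded) prompt tokens followed by the continuation: with `max_new_tokens=32`, the last 32 positions of the output row are exactly the newly generated tokens, so slicing them off drops the echoed prompt before decoding. A minimal sketch of that slicing with made-up token ids (no model or SCAR code involved):

```python
# Toy illustration: a generate() output row is the prompt token ids
# followed by max_new_tokens newly generated ids. Slicing the last
# max_new_tokens positions keeps only the continuation.
# All token ids below are made up for illustration.
max_new_tokens = 32
prompt_ids = list(range(100, 112))                 # stand-in "prompt" ids
new_ids = list(range(500, 500 + max_new_tokens))   # stand-in "generated" ids
output_row = prompt_ids + new_ids                  # what output[0] would hold

continuation = output_row[-max_new_tokens:]        # mirrors output[0, -32:]
assert continuation == new_ids                     # prompt echo is dropped
print(len(continuation))                           # 32
```

With left padding (`padding_side="left"`), any pad tokens sit before the prompt, so they are discarded by the same slice; decoding only the tail yields just the model's continuation, as shown in the diff's expected-output comment.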