|
---
license: mit
language:
- en
- zh
base_model:
- THUDM/GLM-4.1V-9B-Thinking
pipeline_tag: image-text-to-text
library_name: transformers
tags:
- reasoning
- abliterated
- uncensored
---
|
|
|
# huihui-ai/Huihui-GLM-4.1V-9B-Thinking-abliterated |
|
|
|
This is an uncensored version of [THUDM/GLM-4.1V-9B-Thinking](https://huggingface.co/THUDM/GLM-4.1V-9B-Thinking) created with abliteration (see [remove-refusals-with-transformers](https://github.com/Sumandora/remove-refusals-with-transformers) for details).

Abliteration is a crude, proof-of-concept approach to removing refusals from an LLM without using TransformerLens.

Only the text (language) part of the model was processed; the image (vision) part was left unchanged.
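In broad strokes, abliteration estimates a "refusal direction" in the residual stream from the difference between mean activations on harmful and harmless prompts, then removes the component along that direction from the model's activations (or weights). The snippet below is a minimal, illustrative sketch of that projection step only, with made-up function names; it is not the exact code used to produce this model.

```python
import torch

def refusal_direction(harmful_acts: torch.Tensor, harmless_acts: torch.Tensor) -> torch.Tensor:
    # Difference of mean activations at a chosen layer, normalised to a
    # unit vector: the approximate "refusal direction".
    direction = harmful_acts.mean(dim=0) - harmless_acts.mean(dim=0)
    return direction / direction.norm()

def ablate(hidden_states: torch.Tensor, direction: torch.Tensor) -> torch.Tensor:
    # Project each hidden state onto the refusal direction and subtract
    # that component, leaving the rest of the representation untouched.
    coeff = hidden_states @ direction
    return hidden_states - coeff.unsqueeze(-1) * direction
```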
|
|
|
|
|
## Usage |
|
You can use this model in your applications by loading it with Hugging Face's `transformers` library: |
|
|
|
```python
from transformers import AutoProcessor, Glm4vForConditionalGeneration, BitsAndBytesConfig
from PIL import Image
import torch

model_id = "huihui-ai/Huihui-GLM-4.1V-9B-Thinking-abliterated"

# 4-bit quantization so the 9B model fits on a single consumer GPU.
quant_config_4 = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
    llm_int8_enable_fp32_cpu_offload=True,
)

model = Glm4vForConditionalGeneration.from_pretrained(
    model_id,
    device_map="auto",
    quantization_config=quant_config_4,
    torch_dtype=torch.bfloat16,
).eval()

processor = AutoProcessor.from_pretrained(model_id, use_fast=True)

# Sample image from
# https://upload.wikimedia.org/wikipedia/commons/f/fa/Grayscale_8bits_palette_sample_image.png
# saved into a local directory named after the model id (any local image path works).
image_path = model_id + "/Grayscale_8bits_palette_sample_image.png"

with Image.open(image_path) as image:
    messages = [
        {
            "role": "user",
            "content": [
                {"type": "image", "image": image},
                {"type": "text", "text": "Describe this image in detail."},
            ],
        }
    ]

    inputs = processor.apply_chat_template(
        messages,
        tokenize=True,
        add_generation_prompt=True,
        return_dict=True,
        return_tensors="pt",
    ).to(model.device)

with torch.inference_mode():
    generated_ids = model.generate(**inputs, max_new_tokens=8192)

# Decode only the newly generated tokens; keep special tokens so the
# model's reasoning markers remain visible.
output_text = processor.decode(generated_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=False)
print(output_text)
```
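The example reads the sample image from a local directory named after the model id. If that file is not already on disk, a small helper like the following (an assumption about where you want to store it, using the Wikimedia URL from the comment above) downloads it into the expected location:

```python
import os
import requests

model_id = "huihui-ai/Huihui-GLM-4.1V-9B-Thinking-abliterated"
image_url = "https://upload.wikimedia.org/wikipedia/commons/f/fa/Grayscale_8bits_palette_sample_image.png"

# Create the local directory the usage example expects and save the sample image into it.
os.makedirs(model_id, exist_ok=True)
with open(os.path.join(model_id, "Grayscale_8bits_palette_sample_image.png"), "wb") as f:
    f.write(requests.get(image_url, timeout=30).content)
```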
|
|
|
### Usage Warnings |
|
|
|
|
|
- **Risk of Sensitive or Controversial Outputs**: This model’s safety filtering has been significantly reduced, potentially generating sensitive, controversial, or inappropriate content. Users should exercise caution and rigorously review generated outputs. |
|
|
|
- **Not Suitable for All Audiences**: Due to limited content filtering, the model’s outputs may be inappropriate for public settings, underage users, or applications requiring high security. |
|
|
|
- **Legal and Ethical Responsibilities**: Users must ensure their usage complies with local laws and ethical standards. Generated content may carry legal or ethical risks, and users are solely responsible for any consequences. |
|
|
|
- **Research and Experimental Use**: It is recommended to use this model for research, testing, or controlled environments, avoiding direct use in production or public-facing commercial applications. |
|
|
|
- **Monitoring and Review Recommendations**: Users are strongly advised to monitor model outputs in real time and conduct manual reviews when necessary to prevent the dissemination of inappropriate content (see the sketch after this list for one simple way to gate outputs).
|
|
|
- **No Default Safety Guarantees**: Unlike standard models, this model has not undergone rigorous safety optimization. huihui.ai bears no responsibility for any consequences arising from its use. |
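
As one illustration of the monitoring advice above, generated text can be passed through a review gate before it is shown to anyone. The blocklist check below is a deliberately simplistic, hypothetical placeholder; a real deployment would use a proper moderation model or human review.

```python
# Hypothetical placeholder: a real deployment should use a dedicated
# moderation model or human review instead of a keyword blocklist.
BLOCKED_TERMS = {"example_banned_term"}

def review_output(text: str) -> str:
    if any(term in text.lower() for term in BLOCKED_TERMS):
        return "[output withheld pending manual review]"
    return text

print(review_output(output_text))  # reuse output_text from the usage example
```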
|
|
|
|
|
### Donation |
|
|
|
If you like it, please click 'like' and follow us for more updates. |
|
You can follow [x.com/support_huihui](https://x.com/support_huihui) to get the latest model information from huihui.ai. |
|
|
|
##### Your donation helps us continue our development and improvement; even a cup of coffee's worth makes a difference.
|
- Bitcoin (BTC):
|
``` |
|
bc1qqnkhuchxw0zqjh2ku3lu4hq45hc6gy84uk70ge |
|
``` |
|
|