---
license: gemma
library_name: transformers
pipeline_tag: text-generation
extra_gated_heading: Access Gemma on Hugging Face
extra_gated_prompt: >-
  To access Gemma on Hugging Face, you’re required to review and agree to
  Google’s usage license. To do this, please ensure you’re logged in to Hugging
  Face and click below. Requests are processed immediately.
extra_gated_button_content: Acknowledge license
base_model: google/gemma-3-27b-it
tags:
- transformers
- gemma3
- gemma
- google
- Bifröst
- Bifrost
- code
---

## Bifröst-27B

Bifröst-27B is an advanced AI model built on the Gemma 3 architecture and fine-tuned for secure, efficient, enterprise-grade code generation with reasoning. Designed to meet rigorous standards of safety, accuracy, and reliability, Bifröst helps organizations streamline software development workflows while prioritizing security and compliance.

### Model Details

- **Model Name:** Bifröst-27B
- **Base Architecture:** Gemma 3 (`google/gemma-3-27b-it`)
- **Application:** Enterprise Secure Code Generation
- **Release Date:** 16 March 2025

### Intended Use

Bifröst is designed explicitly for:

- Generating secure, efficient, and high-quality code.
- Supporting development tasks within regulated enterprise environments.
- Enhancing productivity by automating routine coding tasks without compromising security.

### Features

- **Security-Focused Training:** Specialized training regimen emphasizing secure coding practices, vulnerability reduction, and adherence to security standards.
- **Enterprise-Optimized Performance:** Tailored to support various programming languages and enterprise frameworks with robust, context-aware suggestions.
- **Compliance-Driven Design:** Incorporates features to aid in maintaining compliance with industry-specific standards (e.g., GDPR, HIPAA, SOC 2).

### Limitations

- Bifröst should be used under human supervision to ensure code correctness and security compliance.
- Model-generated code should undergo appropriate security and quality assurance checks before deployment; a minimal example of such a check is sketched below.
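
As a starting point, the sketch below shows one possible pre-deployment gate, assuming the generated snippet is Python: it verifies that the code parses and flags a few obviously risky calls. The risky-call list is a hypothetical illustration, not a security policy, and this does not replace a proper security review or the static-analysis and compliance tooling your organization already uses.

```python
# Illustrative only: a minimal pre-deployment gate for model-generated Python code.
# RISKY_CALLS is a hypothetical example list, not an exhaustive security policy.
import ast

RISKY_CALLS = {"eval", "exec", "os.system", "pickle.loads"}

def basic_checks(generated_code: str) -> list[str]:
    """Return a list of findings; an empty list means the basic checks passed."""
    try:
        tree = ast.parse(generated_code)
    except SyntaxError as err:
        return [f"syntax error: {err}"]

    findings = []
    for node in ast.walk(tree):
        if isinstance(node, ast.Call):
            # Reconstruct a dotted name like "os.system" where possible.
            func = node.func
            if isinstance(func, ast.Attribute) and isinstance(func.value, ast.Name):
                name = f"{func.value.id}.{func.attr}"
            elif isinstance(func, ast.Name):
                name = func.id
            else:
                continue
            if name in RISKY_CALLS:
                findings.append(f"line {node.lineno}: call to {name} needs review")
    return findings

print(basic_checks("import os\nos.system('rm -rf /tmp/cache')"))
```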
|
|
|
### Ethical Considerations

- Users are encouraged to perform regular audits and compliance checks on generated outputs.
- Enterprises should implement responsible AI practices to mitigate biases or unintended consequences.

### Usage

Below are some quick-start instructions for using the model with the `transformers` library.

#### Installation

```sh
$ pip install git+https://github.com/huggingface/[email protected]
```

#### Running with the `pipeline` API

```python
from transformers import pipeline
import torch

pipe = pipeline(
    "text-generation",
    model="OpenGenerativeAI/Bifrost-27B",
    device="cuda",
    torch_dtype=torch.bfloat16,
)

messages = [{"role": "user", "content": "Generate a secure API key management system."}]
output = pipe(messages, max_new_tokens=200)

# With chat-style input, "generated_text" holds the full conversation;
# the last entry is the model's reply.
print(output[0]["generated_text"][-1]["content"])
```
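
If you prefer finer control than the `pipeline` wrapper, the sketch below loads the same checkpoint with `AutoTokenizer` and `AutoModelForCausalLM` and applies the chat template manually. It assumes a CUDA device with enough memory for the 27B weights in bfloat16 and that `accelerate` is installed for `device_map="auto"`; the prompt and generation settings are illustrative, not recommended defaults.

```python
# Minimal lower-level sketch; adjust device placement and generation
# parameters to your environment.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "OpenGenerativeAI/Bifrost-27B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires `accelerate`
)

messages = [{"role": "user", "content": "Write a parameterized SQL query helper that avoids injection."}]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

with torch.inference_mode():
    outputs = model.generate(inputs, max_new_tokens=200)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```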
|
|
|
## Terms of Use

This model is released under the **Gemma license**. Users must comply with [Google's Gemma Terms of Use](https://ai.google.dev/gemma/terms), including restrictions on redistribution, modification, and commercial use.