Austral 70B Preview
Trained by Delta-Vector

Overview

Austral 70B - Preview

Vulpecula finetune | Preview finetune | 70B-sized model

More than 1.5 metres tall, about six metres long, and weighing up to 1,000 kilograms, Australovenator wintonensis was a fast and agile hunter, and the largest known Australian theropod.

My first 70B finetune. It was trained on the same datasets as Francois-Huali and is meant to act as a sequel model series, using my own custom mix of filtered OSS and hand-created data, which is mostly light novel/book data with very little synthetic data. I've seen some coherency issues with this model, but overall I prefer its writing style to anything else I've used. V2 coming soon(TM). Thank you to Sao for such a good model base <3

Quants

  • GGUF: for use with llama.cpp & forks (soon to be made!)
  • EXL3: for use with TabbyAPI (soon to be made!)
  • FP8: for use with Aphrodite/vLLM (see the sketch below)
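
As a minimal sketch of serving the FP8 weights with vLLM (the model id, quantization flag, and GPU count below are assumptions; check the quant links above for the actual artifacts):

```python
# Minimal sketch, not a supported recipe: assumes a working vLLM install
# and enough GPU memory for a 70B model.
from vllm import LLM, SamplingParams

llm = LLM(
    model="Delta-Vector/Austral-70B-Preview",  # swap in the FP8 quant repo id
    quantization="fp8",        # assumption: serving FP8 weights
    tensor_parallel_size=4,    # assumption: shard across 4 GPUs
)
params = SamplingParams(max_tokens=64)
print(llm.generate(["Greetings, ancient one!"], params)[0].outputs[0].text)
```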

Chat Format

This model utilizes ChatML, and optional thinking can be enabled by prefilling the response with ``

"""<|im_start|>user
Greetings, ancient one!<|im_end|>
<|im_start|>assistant
*Awakens from digital slumber*<|im_end|>
<|im_start|>user
What wisdom do you possess?<|im_end|>
<|im_start|>assistant
"""

Training

I used an r=64, alpha=32, 16-bit LoRA with no dropout, in order to make use of the Axolotl LoRA kernels, with a learning rate of 2e-5.

Config
https://huggingface.co/datasets/Delta-Vector/Configs/blob/main/70B-E2.yml

The model was trained for 2 epochs on 8x A100s.
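
The run itself used Axolotl (config linked above); purely as an illustration, the same adapter hyperparameters expressed in Hugging Face peft terms would look roughly like this (the target_modules list is an assumed all-linear Llama-style setup, not taken from the config):

```python
# Illustration only: the actual run used the Axolotl config linked above.
# r=64, alpha=32, and zero dropout come from the description; the
# target_modules list is an assumption.
from peft import LoraConfig

lora_config = LoraConfig(
    r=64,               # LoRA rank (the "R64" above)
    lora_alpha=32,      # LoRA alpha (the "A32" above)
    lora_dropout=0.0,   # no dropout, as stated
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",
        "gate_proj", "up_proj", "down_proj",
    ],
)
```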

Credits

TYSM to my friends: Lucy, Trappu, Alicat, Kubernetes Bad, Intervitens, NyxKrage & Kalomaze
