Upload README.md
README.md (CHANGED)

@@ -61,15 +61,8 @@ Multiple GPTQ parameter permutations are provided; see Provided Files below for
 ```
 
 <!-- prompt-template end -->
-<!-- licensing start -->
-## Licensing
 
-The creator of the source model has listed its license as `other`, and this quantization has therefore used that same license.
 
-As this model is based on Llama 2, it is also subject to the Meta Llama 2 license terms, and the license files for that are additionally included. It should therefore be considered as being claimed to be licensed under both licenses. I contacted Hugging Face for clarification on dual licensing but they do not yet have an official position. Should this change, or should Meta provide any feedback on this situation, I will update this section accordingly.
-
-In the meantime, any questions regarding licensing, and in particular how these two licenses might interact, should be directed to the original model repository: [Tim Dettmers' Guanaco 7B](https://huggingface.co/timdettmers/guanaco-7b).
-<!-- licensing end -->
 <!-- README_GPTQ.md-provided-files start -->
 ## Provided files and GPTQ parameters
 
@@ -257,56 +250,4 @@ And thank you again to a16z for their generous grant.
 
 # Original model card: Tim Dettmers' Guanaco 7B
 
-
-<div style="width: 100%;">
-    <img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
-</div>
-<div style="display: flex; justify-content: space-between; width: 100%;">
-    <div style="display: flex; flex-direction: column; align-items: flex-start;">
-        <p><a href="https://discord.gg/Jq4vkcDakD">Chat & support: my new Discord server</a></p>
-    </div>
-    <div style="display: flex; flex-direction: column; align-items: flex-end;">
-        <p><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
-    </div>
-</div>
-<!-- header end -->
-
-# Tim Dettmers' Guanaco 7B fp16 HF
-
-These files are fp16 HF model files for [Tim Dettmers' Guanaco 7B](https://huggingface.co/timdettmers/guanaco-7b).
-
-It is the result of merging the LoRA then saving in HF fp16 format.
-
-## Other repositories available
-
-* [4-bit GPTQ models for GPU inference](https://huggingface.co/TheBloke/guanaco-7B-GPTQ)
-* [4-bit, 5-bit and 8-bit GGML models for CPU(+GPU) inference](https://huggingface.co/TheBloke/guanaco-7B-GGML)
-* [Merged, unquantised fp16 model in HF format](https://huggingface.co/TheBloke/guanaco-7B-HF)
-
-<!-- footer start -->
-## Discord
-
-For further support, and discussions on these models and AI in general, join us at:
-
-[TheBloke AI's Discord server](https://discord.gg/Jq4vkcDakD)
-
-## Thanks, and how to contribute.
-
-Thanks to the [chirper.ai](https://chirper.ai) team!
-
-I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
-
-If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
-
-Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
-
-* Patreon: https://patreon.com/TheBlokeAI
-* Ko-Fi: https://ko-fi.com/TheBlokeAI
-
-**Patreon special mentions**: Aemon Algiz, Dmitriy Samsonov, Nathan LeClaire, Trenton Dambrowitz, Mano Prime, David Flickinger, vamX, Nikolai Manek, senxiiz, Khalefa Al-Ahmad, Illia Dulskyi, Jonathan Leane, Talal Aujan, V. Lukas, Joseph William Delisle, Pyrater, Oscar Rangel, Lone Striker, Luke Pendergrass, Eugene Pentland, Sebastain Graf, Johann-Peter Hartman.
-
-Thank you to all my generous patrons and donaters!
-<!-- footer end -->
-# Original model card
-
-Not provided by original model creator.
+No original model card was available.
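The model-card text removed above notes that the fp16 HF files were produced by merging the LoRA into the base model and then saving in HF fp16 format. Purely as an illustration of that general process (not necessarily the exact steps used for this repository), here is a minimal sketch using `peft` and `transformers`; the base-model id and output directory below are assumed placeholders, and only the `timdettmers/guanaco-7b` adapter id is taken from the text above.

```python
# Illustrative sketch only: merge a LoRA adapter into its base model and save fp16 weights.
# The base model id and output directory are assumptions (placeholders); only the adapter
# id comes from the removed model-card text. Requires: pip install torch transformers peft
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE_MODEL = "huggyllama/llama-7b"   # assumed placeholder for the Llama base model
ADAPTER = "timdettmers/guanaco-7b"   # Guanaco 7B LoRA adapter
OUTPUT_DIR = "guanaco-7B-HF-merged"  # arbitrary local output directory

# Load the base model in fp16, attach the LoRA adapter, and fold the adapter
# weights back into the base weights so the result is a plain HF checkpoint.
base = AutoModelForCausalLM.from_pretrained(BASE_MODEL, torch_dtype=torch.float16)
merged = PeftModel.from_pretrained(base, ADAPTER).merge_and_unload()

# Save the merged fp16 model and the tokenizer in standard Hugging Face format.
merged.save_pretrained(OUTPUT_DIR)
AutoTokenizer.from_pretrained(BASE_MODEL).save_pretrained(OUTPUT_DIR)
```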
