JustJaro committed
Commit 8ae4951 · verified · 1 Parent(s): da685d4

Fix README.md again

Files changed (1)
  1. README.md +6 -30
README.md CHANGED
@@ -338,43 +338,23 @@ uv sync
  ```
 
  </details>
- """
-
- script_section = f"""<details>
- <summary><strong>Quantization Script</strong></summary>
-
- Below is the exact `quantize.py` script used to generate this model:
-
- ```python
- {script_content}
- ```
-
- </details>
- """
-
- performance_section = f"""<details>
  <summary><strong>Quantization Performance</strong></summary>
 
  **Average perplexity (PPL) on wikitext-2-raw-v1 dataset:** 7.89 on on wikitext-2-raw-v1 dataset
+ <details>
 
  </details>
- """
-
- disclaimer_section = """<details>
  <summary><strong>Disclaimer</strong></summary>
 
- This model is for research purposes only. It may inherit limitations and biases from the original model and the quantization process. Please use responsibly and refer to the original model card for more details.
-
+ This model is for research purposes only. It may inherit limitations and biases from the original model and the quantization process. Please use responsibly and refer to the original model card for more details.
+ <details>
  </details>
- """
-
- contact_section = """<details>
  <summary><strong>Contact</strong></summary>
 
  For any questions or support, please visit [ConfidentialMind](https://www.confidentialmind.com) or contact us directly.
 
  [![LinkedIn](https://img.shields.io/badge/LinkedIn-ConfidentialMind-blue)](https://www.linkedin.com/company/confidentialmind/)
-
+ <details>
  </details>
  """
 
@@ -386,18 +366,14 @@ This model inherits the license from the original model. Please refer to the ori
  Original model card: `{source_model}`
 
  </details>
- """
-
- author_section = """<details>
  <summary><strong>Author</strong></summary>
 
  This model was quantized by [![LinkedIn](https://img.shields.io/badge/LinkedIn-Jaro-blue)](https://www.linkedin.com/in/jaroai/)
+ <details>
 
  </details>
- """
-
- ack_section = """<details>
  <summary><strong>Acknowledgements</strong></summary>
+ <details>
 
  Quantization performed using the GPTQModel pipeline and a big thanks to NeuralMagic for creating the calibration dataset, as well as the models original creators and/or fine-tuners.
 
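For context on what was removed: the deleted lines are fragments of a Python f-string template that had been pasted verbatim into the README, apparently from the model-card generator accompanying `quantize.py` (the section variables `script_section`, `performance_section`, `disclaimer_section`, `contact_section`, `author_section`, and `ack_section`, and the placeholders `{script_content}` and `{source_model}`, all appear in the diff). Below is a minimal sketch of how such a generator typically assembles the card; the `build_readme` function and its exact wording are assumptions for illustration, not the repository's actual script.

```python
# Minimal sketch of a model-card generator of the kind the leaked fragments
# suggest: each README section is built as a string, then the pieces are
# joined. The section variable names and the script_content / source_model
# placeholders come from the diff; build_readme itself is hypothetical.
def build_readme(source_model: str, script_content: str, ppl: float) -> str:
    fence = "`" * 3  # inner code fence, built dynamically so it nests cleanly

    script_section = (
        "<details>\n<summary><strong>Quantization Script</strong></summary>\n\n"
        "Below is the exact `quantize.py` script used to generate this model:\n\n"
        f"{fence}python\n{script_content}\n{fence}\n\n</details>\n"
    )
    performance_section = (
        "<details>\n<summary><strong>Quantization Performance</strong></summary>\n\n"
        f"**Average perplexity (PPL) on wikitext-2-raw-v1 dataset:** {ppl}\n\n</details>\n"
    )
    disclaimer_section = (
        "<details>\n<summary><strong>Disclaimer</strong></summary>\n\n"
        "This model is for research purposes only. It may inherit limitations and "
        "biases from the original model and the quantization process.\n\n</details>\n"
    )
    license_note = f"Original model card: `{source_model}`\n"

    # Only the rendered sections belong in README.md; the stray '"""' markers
    # and '*_section = f"""' assignments are generator code, not card content.
    return "\n".join([script_section, performance_section, disclaimer_section, license_note])
```

With the literal `*_section = f"""` assignments and closing `"""` markers stripped, only the rendered `<details>` blocks remain in README.md, which is what this commit is cleaning up.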