License incompatibility
Hi, I'd like to report a license conflict in the watt-ai/watt-tool-70B model. I noticed that this model was fine-tuned from meta-llama/Llama-3.3-70B-Instruct, but it is currently published under the Apache-2.0 license. After reviewing the Llama 3.3 Community License, especially the parts on output usage, legal compliance, and naming requirements, I believe the two licenses don't line up. That mismatch makes it confusing for people to know which rules to follow when they use or share the model.
⚠️ Key violations of the Llama 3.3 Community License:
Clause 1.b.i – Redistribution and Use:
• ⚠️ No LICENSE file included (should contain the Llama 3.3 Community License)
• ⚠️ "Built with LLaMA" is not prominently displayed
• ⚠️ Model name does not begin with “Llama”, which is required for any derivative
Clause 1.b.iii – Required Notice:
• ⚠️ Missing the following required text in a "NOTICE" file:
“Llama 3.3 is licensed under the Llama 3.3 Community License, Copyright © Meta Platforms, Inc. All Rights Reserved.”
Clause 1.b.iv – Acceptable Use Policy:
• ⚠️ No mention of Meta’s Acceptable Use Policy, which must be passed on to downstream users
Clause 2 – Additional Commercial Terms:
• ⚠️ No mention of the 700M MAU (monthly active users) threshold, above which a separate license from Meta is required, so commercial usage rights remain ambiguous
On the flip side, Apache-2.0 lets you:
• Use it commercially without asking for extra permission
• Sublicense and redistribute it under more flexible terms
• Redistribute it without passing along any non-permissive terms or use restrictions from upstream
This creates a conflict: the Llama 3.3 Community License requires its terms, including the use restrictions, to be passed through to anyone who receives the model, while Apache-2.0 allows sublicensing under more permissive terms and does not enforce any such restrictions.
So I'm thinking there might be a licensing conflict here that needs to be sorted out.
🔹 Suggestions:
1. To bring everything in line with the Llama 3.3 terms, you could adjust the repo's licensing setup, for example:
• Maybe include a copy of the Llama 3.3 Community License in the repo or model card
• Include this notice in a “NOTICE” file or the docs:
> “Llama 3.3 is licensed under the Llama 3.3 Community License, Copyright © Meta Platforms, Inc. All Rights Reserved.”
• If it makes sense, consider renaming the model so the name starts with “Llama”
• Displaying “Built with Llama” prominently in the model card would help too
• Maybe a quick note about usage restrictions, especially for folks using it in commercial settings
• A statement clarifying that use of the model must comply with Meta’s Acceptable Use Policy
2. Or, drop the Apache-2.0 tag and publish the model under the Llama 3.3 Community License instead. This would clear up any confusion about redistribution rights and downstream use. (A rough sketch of how either option could be applied is below.)
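For reference, here is a minimal sketch of how these changes could be made with the `huggingface_hub` Python library. It assumes write access to the repo and that the Hub accepts `llama3.3` as a license identifier (the tag used on the base model's card); it's only an illustration, not a drop-in fix.

```python
# Rough sketch (untested against the actual repo): add the required NOTICE file
# and switch the model card's license tag using huggingface_hub.
from huggingface_hub import HfApi, ModelCard

REPO_ID = "watt-ai/watt-tool-70B"

# Attribution text required by Clause 1.b.iii of the Llama 3.3 Community License.
NOTICE_TEXT = (
    "Llama 3.3 is licensed under the Llama 3.3 Community License, "
    "Copyright © Meta Platforms, Inc. All Rights Reserved.\n"
)

api = HfApi()

# Upload the NOTICE file to the model repo.
api.upload_file(
    path_or_fileobj=NOTICE_TEXT.encode("utf-8"),
    path_in_repo="NOTICE",
    repo_id=REPO_ID,
    repo_type="model",
    commit_message="Add NOTICE required by the Llama 3.3 Community License",
)

# Replace the apache-2.0 tag on the model card with the Llama 3.3 license tag.
card = ModelCard.load(REPO_ID)
card.data.license = "llama3.3"  # assumption: same identifier as the base model's card
card.push_to_hub(REPO_ID, commit_message="Relicense under the Llama 3.3 Community License")
```

Of course, editing the model card's `license:` metadata and adding the LICENSE/NOTICE files directly through the Hub UI works just as well; the script is only meant to make the concrete changes explicit.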
Hope this helps! 😊 Let me know if you have any questions or need more info.
Thanks for your attention!