License...? I feel like I'm becoming the HF bridge troll
Do not assume community licenses are friendly, folks!!! Especially not at a time when you cannot even trust things with "open" in their name.
Hunyuan... why are you releasing all of these models with such a predatory license? What is even the point of trading blows with Qwen-Image when Qwen-Image is... completely open? Like... if we can't use this to improve any models other than Hunyuan's, do you REALLY expect this to... be worthwhile? Why? Trying to sell API requests? I just do not understand.
I've just laid out a little of the license here in case anyone is curious:
"THIS LICENSE AGREEMENT DOES NOT APPLY IN THE EUROPEAN UNION, UNITED KINGDOM AND SOUTH KOREA AND IS EXPRESSLY LIMITED TO THE TERRITORY, AS DEFINED BELOW.
l. “Territory” shall mean the worldwide territory, excluding the territory of the European Union, United Kingdom and South Korea."
[...]
"Your use of the Tencent Hunyuan Works must comply with applicable laws and regulations (including trade compliance laws and regulations) and adhere to the Acceptable Use Policy for the Tencent Hunyuan Works, which is hereby incorporated by reference into this Agreement. You must include the use restrictions referenced in these Sections 5(a) and 5(b) as an enforceable provision in any agreement (e.g., license agreement, terms of use, etc.) governing the use and/or distribution of Tencent Hunyuan Works and You must provide notice to subsequent users to whom You distribute that Tencent Hunyuan Works are subject to the use restrictions in these Sections 5(a) and 5(b)."
[...]
"b. You must not use the Tencent Hunyuan Works or any Output or results of the Tencent Hunyuan Works to improve any other AI model (other than Tencent Hunyuan or Model Derivatives thereof)."
"c. You must not use, reproduce, modify, distribute, or display the Tencent Hunyuan Works, Output or results of the Tencent Hunyuan Works outside the Territory. Any such use outside the Territory is unlicensed and unauthorized under this Agreement."
And some of the other terms!:
You agree not to use Tencent Hunyuan or Model Derivatives:
Outside the Territory; ('Course.)
To harm Yourself or others; (Vague?)
To repurpose or distribute output from Tencent Hunyuan or any Model Derivatives to harm Yourself or others; (How are we going to be able to tell if random text or images online are outputs of a random model to even be able to try?!?!)
To generate or facilitate false online engagement, including fake reviews and other means of fake online engagement; (So... chatbots too, or...?)
To generate or disseminate information (including images, code, posts, articles), and place the information in any public context (including through the use of bot generated tweets), without expressly and conspicuously identifying that the information and/or content is machine generated; (No posting anything online ever without disclosing, no matter where you are or what you are doing, and with no instructions on how to do this correctly despite their specific wording. WOOF.)
To make high-stakes automated decisions in domains that affect an individual's safety, rights or wellbeing (e.g., law enforcement, migration, medicine/health, management of critical infrastructure, safety components of products, essential services, credit, employment, housing, education, social scoring, or insurance); (Don't use the model as part of a system that is doing anything important, for any reason. No level of risk is viable, and even if there is no risk, you may not use this responsibly in a workflow that DOES contain risk. Probably should just ban all the medical models we got from Qwen 3 too if it's that dangerous, huh? We're too dumb to have them? WOOF x2)
In a manner that violates or disrespects the social ethics and moral standards of other countries or regions; (Defined where? I'm not sure we're... keeping a list? Neighbors can't even seem to agree about this? Are we talking legally?)
To engage in the unauthorized or unlicensed practice of any profession including, but not limited to, financial, legal, medical/health, or other professional practices. (So... everything ever. Right.)
It does not take a lawyer to notice that this license is spooky. Other companies' licenses are similar, but this one is unusually unapologetic about saying "If you use this, or anything made with it, in any way, and we decide we want to destroy you? You're doomed."
Wolf in sheep's clothing. Literally a liability to use, even casually. AVOID.
Hunyuan... Please, change this. It's not a bad look to acknowledge you didn't realize, or that a lawyer gave weird advice, or even to just say nothing and quietly change it like most do. I frowned a little at the LLMs because I think they could do good in the world, but it was your translation model being closed that really bummed me out as a disability advocate. This image model on top is just too much.
The license does have some ambiguous wording, but calling it a "liability trap" is too extreme. The license doesn't need to list every possible situation. It is more of a disclaimer from the vendor, meant to avoid responsibility for inappropriate content that users might generate.
Hi Wenaka! I don't actually completely disagree with you. I wish I could fully agree it was too extreme.
This is a legally binding document, so saying they don't need to list every single possible situation is... baffling. It's not about opinion, it's about literal legal liability. The problem is that if they are vague about what is included, and you accept the license (i.e., use the model at all), they can retroactively use that "grey area" to cover things most people would not reasonably read into what they have written. This is because we are AGREEING to the terms. I would actually argue that most of these restrictions flat out mean that every generation the model ever produces, and the model itself, can be penalized for any reason whatsoever. That's not really a huge stretch, so I think they'd be well within their rights to take legal action over... literally almost anything I can think of. Unless you're in a bunker somewhere, air-gapped, but then how do you prove you aren't harming yourself or something? That's a weak argument in court and likely wouldn't ever happen (there are better things to nail them on in the same doc), but the fact that it's even possible at all... W H Y?
It is not about "common sense" or what they SHOULD have to do. It's not about what a reasonable person would read and think: huh, yeah, okay, I won't do that, I get what this entails.
Most people do not read licenses. Point blank. If this were a liability waiver, it would be a normal open source license, as those waive all liability. See MIT:
"THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."
No, this is "I have grounds to take legal action if I ever feel like it, prolly won't but... who knows! ;)".
I use AI for disability aid a lot, and I would not feel comfortable signing an end user up for that risk, as an example. Taking on the risk yourself is one thing, but the larger implications of releasing a model under these terms are the actual problem.