LLaMA 2 and Open Source

by VanL

Meta recently released the LLaMA 2 language model. In several places they said it was "open source." It's not. But it has a fairly permissive commercial license that is driving a lot of interest, including among OSPOs.

The LLaMA License

LLaMA has been big news because the technology is pretty good and the license is very liberal. It has three primary restrictions:

1. Companies that have fewer than 700 million monthly active users can use the model without paying Meta. Any company larger than that (e.g. Amazon, Google, ByteDance, Twitter/X) must get a commercial license.

2. You cannot use LLaMA outputs to train any non-LLaMA-based model.

3. All use must conform to Meta's use policy. The policy forbids certain types of outputs and malicious uses.

That's it. It opens the door for a lot of LLaMA-based startups (and LLaMA use by many established companies).

I think the LLaMA mania will be temporary, though, because of its license. The LLaMA license is still a commercial license and the terms will change. Right now you should think of LLaMA as a loss leader, and expect that the license terms will become more tilted toward Meta over time.

Even if the current license terms are kept as-is (which I doubt), the cap creates a trap. Consider a company below the cap today, say at 500M MAU. If LLaMA 2.1 is released in six months and the company has since grown past 700M MAU, its ability to use the new model abruptly and completely terminates.

In contrast, a true open source license provides longer-term guarantees that make it a more stable base for commercial development. As OSPOs know, many companies start using open source because it is free, just like LLaMA is free for many users today. But over the long term, independence from vendor control usually overtakes cost as the primary consideration. That is something the LLaMA license cannot provide.

Meta has a hit because LLaMA sits in a sweet spot of accessibility and license generosity. However, the field is still wide open, and a genuine open source release of comparable quality will see massive adoption.