I think the core of the fair use argument is that the AI models that are being trained are transformative products of the original works.
Might be a hot take here but I basically agree. I still believe it was theft, and the realities of our legal framework don’t really hold up against the evolving problems, but under current law there’s no real justification for saying that taking a bunch of images as input and producing a set of statistical correlations of pixels keyed to descriptions isn’t transformation.
I agree. I think all AI companies need to crash and burn. But it’d be disingenuous to claim that what these models are doing is completely different from what humans do. Humans don’t pull stuff out of thin air; we are products of our upbringing and schooling. I say that because I hate our current copyright laws with a burning passion and have done so since long before LLMs showed up. It’s possible to hate both copyright and AI companies.