Recent Posts

Pages: « 1 2 3 4 5 6 7 8 9 10 »
41
Bot Discussion Public / Re: SFWA’s Comments
« Last post by TimothyEllis on November 22, 2023, 11:28:17 PM »
Quote
The Fair Use defense lies in the fact that AI doesn't make copies of anything

Plagiarism lawsuits? How do you plagiarize if it's not stored somewhere to access?

Look at Quora. The answer bots generate the same answer over and over again, pretty well word for word. It has to be stored somewhere.

That claim doesn't hold water in reality.
42
Bot Discussion Public / Re: SFWA’s Comments
« Last post by PJ Post on November 22, 2023, 11:20:18 PM »
The Fair Use defense lies in the fact that AI doesn't make copies of anything - it's not competing on a one-to-one basis, which is why we have copyright in the first place: to protect IPs from counterfeits that might negatively affect the original's profitability or place in the market. This is why style and genre can't be copyrighted.

The reality is that copyright doesn't even really apply to this situation, at least not in its current form.

Also, copyright was never intended to prevent general competition; in fact, I think the intention was just the opposite.

___

Everyone thought functional AGI was so far down the road that only academic science types were even bothering with it. They were a bunch of well-paid nerdy folks, not ruthless hedge fund managers bent on world domination. So, yeah, I think their intentions are good.

The natural fallout from AI is up to the government to resolve. We're going to have a post-labor society, which will turn everything upside down. We can't blame these technological visionaries for Washington's inability to get out of its own way. For example, AI isn't the one letting our infrastructure crumble away before our very eyes. Ultimately, we have to take responsibility for this mess; we elected them.
43
Bot Discussion Public / Re: SFWA’s Comments
« Last post by Bill Hiatt on November 22, 2023, 04:23:30 AM »
There are undeniably some uses of AI that could be transformative. But churning out "creative" products after being trained on creative products seems curiously untransformative.

In this respect, cases in which transformative use was found are instructive. Google won a case that involved using book excerpts to beef up search results on related topics. The whole book is in the database, but only parts relevant to the search appear. No competing products are generated, and the purpose is clearly different.

Turnitin won a case involving incorporating all submitted essays into a database. The essays are used to check new submissions for possible plagiarism. (In my experience, 50% of high school cheating involves students copying from other students. Turnitin's essay database also makes it harder for students to plagiarize from print sources--because they never know whether or not someone else plagiarized from the same source.) Anyway, the full text of the essays cannot be released without permission from the teacher to whose class the essay was submitted, and then only to examine the specifics of possible plagiarism. No competing products were generated, and again, the purpose is clearly different.
44
Bot Discussion Public / Re: SFWA’s Comments
« Last post by Post-Crisis D on November 22, 2023, 04:04:11 AM »
As for "Fair Use," U.S. Copyright law considers the following:

1) Purpose and Nature of the Use
Educational and non-commercial uses are more likely to be considered "fair use" than commercial uses.  Transformative uses may also be considered fair use.  Commercial AI is likely to fail on the former.  As for the latter, is using the content of books to create more books really transformative?  Seems unlikely.

2) Nature of the Copyrighted Work
The use of creative works is less likely to be considered fair use than the use of factual works.  AI being "trained" on creative works to create simulated creative works seems likely to fail here as well.

3) Amount of the Copyrighted Work Used
Well, they "trained" AI using whole books so, again, fail.

4) Effect of the Use on the Potential Market or Value of the Copyrighted Work
Since AI is being used to create books that would compete with the original works it was "trained" on, it would seem likely to fail here as well.  Additionally, AI has the potential to flood the market which would increase competition for the original works as well as reduce their commercial value.  That would also tend to make it fail the fair use test.

All four of those factors are weighed.  Even if you want to argue that AI is transformative, it still fails under the other three factors.  Now, sometimes courts have argued that using an entire work can be fair use, so even if AI passes that, it still fails on the remaining two.
45
Bot Discussion Public / Re: SFWA’s Comments
« Last post by Post-Crisis D on November 22, 2023, 03:49:50 AM »
Quote
I think the intent of these AI companies is generally good - they're trying to create a post-scarcity semi-utopian society, like Star Trek. Oh, and make boatloads of money in the interim.

If I remember right, didn't OpenAI start as an open source non-profit?  And then they made some advances and were like, screw that, let's make this a for-profit corporation!  Cha-Ching!  Cha-Ching!  Cha-Ching!  So, I have difficulty believing in any imagined altruism on their part.


Quote
You seriously believe this? How many decades is it going to take before we get that awesome fake Star Trek future? How many will suffer? We can't even reliably give people welfare, or medical care, or food now. And that's with jobs humans can do. Take away those jobs, and what do you really think is going to happen?

Yeah. Some of us may have jobs after this "AI" crap takes over. Very few, and it won't pay much. Thinking bespoke shoemakers are a relevant comparison. I can't even.

And people without jobs or hope for any kind of advancement in their life condition have a tendency to find things to do that may not bode well for society.


Quote
What most people forget is that the Star Trek future followed a third world war that left hundreds of millions dead.

And even then it didn't happen without a first contact situation first.

And being able to generate countless books and movies is not the same as having replicators that can provide basic needs such as food, water, shelter, tools, etc.
46
Bot Discussion Public / Re: SFWA’s Comments
« Last post by TimothyEllis on November 22, 2023, 02:09:42 AM »
Quote
I think the intent of these AI companies is generally good - they're trying to create a post-scarcity semi-utopian society, like Star Trek.

You seriously believe this? How many decades is it going to take before we get that awesome fake Star Trek future?

What most people forget is that the Star Trek future followed a third world war that left hundreds of millions dead.

And even then it didn't happen without a first contact situation first.

 :tap
47
Bot Discussion Public / Re: SFWA’s Comments
« Last post by She-la-te-da on November 22, 2023, 01:38:50 AM »
Quote
I think the intent of these AI companies is generally good - they're trying to create a post-scarcity semi-utopian society, like Star Trek.

You seriously believe this? How many decades is it going to take before we get that awesome fake Star Trek future? How many will suffer? We can't even reliably give people welfare, or medical care, or food now. And that's with jobs humans can do. Take away those jobs, and what do you really think is going to happen?

Yeah. Some of us may have jobs after this "AI" crap takes over. Very few, and it won't pay much. Thinking bespoke shoemakers are a relevant comparison. I can't even.
48
Bot Discussion Public / Re: SFWA’s Comments
« Last post by Bill Hiatt on November 22, 2023, 01:01:01 AM »
I'd like to believe in the semi-utopian society, but as I've said, I have a hard time seeing how that's the natural fruit of corporate greed.
49
Bot Discussion Public / Re: SFWA’s Comments
« Last post by PJ Post on November 21, 2023, 11:18:33 PM »
I think the intent of these AI companies is generally good - they're trying to create a post-scarcity semi-utopian society, like Star Trek. Oh, and make boatloads of money in the interim.

Personally, I'm emotionally conflicted on it being Fair Use, but I can make the argument for it fairly easily. It just cuts against the grain, you know?

In the end, I believe they will succeed, and this post-scarcity world will become possible, but what I'm less confident about is if the powers that be will allow it.
50
Bot Discussion Public / Re: SFWA’s Comments
« Last post by Bill Hiatt on November 21, 2023, 12:53:43 AM »
Quote
I remember the anecdotes, but not specific data, including the prompts that led to the infringing text. AI doesn't "know" the text to be copied (it doesn't store anything), it only knows the likelihood of the next word in the series or the next pixel. The Getty logo in AI images only proved that Getty owns a lot of soccer images. These artifact abstractions show up in a lot of AI images. One weird one was rulers showing up next to moles. This is because most images of moles are medical, and the datasets often show a ruler to document the change in size. That's a learned association.
I've said this before, but I don't really care exactly how AI works. Somehow, it is trained on copyrighted material with the intent of ultimately producing competing products. Yeah, courts might find that fair use, but that's by no means a done deal. I'd argue that's a ridiculously broad definition of fair use. The fact that many companies moved of their own accord toward consent and/or compensation models suggests at the very least that the business world isn't certain about the fair use issue.

And yes, AI doesn't know the text to be copied. That's precisely part of the problem. It also doesn't know the difference between fact and fiction, which is why it makes up fake legal cases in legal briefs and fake bios (including statements that could be considered defamation). It's the intent of the developers that matters here, because AI isn't capable of forming an intent.
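[Editor's note: the "likelihood of the next word" mechanism described in the quoted post can be sketched with a toy bigram model. This is an illustrative simplification using a made-up corpus, not how any specific commercial product works; it shows both sides of the argument above: no sentence is stored verbatim, yet the statistics can still reproduce training phrases word for word.]

```python
from collections import Counter, defaultdict

# Toy corpus standing in for training data (illustrative only).
corpus = "the cat sat on the mat and the cat slept".split()

# Count how often each word follows each other word (bigram counts).
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word_probs(prev):
    """Conditional probability of each possible next word."""
    total = sum(counts[prev].values())
    return {w: c / total for w, c in counts[prev].items()}

# After "the", the model assigns P(cat) = 2/3 and P(mat) = 1/3.
# Nothing is stored verbatim, only these statistics -- yet a greedy
# decoder picking the most likely word at each step can still emit
# training phrases ("the cat ...") word for word.
print(next_word_probs("the"))
```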