Comment by jowea
6 days ago
The big distinction is that the cheaper AIs will crowd humans out of the market, so mass-market commercial art will be made by AIs wherever it is possible to produce that art. But some people will still want non-AI art, which I believe will be concentrated in less commercially focused art sectors.
> Music, for example, is an incredibly commercialized art. Replacing every song or album I have ever purchased with AI generated facsimiles is also an incredibly depressing thought.
And just to be clear, I'm not saying you're wrong.
> I would hope people still find value in painting, especially in a world with photography.
Sure, people do, but painting is now a hobby for some and high art for a smaller number of professional painters; the market willing to sustain a large number of professionals painting portraits is gone.
> That is even ignoring the strained nature of this analogy. The context of the original quote was in a discussion of the inherent plagiarism of AI. Photography wasn't invented by stealing painters' work.
I think the analogy is relevant because I am discussing the plagiarism of AI in relation to the economic aspects of copyright infringement and the impact on the market for artists and SW devs, not in relation to the moral rights[1] of authors. The issue of artists being annoyed on principle, rather than over economic effects, that some soulless computer is producing plagiarist art that imitates their art style without attribution is a separate but closely related issue. I'm not sure, but I think the article is more concerned with the former issue.
> I think the analogy is relevant because I am discussing the plagiarism of AI in relation to the economic aspects of copyright infringement and the impact on the market for artists and SW devs, not in relation to the moral rights[1] of authors. The issue of artists being annoyed on principle, rather than over economic effects, that some soulless computer is producing plagiarist art that imitates their art style without attribution is a separate but closely related issue. I'm not sure, but I think the article is more concerned with the former issue.
How can you justify separating the two concerns? This article is a defense of AI against its critics. It is a pretty poor defense if the argument is along the lines of "certain ethical concerns don't count". The author being "more concerned with" one issue doesn't make the other issue invalid or irrelevant.