news
Welcome to c/news! We aim to foster a book-club type environment for discussion and critical analysis of the news. Our policy objectives are:
- To learn about and discuss meaningful news, analysis and perspectives from around the world, with a focus on news outside the Anglosphere and beyond what is normally seen in corporate media (e.g. anti-imperialist, anti-Zionist, Marxist, Indigenous, LGBTQ, people of colour).
- To encourage community members to contribute commentary and for others to thoughtfully engage with this material.
- To support healthy and good faith discussion as comrades, sharpening our analytical skills and helping one another better understand geopolitics.
We ask community members to appreciate the uncertainty inherent in critical analysis of current events, to recognize the need to constantly learn, and to take part in the community with humility. None of us are the One True Leftist, not even you, the reader.
Newscomm and Newsmega Rules:
The Hexbear Code of Conduct and Terms of Service apply here.
- Link titles: Please use informative link titles. Overly editorialized titles, particularly if they link to opinion pieces, may get your post removed.
- Content warnings: Posts on the newscomm and top-level replies on the newsmega should use content warnings appropriately. Please be thoughtful about wording and triggers when describing awful things in post titles.
- Fake news: No fake news posts ever, including April 1st. Deliberate fake news posting is a bannable offense. If you mistakenly post fake news, the mod team may ask you to delete/modify the post, or we may delete it ourselves.
- Link sources: All posts must include a link to their source. Screenshots are fine IF you include the link in the post body. If you are citing a Twitter post as news, please include an Xcancel.com (or another Nitter instance) link, or at least strip the identifier information from the Twitter link. There is also a Firefox extension, such as Libredirect, that can redirect Twitter links to a Nitter instance; alternatively, archive them as you would any other reactionary source.
- Archive sites: We highly encourage use of non-paywalled archive sites (e.g. archive.is, web.archive.org, ghostarchive.org) so that links are widely accessible to the community and so that reactionary sources don’t derive data/ad revenue from Hexbear users. If you see a link without an archive link, please archive it yourself and add it to the thread, ask the OP to fix it, or report it to the mods. Including the text of articles in threads is welcome.
- Low effort material: Avoid memes/jokes/shitposts in newscomm posts and top-level replies to the newsmega. This kind of content is OK in post replies and in newsmega sub-threads. We encourage the community to balance their contribution of low effort material with effort posts, links to real news/analysis, and meaningful engagement with material posted in the community.
- American politics: Discussion and effort posts on the (potential) material impacts of American electoral politics are welcome, but the never-ending circus of American Politics© Brought to You by Mountain Dew™ is not welcome. This refers to polling, pundit reactions, electoral horse races, rumors of who might run, etc.
- Electoralism: Please try to avoid struggle sessions about the value of voting/taking part in the electoral system in the West. c/electoralism is right over there.
- AI Slop: Don't post AI generated content. Posts about AI race/chip wars/data centers are fine.
Why? Because I made it on a computer? Or because the code that the computer used was very complex? Or because some of the code uses data that is freely available on the internet?
Because you didn't write the code for the algorithm, you didn't make any of the training data pictures, and you didn't do anything that could be considered 'creative' or 'talented' to make it. Real fucking artists that put hours of time, effort, and creativity into their work deserve to have it protected. Plugging "looking at a sunset from a mountain" or some shit into stable diffusion doesn't make you entitled to the shit it puts out. Terrible take.
Rubbish. You're just assuming the user put in little effort. It's perfectly possible to put in little effort using pen and paper too. The end result looks less like a final piece, but it's probably equally close to what the artist tried to express. No one who uses downloaded brushes in Photoshop wrote the code for importing and drawing with those brushes. Nobody who uses photo textures wrote the code for their cameras. Nobody who uses Blender wrote the code for the light transport that happens when you hit render.
Drawing a style guide, drawing the composition with a sketch, and paint overs are all completely normal parts of the process when using Stable Diffusion, and none of that is where the creativity comes in.
you're right, that was a bad argument
the problem is that the AI trains off of the data of unwilling artists without credit.
Did you invent the paint brush?
Working hard does not have any intrinsic moral value. That is puritanical. There is no value in suffering.
You are right. I'm sorry. But the issue still stands that the programs that create the art use other artists' work for their own profit with no credit. These people are having their work just stolen from them.
They are having their art used in a way they didn't expect. That is a problem of capitalism, not a problem of tech.
agreed, but that doesn't make it any more ethical to partake in it.
Is it any less ethical than producing art when your art supplies are tainted by exploitation? When you are living on land stolen through genocide? When your way of life is built on the subjugation of the Global South?
The fact is there is effort and creative input involved in making AI art, no matter how minuscule that effort is. This ruling protects that effort and creative input from being used for profit by anyone who pleases. It isn't protecting AI tech; it's protecting producers from exploitation, and that is all.
I should also clarify that I am not defending IP; the opposite, in fact. I am saying that someone who makes an AI image isn't entitled to IP on that image.
Because it is a composite of the art other people made
So is collage. Using other art in art is very common. Every song that samples another song isn't art?
A majority of the data that these models train off of is not even "art"; they are images. They lack the context and emotive qualities that differentiate art from information.
If the collage is literally just using the constituent elements the same way they were originally used, then yes, that is textbook plagiarism, and I already explicitly made this comparison.
Sampling would, by convention, be considered plagiarism, which is why "sampling culture" is a thing: it exists within a different but also defined set of norms around what is or is not acceptable, and it has its own ongoing controversies that I would suggest not flattening into "the hip-hop people say plagiarism isn't real", which is what your non-argument amounts to.
But the AI isn't using the constituent elements in the same way they were originally used. They are being compared and merged with thousands of other versions of that element to make a new one.
The original use is "painting of a car"; the new use is "painting of a car". It's using thousands of references in a composite, but the material is by definition not being used transformatively, because that is the opposite of what the program is trying to accomplish with its data (i.e. matching visual patterns with descriptions).