Comment by simonw
13 hours ago
Just noticed this notice added at the top of the Blender announcement of their funding from Anthropic: https://www.blender.org/press/anthropic-joins-the-blender-de...
> Notice: This announcement is causing a lot of feedback. We are actively evaluating it.
Presumably a lot of Blender users work in roles that feel threatened by AI being used for computer graphics work.
Lots of negative replies on Blursky here: https://bsky.app/profile/blender.org/post/3mkkuyq3ijs2q
I don't really get the backlash at Blender here. This isn't generative art; it's basically a natural-language means of scripting Blender.
This feels like the proper way to have AI act as a tool that makes artists' jobs easier without taking away their creativity?
Edit: I guess they might want absolutely no AI of any sort in their tools (which seems like a strange line to draw), or is it about the data it's been trained on?
It's really clear that businesses are hoping to replace people with AI. In an industry that is already very difficult to make a stable living in, and troubled with regular plagiarism, is it really that surprising that any encroachment of AI into that space would be met with backlash?
Even if you can see how individual circumstances could be beneficial to your workflow, it's a general direction I think many take issue with quite fairly.
> It's really clear that businesses are hoping to replace people with AI. In an industry that is already very difficult to make a stable living in, and troubled with regular plagiarism, is it really that surprising that any encroachment of AI into that space would be met with backlash?
But what's the plan, then? Prevent any third party from downloading Blender and integrate it in any way with an agent?
Businesses have already replaced several background artists, gambling that the uncopyrightable status of "AI" output will be ignored. In a commercial setting, one can't sell what they never owned in the first place.
Without a constant stream of stolen training data, the "AI" piracy bleed-through and isomorphic plagiarism business model is unsustainable.
We look forward to liquidating the GPU data-centers at a heavy discount. =3
Regardless of the purported upside, many people in the arts feel betrayed by the commercial interests that built this technology on their work without their consent and threatened by the explicit intent of these vendors to devalue their work by saturating the art and design market with cheap automated substitution.
A lot of artists who would love to be able to direct their professional software in natural language have to reconcile that with how this technology came to be and what the aims are of the company now delivering it to them.
I spent most of my career in the open source world, and it doesn't bother me that models are trained on my output. Should I feel differently? It seems there's a kind of ego or emotional attachment to the output that is more common among artists than devs. Perhaps abundance vs. scarcity mindsets?
Speaking as someone who works in the industry, I haven't really heard this sentiment. Artists are predominantly hostile to diffusion models, but optimistic about LLMs and their ability to help them write tools and scripts even if they're non-technical.
Yeah, I can understand being upset with their work being stolen to train these models. Anthropic doesn't seem to be working on image/video generation, but they are still training on text-based creative works of questionable sourcing.
Makes me think that there's some room in the model lineup for one that doesn't do as well on benchmarks, but is trained on "ethically sourced" data (though they'd need to somehow prove that they aren't "accidentally" including other data).
The funny thing is that Allegorithmic (now part of Adobe) was far more devastating to certain classes of game artist than stuff like this will be in its current form.
It almost totally automated vast swaths of texture generation by creating algorithmic systems that technical artists could use to create textures.
Want a brick texture? Sure, you connect some nodes and set parameters and you have great looking bricks. Want the mortar to be a little more widely spaced? Done. Want some moss on the brick? Want some chipping on the brick? Want some color variation? Done, done, done.
It probably reduced the amount of time to iterate textures by more than 100x.
Now, talented technical artists make OK money because they are good at using these tools. Photoshop jockeys are gone.
LLM manipulation of Blender will be interesting, but it's hard to see how something like Claude could have nearly as big an impact. It'll be helpful for automating some common tasks and building internal tooling. But Allegorithmic single-handedly changed the way 3D games look, because you could be so much more ambitious.
You didn't really hear about it, though, because it wasn't part of the cultural zeitgeist.
People who built a career on their mastery of Blender are going to lose their livelihoods. Why is this difficult to understand?
> Why is this difficult to understand?
It'll be way easier to understand for developers when it starts happening in earnest to our profession, which is coming soon.
It's already happening to some extent, but so far mostly at the junior end, so it hasn't hit many people who are established in an industry that has provided relatively easy, stable livelihoods for the past 30+ years. That won't last.
I think it's mainly anti-AI sentiment in general.
I am a huge beneficiary of agentic dev tools. They completely changed my life and my income. However, I totally get the general anti-AI sentiment. The ultra-bear case is that it somehow kills all of us; the bull case is that those who own the inference get all the spoils.
Even myself, while I am currently extremely empowered by these tools... I could see my role (Founder/PM/builder) disappearing in the next couple years.
I respect you a lot, so if you have a moment, I would really like to get talked down from my take.
People are guzzling the amygdala control juice these days
Say that again in five years, when you can't find a job except mega-yacht toilet cleaner because Claude is distinguished-engineer level at one millionth of your cost and thousands of times faster, and can be instantly parallelized into tens or hundreds of thousands of instances, spun down arbitrarily as needed at any time.
It's not artist replacement yet, because the models don't have the necessary training or sophistication.
I doubt the current state shows the end of their ambitions.
There is no acceptable use of AI for most people in the artistic field. They see it as a deep betrayal, and I understand. They're under incredible threat.
They are conscious of preventing momentum in a bad direction.
If they don't fight it hyper hard, a huge fraction of them will be out of a job instantly.
That's a strange position to take. I can understand not wanting models that have been trained on questionably sourced data, but otherwise they're opposing essentially a UX change, not based on UX concerns but on ideological fears.
Given how much software and other AI/computer vision improvements 3D content often relies on, it's weird to decide that the algorithm itself is unallowable.
This is the best phrasing of the issue I've seen online anywhere.
You can find AI useful and still be against its introduction into your field for entirely understandable reasons.
Unfortunately this does create uphill friction for any good-intentioned people trying to use AI to improve art by empowering people to take on more ambitious projects. (This is a general statement and not related to the case of Anthropic. Of course Anthropic here is just trying to sell their product, which is a fair thing to do in isolation, but I also understand the opposition to it on the grounds of its downstream effects.)
Completely false and I hate this puritan gatekeeping. Artists who hate AI are the type to put more importance on the craft than the end product itself. Art is a means of communicating something personal. It’s not meant to show off skills in how well you can move a pencil or how many fricking tools you know in adobe.
AI removes all these hurdles and directly presents you with the end problem - communication. Artists hate that because most artists don’t have anything to communicate. These people deserve to be automated away. I don’t wanna see more derivative shit. Artists who have something special to communicate won’t feel threatened by AI but feel more freedom.
> Lots of negative replies on Blursky
To the surprise of no one.
I really do want to support artists, but I also feel super conflicted about what is actually at stake here if an AI agent generates a scene for me. I never would have hired a 3D artist before this moment, because there was no reason for me to. However, if I can easily poop out a custom 3D rendering without much time or cost, I would absolutely love to do that. Think how many one-off presentations or project design sessions I could run with cheap throwaway 3D artwork that helps explain my thought process!
Just like AI image slop and AI book slop prove though, I highly doubt whatever Claude and Blender are cooking up will ever come close to taking a prompt like
> render a scene of a corgi sitting on a chair looking out of a window at 3 cats playing with the corgi's favorite toy.
and turning that into anything useful.
Bluesky also has a community of AI tool developers that are more sane. Occasionally a post escapes containment.
People on Mastodon are losing their shit too[1].
I understand being unhappy about something but people gotta relax.
---
[1]: https://social.coop/@netopwibby/116483037092383210
Why do they "gotta relax"? Are they making you uncomfortable by voicing their opinions or why exactly?