Following musicians and bands that perform live would be my choice. If they write their music with AI and I still like it, then that's ok by me. Obviously this doesn't scale if you are a platform operator, but that's not my situation.
Music can be performed live and in person. Many musicians work with other musicians, labels, studios, etc., so a web of trust can be built, and verification via performance is a compelling option. Not complete, but it's certainly an option.
Would you, as a label, sign an artist you'd never seen perform? Maybe there is value in a platform working under similar constraints.
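To make the web-of-trust idea concrete, here's a minimal sketch: treat "seen performing live" as direct verification, and let verified parties vouch for collaborators, so an artist counts as verified if they're within a few vouches of someone verified directly. All names and the hop limit are illustrative assumptions, not a description of any real platform.

```python
# Hypothetical web-of-trust sketch: an artist is "verified" if reachable
# within max_hops vouches of someone who has been seen performing live.
from collections import deque

def is_verified(artist, vouches, seen_live, max_hops=2):
    """Breadth-first search over the vouch graph from directly-verified nodes."""
    if artist in seen_live:
        return True
    frontier = deque((a, 0) for a in seen_live)
    visited = set(seen_live)
    while frontier:
        current, hops = frontier.popleft()
        if hops == max_hops:
            continue  # don't trust chains longer than max_hops
        for vouched in vouches.get(current, ()):
            if vouched == artist:
                return True
            if vouched not in visited:
                visited.add(vouched)
                frontier.append((vouched, hops + 1))
    return False

# Illustrative data: a label that performs showcases vouches for artist_x,
# who in turn vouches for a collaborator, artist_y.
vouches = {"label_a": ["artist_x"], "artist_x": ["artist_y"]}
seen_live = {"label_a"}
print(is_verified("artist_y", vouches, seen_live))  # True: two hops from label_a
print(is_verified("artist_z", vouches, seen_live))  # False: nobody vouches for them
```

The hop limit is the key design knob: a short chain keeps the trust meaningful, while a long one degrades toward no verification at all.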
I don't grant it. if you mean it is impossible to tell from the music itself, perhaps. but there are other means of verification.
A human can still upload tons of AI-generated music, though?
I don’t see how verifying that the author is a human helps in any way.
I also don’t think it’s a big problem but that’s another discussion
sounds like you don't really care about this honestly, so i'll reflect your apathy
What are some reasonable examples of how you could verify a song wasn't AI-generated? E.g., game speedrunners film the whole process to prove they did it themselves.
Presumably you had some ideas when you envisioned "human-verified platforms".