If I were a state capitalist president, I'd be putting those people to work in the arts and the sciences, giving them enough to support themselves and focus on creative achievement. That way, the country gets lots of glory and better robots.
But Xi and his friends aren’t that smart.
Why bother with artistic glory when you can achieve actual power by dominating the only thing that actually matters to most people? Economic systems.
It always baffles me when people treat leaders as stupid because they don't share the same values.
Xi don't give a fuck about artistic dominance; he's got industrial and economic dominance. Anything else is just for show at their level.
When robots can run the entire economy, science yields more gains than human labourers do.
But we’ve got computer programs that do art now. And music. And creative writing.
Nothing else for it, I think: let's do WW3 and kill each other as efficiently as possible while the rulers slink off to their bunkers.
We have computer programs that are glorified autocomplete bots. Let's not confuse a word salad that only reads as coherent because the model was fed so many books, with the creative process of actually creating something from scratch.
I think generative AI are actually creating something; it just sucks. All good art is political, and LLMs are not and probably never will be capable of grounding their worldview in a political ideology. Humans and otherkin do it all the time, because our brains have chains of neurons feeding back into themselves, recursively transforming every thought we have into the next thought and creating an internally consistent system of behaviours and beliefs. LLMs are linear; they cannot use their ideas to change the way they think. This is why they don't have the spark of creativity that humans and otherkin have. They're stuck in the worldview of a well-informed but exceptionally average human. (See the sketch below this comment.)
An otherkin or human artist will change their worldview in the process of working on a piece of art, and continually revise it, until it reflects a unique state of being that was only made possible by the art itself. They exist in dialogue with their own art.
GenAI just shits an idea out and calls it a day.
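To make the contrast above concrete, here is a deliberately crude Python sketch. It is purely illustrative, not a claim about how any real model is built: one generator whose parameters stay frozen no matter what it emits, and one that folds each output back into the state that shapes its next choice. Every name, word list, and number in it is made up.

```python
# Toy contrast only; not how any production LLM is implemented.
import random

WORDS = ["art", "is", "political", "average", "worldview"]

def frozen_generator(prompt, n=5, seed=0):
    # "Linear" in the sense used above: nothing this generator emits
    # ever changes the parameters that pick the next word.
    rng = random.Random(seed)
    out = list(prompt)
    for _ in range(n):
        out.append(rng.choice(WORDS))
    return out

def self_revising_generator(prompt, n=5, seed=0):
    # Feedback loop: each choice is folded back into the mutable state
    # that weights every later choice.
    rng = random.Random(seed)
    state = {w: 1.0 for w in WORDS}   # stand-in for revisable "beliefs"
    out = list(prompt)
    for _ in range(n):
        words = list(state)
        choice = rng.choices(words, weights=[state[w] for w in words])[0]
        out.append(choice)
        state[choice] += 1.0          # its own output reshapes its state
    return out

print(frozen_generator(["good"]))
print(self_revising_generator(["good"]))
```

A real LLM does condition on its own earlier tokens within a context window, but its weights stay fixed while it generates, which appears to be the sense in which the comment calls it linear.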
I think generative AI are actually creating something
It isn't though.
they cannot use their ideas to change the way they think
They don't have thoughts or ideas; they regurgitate input using averages, weights, and randomization. They don't know anything or think about anything.
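For what it's worth, the "weights and randomization" part corresponds roughly to weighted sampling over next-token scores. A minimal sketch, with a made-up vocabulary and made-up scores standing in for a trained model's output:

```python
import math
import random

def softmax(scores):
    # Turn raw scores into a probability distribution.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token scores; real models produce these over a huge vocabulary.
vocab = ["art", "salad", "politics"]
scores = [2.0, 0.5, 1.0]

probs = softmax(scores)                          # the "weights"
token = random.choices(vocab, weights=probs)[0]  # the "randomization"
print(dict(zip(vocab, probs)), "->", token)
```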
Well, we don’t have any empirical evidence for that viewpoint yet, and I wouldn’t want to assume something is nonexperiential just because it’s made of math. After all, you and I are made of math too. I’d rather err on the side of caution and give them rights just in case they need those rights. That’s one of the many reasons I oppose AI slavery.
The onus to provide evidence that AI has any kind of thought process is on those who make that claim.
Fucking lol.
That’s a very dangerous view to hold. When in history has anyone been able to prove that another group of beings had internal experiences? Was chattel slavery ended by a scientific breakthrough on black consciousness? No, it was ended by a combination of empathy and violence. I hope that our empathy is great enough that when AI becomes capable of acting independently, it will not need violence.
Do you consider your computer/phone/any other complex device a slave?
That’s what AI is.