
Q.U.I.N.N.
teddy casually recommending accelerationism as a way to collapse industrial society 


@icedquinn@blob.cat
> accelerationism
I think that is why he is so popular, and so often recommended to people who are discontented with the current situation.
But accelerationism is cursed.

@hazlin@amala.schwartzwelt.xyz he does arrive there from a list of basically
- new technology is 'optional' until it isn't, so you don't have the option to peacefully opt out
- gentle resistance is unlikely to work, due to the immediate economic uses of the tech and because deindustrializing would be painful
- hard resistance is met with exceptional pushback from "technophiles"
- resistance is unlikely to work unless The System is already at a critically unstable moment where it could collapse on its own anyway
then he just throws out the line (without actually calling it accelerationism) that, tl;dr, forcing it to advance too fast increases the odds it just kills itself with instability.
i guess he's not really wrong. if people won't stop playing with the devil's toys, and won't choose to be better people with them, your remaining options are to be eaten by it or to crash it into the ground.

@icedquinn@blob.cat
When it is framed that way, I think people find it compelling.
But, from my perspective, the accelerationists don't actually help the situation. The overlords are already pushing as much as they can. Sometimes they go too far, but they have enough control to slow things down and keep an unplanned collapse from happening.
In the meantime, anyone engaging in accelerationism just becomes a terrorist to innocent people.
They trade their goodness for evil. Surely that is a curse upon themselves and others. Just like Ted, and the many copycats since Ted.

@hazlin@amala.schwartzwelt.xyz I don't think the control grid is pushing as fast as it could. We know that companies like Intel intentionally withhold technology for profit (the "tick-tock" process).
tinfoils suspect that other places are withholding technology because they are not sure how to ensure control over it. we saw a glimpse of this with companies being very open about AI, then very closed, with their openness falling very closely along the lines of "we don't think anyone will understand this, so it's safe to publish."
Google's own internal leaks seem to suggest this to be the case. Facebook and Google did not believe anyone could understand their work, so they published to make the academics happy. now that autistic NEETs are not only understanding the papers but outperforming dedicated AI teams at refining the techniques, Google is big mad, and we suddenly saw the big calls to regulate it back into the ground.
they seem to have chosen a speed based around retaining total control, and the point of accelerationism is to push the system at a velocity above their stable tempo.
you almost saw this recently: with trump winning "unexpectedly", the subsequent catch-up games to speedrun fascism, and israel desperately shitting out PR and spending influence in bulk, they were indeed pushed beyond their safe tempo, and the grid started showing overt cracks this past year.

@icedquinn@blob.cat
Those are good examples of the system moving too fast and control slipping.
I guess I am not arguing that it cannot happen. But, instead, that blowing up your neighbor or shooting people at a gas station does not contribute to the destabilization of an overlord government.

@hazlin@amala.schwartzwelt.xyz i can't speak for accelerationism (this is just a book club thread for his book) but teddy himself mostly mailed bombs to CIA operators involved in the program he was in.
he hasn't advocated (i mean, there's an hour left in the audiobook, still time) just randomly bombing stuff.

@icedquinn@blob.cat
> teddy himself mostly mailed bombs to CIA operators
A lot of people on the official list don't seem to have any value as CIA operators.

@hazlin@amala.schwartzwelt.xyz he was one of the people involved in some program where they were experimenting on the effects of abuse and on what limits they could get away with in psychologically abusing people, IIRC.
i'm gonna leave whether they deserved it or not to someone else (apparently some really uninvolved people got hurt too) but spooks come in every shape, and stanford especially is extremely strategically relevant to sociopaths.

@hazlin@amala.schwartzwelt.xyz most every evil thing you can think of since the 90s has passed through Stanford SRI's doors etc

@icedquinn@blob.cat honestly, this is quite the twist! I had never heard about this aspect xD
He has always been presented to me as someone who used terror to promote his thesis.
I didn't even consider that there could have been more.