2026-02-20 11:09:00
AI can't write like me. It can't write like you. It can't write like any human, really. The only thing it is good at is bullshit PR speak, which is meant for companies that are not human. This is its purpose.
AI can write legibly. But it won't go off on a random tangent halfway through a blog post (certainly not with parentheses around it) and then reel itself back in.
AI isn't funny. It has all the charisma of a caricature of a Millennial written by an aging Hollywood executive whose best stand-up is a series of shitty knock-knock jokes.
AI can't showcase your passions like you can. It can't connect with a passion that's been forming since childhood.
AI can't be real with you. It'll shove out the most palatable thing because it is programmed to statistically write what YOU think you'll like. It doesn't consciously wonder about how its words will be perceived.
I think there are a lot of people who are worried AI will copy their style. Which leads me to believe that these people sanitize themselves.
Writing is art, and for something to be art it has to tap into the human experience. That's why conservative art is so dogshit: it's reactive rather than contemplative.
AI won't be able to capture your essence. Something will always be off about it. Even if it comes close, there's something in the way that will stop it from actually sounding human. It can only copy and paste short snippets at best, but it can't write something original from its own mind.
AI can't write like me, and it sure as hell can't write like you.
Just got done playing Halo CE with my father-in-law. We're doing a run of all the Halo campaigns (going from Reach, CE, 2, 3, ODST). We beat Reach not too long ago, and I had the idea of us doing CE couch co-op. Then for Halo 2 we'll do it through MCC so he can see the updated graphics (since he's never played MCC).
Also put in an order for a 7th gen iPod, got a good deal on it. About $20 below normal pricing. Looked like it was in good condition, all I needed was the motherboard anyway. Also got a new stylus for my turntable. Hopefully it'll be a nice upgrade. It hasn't come in yet, but I'll probably post something about it on my microblog when it gets here.
Did lots of drumming today. Had fun jamming to various songs and trying to improvise drum beats to songs I haven't fully studied. "Ain't It Fun?" by Paramore, "Armatage Shanks" and "The American Dream is Killing Me" by Green Day, and "Up From the Bottom" by Linkin Park were what I played.
2026-02-20 06:26:00
Every old Christian person I know (which is basically my entire family) thinks it's the end of the world. They're convinced that we're living in "perilous" times and that it's a wrap. Get ready. Christ is coming! (Although they say this every decade.)
Even outside religious circles, I've noticed this energy of doom with everyone. People's views of the future are bleak. They're not sure what's on the horizon, but they don't think it's anything good. Everyone seems to think we've reached the "end" of something, and even though they might not say the quiet part out loud, you can tell they're thinking it.
But honestly, I'm tired of the doom and gloom. It feels very first-world/Americanized, as if there aren't other people and countries who are already experiencing their own end of the world (Palestine, Sudan, Ukraine).
And speaking of Ukraine, all this reminds me of an article I read when the war first started there. When the whole world was still in shock, The Guardian[1] published an article where they interviewed Ukrainian college students about their thoughts on the war. Their outlooks were bleak, with many of them saying their futures "are gone now."
They all thought that this was the end. But, four years later, we all know that the world didn't end for Ukraine. They're still fighting and dying and the reality is that, when things go wrong, the world doesn't end, it just keeps going. The only difference is that reality changes to a newer, darker, shittier one. And, frustratingly, everyone just has to learn to deal with a new normal.
About a year after the war in Ukraine started, I had a client from Ukraine reach out to me for help with their website. It was a basic Shopify site selling some beauty product, but I remember her telling me in an email that sometimes her country's Internet goes out and not to worry if I don't hear from her for days. I thought, "Wow, I can't imagine living that way," but that was the everyday reality for her and millions of others. Their world didn't end; they just have to dodge bombs now while the Internet cycles in and out. Oh, and they still have to go to work, too.
The last part is important because I suspect a lot of people are secretly hoping for the end of the world so they can get out of work, which sounds ridiculous, but it makes sense. I remember in 2024 when everyone was freaking out about drones and people were questioning if they were from another planet (some jokingly, some seriously). I came across a comment that said, "If the aliens invade, can they do it before I have to go to work on Monday?"
All of this just feels like suicidal ideation on a global scale: we just want something to take us away from our lives. The reality is that we're not special, and entire worlds won't end for us the moment we're inconvenienced.
And that Ukrainian client of mine? She never got back to me.
1. I unfortunately can't find the article for the life of me.
2026-02-20 00:55:00
I’m going to delete my Discord account, that’s all but certain. I just haven’t gotten round to it yet. The moment BlueSky asked me to complete age verification I deleted my account. There was no hesitation and even a sense of relief. I’d been pushed into doing something I wanted to do anyway. It’s different this time.
Although I’m only part of one Discord community now, that’s exactly what it’s been: a community. Sure, there are features all in one place that many of the alternatives don’t have in full, but that’s not what I’ll miss.
Being part of a community that has its own history, its own inside jokes and unique relationships isn’t something that can be replicated. Some people will stay, some will move to one platform, others will land somewhere else, and there are those who will give it up altogether. The community I’m part of is centred around a podcast that I’m a fan of, so it’ll no doubt keep going, but I won’t be a part of it.
So whilst there are alternatives, I’m not after an alternative. I want to preserve what I’ve loved being a part of, but things won’t ever be the same. There’s not really a point to this, just putting down how I’m feeling.
Hails!
2026-02-20 00:15:00
technology for earth, community, and the future
note: this text revises an earlier version of the solarpunk ai manifesto. the original remains archived here. this revision reflects further research, sharper language, and a more careful grounding of claims. parts of the earlier manifesto leaned on directional signals and aggregated summaries. for instance, an industry sector pie chart in the previous version suggested a level of quantification that the evidence does not firmly support. this revision removes that visual and grounds the argument more carefully in available research. i made this revision because ideas evolve, and so should their articulation.
the current path: we are seeing increasingly centralised ai systems, expanding data centre infrastructure, rising energy and water demands, and compute- and capital-driven rather than human- and environment-centric use of ai.
result of the current path: mounting ecological and social pressures. if this type of growth continues without constraint, we are in trouble.
a truth: this trajectory is not inevitable. energy use (or sources), governance, and deployment models are political design choices, not natural laws.[1][2] the current reality of ai is shaped by poor judgment, horrendous decision-making, and tech-bro economic policies tethered to late-stage capitalism within the current technocratic and algorithmic dictatorship.[3]
today, social media platforms are major users of applied machine learning systems.
recommendation systems, ranking models, ad targeting, content moderation, and notification systems all rely heavily on machine learning infrastructure. exact global shares of ai usage by sector are not publicly disclosed, but personalisation and recommendation systems are among the most widely deployed ai applications worldwide.[4][5]
across billions of users, this results in continuous large-scale inference workloads being distributed across global data centre networks.
most platforms optimise for engagement, advertising yield, and growth. the ecological costs are rarely a primary optimisation metric. profit before the environment and people seems to be the persisting model.
note: as an anarchist, i reject all rigid intellectual property regimes. yet rejecting copyright does not require accepting unconsented appropriation. authorship can be understood not as ownership in a capitalist sense, but as a form of relational integrity. creative work emerges from lived experience, labour, and context. its incorporation into machine systems without consent or acknowledgement presents a moral tension that cannot be dismissed, even though it arrived as a technological inevitability.

it is reasonable to argue that a substantial share of frontier ai deployment today is directed toward advertising, recommendation, and consumer optimisation rather than ecological restoration. much of this development has relied on large scale scraping and ingestion of cultural material without explicit consent from creators. practices that would be contested or restricted at an individual level are often normalised when conducted at corporate scale. this asymmetry raises serious ethical concerns about power, accountability, and who bears the consequences of extraction.
the issue is not simply legality. it is structural. corporations operate within regulatory grey zones, while individual creators rarely have the resources to challenge appropriation of their work. the result is a system in which creative labour is absorbed into industrial-scale machine systems with limited transparency and little reciprocity.
| dirty ai | green ai |
|---|---|
| centralised and opaque | transparent and accountable |
| growth-driven optimisation | ecological-constraint-aware design |
| compute maximisation | efficiency and sufficiency |
| concentrated ownership | community and public governance |
public reports do not provide a precise global breakdown of ai compute by sector. however, surveys of enterprise adoption show high uptake in marketing, sales, operations, finance, and customer experience functions.[4][5]
energy, agriculture, biodiversity monitoring, and climate applications represent a smaller but growing portion of documented ai deployment.
the imbalance is qualitative rather than precisely quantified. commercial optimisation dominates current large scale deployment of ai systems. ecological and public interest applications remain comparatively underfunded.
this reflects the political economy of the industry. as ai capabilities scale, so do valuations and executive wealth. capital concentrates upward while ecological crises intensify. intelligence at planetary scale, artificial or otherwise, is presently engineered to multiply capital more reliably than it multiplies care, resilience, or shared wellbeing.
we can do better than this.
| currently dominant uses | possible reallocation |
|---|---|
| engagement ranking systems | local renewable grid optimisation |
| ad targeting | biodiversity monitoring |
| high frequency consumer prediction | climate adaptation modelling |
| attention engineering | public knowledge infrastructures |
| extractive data mining | community health and resilience systems |
| speculative financial modelling | cooperative economic planning tools |
| automated consumer persuasion | participatory civic decision systems |
| proprietary black-box models | transparent and auditable community models |
| centralised hyperscale data centres | distributed, community-scale compute |
these applications already exist in early forms within conservation technology networks and climate focused ai initiatives.[9][10] but instead of being experiments in ai altruism, they need to be championed as the core goal.
another issue is that as compute models force centralisation and drag us further into the cloud, the tech industry is turning into a cloud oligopoly, effectively attacking local computing[11] and therefore decentralisation. without decentralisation, the utopian dreams of a solarpunk world assisted by ai are not possible. so one immediate milestone is to decentralise the world. decentralise the future.
we must appropriate machines not be appropriated by them.
guattari (1989)
the very notion of the domination of nature by man stems from the very real domination of human by human.
bookchin (1982)
ai systems can either intensify extraction or support regeneration. they can remain centralised behemoths, corporately controlled with minimal communal input, or become decentralised, publicly accountable, and collectively shaped. the direction depends on governance, infrastructure, and shared priorities.
the choice is still open, though the window narrows as infrastructure hardens and power consolidates.
avoiding ai will not protect us from it. we need to understand it. we need to experiment with it at human scale. small projects, local experiments, community tools such as my own thought experiment, the solar grove. literacy is a form of agency, and we need agency in this game.
democratising this technology means more than open access. it means redistributing knowledge, compute, and decision-making power. it means building systems that serve ecological stability and communal resilience rather than short-term and destructive extraction.
ai should not remain concentrated in the hands of a narrow technical and financial elite. like the early internet, it can either consolidate into enclosed platforms or evolve into a distributed commons.
the future of intelligence is not predetermined. it is negotiated. we need to play our part in this negotiation.
1. strubell, e., ganesh, a., & mccallum, a. (2019). energy and policy considerations for deep learning in nlp. acl proceedings. https://doi.org/10.18653/v1/P19-1355
2. oecd. (2022). measuring the environmental impacts of artificial intelligence compute and applications. https://www.oecd.org/
3. hart, m., bavin, k., & lynes, a. (2025). artificial intelligence, capitalism, and the logic of harm: toward a critical criminology of ai. critical criminology, 33, 513–532. https://doi.org/10.1007/s10612-025-09837-0
4. stanford institute for human-centered artificial intelligence. (2025). ai index report 2025. https://hai.stanford.edu/ai-index
5. mckinsey & company. (2024). the state of ai in early 2024. https://www.mckinsey.com/
6. international energy agency. (2024). electricity 2024: analysis and forecast to 2026. https://www.iea.org/
7. reuters. (2025). reporting on indirect emissions growth among major technology companies linked to data centre expansion. https://www.reuters.com/
8. nature communications. (2022). research on social media, machine learning, and behavioural effects. https://www.nature.com/
9. climate change ai. (2019). tackling climate change with machine learning. https://www.climatechange.ai/
10. wildlabs. conservation technology community network. https://wildlabs.net/
11. velazquez, j. the computational web and the old ai switcharoo. https://www.fromjason.xyz/p/notebook/the-computational-web-and-the-old-ai-switcharoo/
2026-02-19 23:01:00
For the Bearblog Creation Festival, I drew tiny pixel bears[1] inspired by the blog themes and art of fellow Bearbloggers. Can you guess who inspired each of these?
P.S. Some bears look better in dark mode. You can toggle a dark theme with the menu in the upper right.
(I had a lot of fun drawing these and might add more later.)
==I welcome others to make these too! Here's a template if you're interested. :) (Let me know if you create one!)==
1. The pixel bears are 48×48 but enlarged to 144×144 here. I think they're more fun to look at big.
2026-02-19 16:30:00
You're absolutely right! In today's rapidly evolving digital landscape, the way we create and consume content is undergoing a seismic shift — one that represents a pivotal moment in the broader conversation around authenticity. It's not just redefining what it means to write: it's fundamentally reshaping the very fabric of creative expression as we know it.
I'm so sorry you had to read that, but apparently an increasing number of people are not sorry to "write" it. AI has brought the effort required to manufacture endless quantities of poetic slop down to nearly zero.
Everywhere you turn now there is another AI slop post to read. At this point we've all but removed the humanity from platforms like LinkedIn, if there was ever much there in the first place.
The issue with AI content is not that it inherently reads badly. It's that for all of its ability to produce elegant prose, it does so at the total expense of any actual meaning.
The most offensive part for me as a reader is that someone has spent less time and original thought producing their content than they expect the reader to spend engaging with it. The disrespect of treating your reader to the output of your one-sentence prompt fundamentally undermines the purpose of sharing your thoughts in the first place. If you have something to say, say it. Don't let the AI speak for you. If I wanted to hear Claude's latest opinion, I'd ask it myself.
Of course we know why people resort to producing AI content: it's easy, and the goal is cheap engagement, usually to funnel you into some product or service. The irony is that when something has so blatantly been written with AI, it actually just undermines everything you're trying to do. I, like so many others, simply disengage entirely.
The other issue is that when everyone's feeding prompts into the same few models, you end up with a sea of content that all sounds identical. Millions of people publishing the same voice saying nothing. The whole point of putting your thoughts out there is that they're shaped by your experience, your perspective. Strip that away and what exactly are you contributing?
Now to be clear, I'm not talking about the odd use of AI to help rewrite a sentence or spellcheck, or even just to help edit a draft. What I am talking about is the near-wholesale production of AI slop. These tools are supposed to raise the bar, not lower it. Yes, your post looks beautifully written, but I'd frankly take the typoed, broken-English version of a genuinely novel insight over ChatGPT's perfectly polished "nothing" any day of the week.
I'm all for AI being used to boost productivity. There is absolutely a time and a place for it, but producing the content that is supposed to represent your thoughts and your expertise? That isn't it. If you've got nothing original to say, no amount of AI is going to fix that. Authenticity is the new order of the day.