
Interesting article about integrating AI into workflows. It's about art resources in game development, but the discussion is broader than that. People are worried about losing their literal jobs, including the "loss" from things they love being automated or corrupted.
These are worthy things to worry about, but note the usual ratchet pattern: a person fixates on a role or identity and then jealously guards it against encroachment. This is the fundamental issue. Surface-level critiques, about how AI does a bad job at making art or whatever, are stupid and short-sighted in the usual way. Insofar as they are true, they will not remain true for long. It's an idiotic place to hang criticism.
The better critique is the humanistic one that a bunch of important stuff has no projection in economic analysis.
Things like:
  • feeling useful
  • taking joy in the act of creation
  • self-expression
  • dense interactions with other people
  • a sense of ownership in the process
are not economic goods. At best (or worst) they're correlates (or anti-correlates) of economic activity.
The real issue is that non-economic activity is illegible to the Great Computation that runs the world. What makes it tricky is that these people only have jobs because they've found a local ecology to inhabit that is not illegible, which renders suggestions to burn it all down, or to overthrow capitalism, pretty questionable.
After the overthrow of capitalism, the art directors for game companies are probably not going to like their allotments.
Related conversation this morning: #938918
I'm anticipating differences in what kinds of things are short-term problems and what are long-term. Short-term, there will be lots of the kinds of losses you're describing. Personally, I'm looking forward to the automatable parts of my job being taken up by AI. Many of my colleagues like just going through routines established over years of work, but I like trying to answer new questions or explain things in new ways, neither of which is well-suited for AI. Of course, I may just get fired, given the reduction in demand for economists.
Longer term, my expectation is that AI will be so productivity-enhancing that people will be able to focus on the parts of their work that they enjoy and find meaning in, even if AI can do it "better" (on some dimension). There are already markets for handcrafted products. Something like that may well arise for services, too.
We're long overdue for a major reevaluation of work-life balance and purpose-wage balance.
reply
Longer term, my expectation is that AI will be so productivity enhancing that people will be able to focus on the parts of their work that they enjoy and find meaning in, even if AI can do it "better" (on some dimension).
I consider myself on the right tail of AI users and what you said here reflects my experience; but I note that it's not the default. Like everything, you can apply intelligence and artfulness to something and become more "human" in the process; or you can do stupid shit and get other results. Given the effort required by the former, I shouldn't be surprised that most perceptions are fixated on the latter.
That said, I'm super sympathetic to the critiques. In addition to becoming an AI cyborg being beyond the intellectual capacity of an increasing number of people (which continues an established trajectory with unsettling moral implications), we're given some complicated philosophical knots to untangle.
For instance, there's a recent burst of people writing blogs / newsletters / comments with AI. Most of these are shitty (see above) but a few aren't, which forces you to get really concrete about what the point is. What does it mean for me to write this, now, wholly out of my own brain, vs as a collaborator with a commodified artificial mind? What do we implicitly expect of each other, when we read thoughts that are ostensibly from another person? What does it mean to have a significant chunk of discussion largely constructed by the same entity?
From the POV of things that I think are interesting having become world-shakingly relevant, it's a great time to be alive. From the POV of wondering whether civilization will survive it, it's less great.
reply
With a lot of the changes we're going through, I think the main difficulty is the rapidity of change being greater than our speed of adaptation.
We could get used to any of these situations, but by the time we do, it's a completely new situation. Perhaps we're just going through something like the industrial revolution and we'll get to settle into some Information Age norms on the other side.
reply
Agreed. Future Shock, 2025 edition.
reply
For instance, there's a recent burst of people writing blogs / newsletters / comments with AI. Most of these are shitty (see above) but a few aren't, which forces you to get really concrete about what the point is. What does it mean for me to write this, now, wholly out of my own brain, vs as a collaborator with a commodified artificial mind?
You can outsource what you want and how much you want to AI. A blog post can be anything from entirely written by AI (prompt: "Come up with an interesting topic and write a blog post on it.") to written by you with the spelling corrected by AI (which is great if you're dyslexic).
I write what I like to think of as wholly out of my own brain, but use AI for research and fact checking, tasks for which, before AI, I'd have used a non-LLM-based (but still somewhat smart, and increasingly so) search engine.
One of the things that makes me uncomfortable is that it's often hard to tell to what extent AI was used. I recently butted heads with a good friend when I realized his messages to me were LLM output. I felt offended at first, because my first thought was that the level of effort I was putting into the communication wasn't being reciprocated; that he was going the easy route and not even reading my messages; that he was being lazy. He explained that it was all his own thoughts, and the LLM was just an aid to express them more clearly. I appreciated his explaining himself, because it made me think, but I also told him I didn't need that, that he was clear enough without it. But then there's nothing I can do; he'll continue to use it, and maybe one day I'll start using it for that purpose too, just like in the 1990s I might have sworn never to use a mobile phone.
Another thing that makes me uncomfortable is that the content goes through Big Tech. Unless you self-host, which my friend doesn't; he's not even tech-savvy enough to know that DeepSeek's advantage is that it's smaller and therefore self-hostable, and he has zero concern about passing his communication with me through the CCP.
reply
I'll share some thoughts as someone who both writes and codes and uses AI:
I've found AI more useful for coding than for writing. AI doesn't produce good writing. You can easily tell AI-driven writing: it doesn't have a soul or a personality. It might be good for boilerplate, but nothing else.
Nothing I write on SN is AI-generated. That's because my purpose for writing is to gather and express my own thoughts. Only I can do that; AI can't do it for me.
Code, however, doesn't need a soul. It just needs to get the job done. So I've used AI to write snippets of code, and I feel fine about it.
reply
When it comes to writing, I'm thinking about the most formulaic parts. Describing datasets, variables, and summary stats is pretty boilerplate. Having AI produce a draft wouldn't bother me, although it's not something I currently do. I find editing quite a bit easier than writing.
Similarly, some of the stylistic stuff, that varies by journal, would be nice to at least get suggestions on. If they were to train on reviewer comments, it might be nice to get some quick pre-review suggestions, too.
reply
"reduction in demand for economists"?
In academia or private sector or public sector or all of the above?
reply
I don't think AI will create a reduction in demand for economists, any more than for any other job.
I do think it could reduce the number of junior positions; another way to say that is that junior positions will require senior-level work.
But that's gonna be true across most knowledge-producing occupations, I reckon.
reply
All of the above, because large portions of our work can easily be done by AI. I'm not even sure which of those would see the greatest reduction. My guess is private > public > academia, though public would be first if they responded to market signals at all.
reply
I think people can feel useful, feel ownership, and take joy in the act of creation, even if it's AI-assisted.
My reaction is that if you're an artist whose job is easily replaceable by AI, then the pride you took in your work was less about the artistic creation and more about the technical skill of drawing, working with 3D models, etc.
Yes, the AI is probably going to replace much of your technical ability. But it won't replace your creativity, your agency, and most importantly, your taste.
Develop those three latter skills and you should still be very productively employed in the arts, I would guess.
reply
It's interesting to note that "taste" was one of the metrics of advancement in the recent AI 2027 writeup. I think it's a good idea to develop it, but there's nothing uniquely human about it. No lasting buffer there.
reply
Taste is a tricky, ephemeral thing, though. Even if an AI's algo can quantify taste, which it probably can (to some extent), will that taste always be backward-looking?
Moreover, will AI ever grow a backbone and tell the user that the user's taste is bad? And stand firm and say, "No, we're going with my vision, it's better."
And... even if it does, what if the user is right and the AI is wrong?
So many questions to think about
reply
122 sats \ 1 reply \ @k00b 9 Apr
It won't replace your creativity, your agency, and most importantly, your taste.
Will we all be paid for those? Will we teach them in school? It's difficult to imagine an economy based on taste, where we're all competing on taste, but it sounds awesome.
reply
Would be a mixed bag, imo, resembling the emergent culture around wine snobs and literary douchebags. Taste is so overwhelmingly a cultural signaling mechanism. But maybe something awesome would arise, too.
reply
143 sats \ 1 reply \ @k00b 9 Apr
I told @bitcoinplebdev I was having trouble getting Claude to do this, a DB schema design with on the order of 10 requirements where the context is everything SN does and aims to do in the future wrt money, and he said "it's probably a skill issue." I felt like a dinosaur - or, at least, soon to be one.
The real issue is that non-economic activity is illegible to the Great Computation that runs the world.
To which Great Computation is non-economic activity legible? Is money still the 1's and 0's of this machine? It's interesting to think of economics as being irrelevant because we have nothing of economic value to trade. What replaces it as the self-organizing force?
reply
To which Great Computation is non-economic activity legible?
Jesus, that's a good question. Probably there is no such Great Computation -- the computational substrate that cares about that is the human cortex, and the pre-frontal parts of it in particular; is there some giant externalization of our social processing that could be encoded, somehow?
Money is the closest thing, and it's good that there is some rough correlate of the non-economic ways people value each other. But it's also bad, because people get blinded into thinking that the good that emerges from the coordination money unlocks is the only good there is. It's sort of awesome (in the literal sense) that, despite counter-examples surrounding them from their first moments on earth, they can forget it anyway.
It just goes to show the power of the Great Computation.
reply
113 sats \ 3 replies \ @gmd 9 Apr
It's pretty crazy how quickly art has been commoditized. I was recently playing with 4o to create some quick cartoon medical characters for some infographics, and they were much better than I could have imagined (and much better than when I tried this ~6 months ago).
I was delighted, then saddened, realizing that the work of a human illustrator has been so rapidly devalued. RIP fiverr artist jobs.
reply
Please tell me wtf you were illustrating with that second one.
reply
101 sats \ 1 reply \ @gmd 10 Apr
gram negative bacterial causes of bloody diarrhea :P
reply
Your people are lucky to have you, whoever "your people" are.
reply
One would hope that AI would allow them to be even more productive and creative, but I get the concerns about embracing the thing that might replace you.
reply
It requires an almost religious Leap of Faith to behave as if you will find a new way to inhabit the world, and should embrace your own obsolescence. In some environments I can attest to that attitude being richly rewarding, but I'm sure there are some where it's not.
reply
Agree. Humans don’t like uncertainty or change.
reply
I liked that last bit... "After the overthrow of capitalism, game company art directors probably won't like their assignments"...
I don't trust AI.
reply
Trust is an odd word choice. What do you mean?
reply
I'm aware that AI will continue to fill spaces and facilitate more activities as we move forward... But I'm more of a classic type... I mean, I really like manual things... I trust myself...
reply
People will always be scared of progress.
reply
AI’s role in creative fields definitely stirs up a lot of fear, especially when it comes to job security, but I think the deeper worry is about losing the meaning behind our work. It’s not just about the task itself, but the feeling of creating something, of expressing yourself, and the connections that come with it. The problem with a lot of critiques of AI is they focus on the surface, whether AI can do the job well today, when really, the question should be about the bigger picture: how do we find value in things beyond the paycheck? Work should be about more than just economic output. It’s about finding joy in the act of creating and feeling connected to something meaningful.
The thing is, these concerns often stem from a place of wanting to protect a system we’ve built. But jobs are never as stable as they seem, and we should be thinking more about how we can reshape these systems to support people’s emotional and social needs, not just their economic ones. Ultimately, it’s not about whether AI takes over certain roles, but how we redefine success and fulfillment in a world where technology is constantly evolving. If we can focus on creating spaces for people to adapt, express themselves, and find real value in their work, then we’re on the right track.
reply