A Duke/Fuqua study found that people who admit to using AI are rated by their colleagues as lazier, less competent, and more replaceable. You did the job faster and better, but you're considered worse because you didn't suffer.
That really depends on how you use it, doesn't it? From what I see in the field, most people use it as a way to be lazier and faster. So this makes sense?
That’s exactly the point, but there’s a psychological reason behind it called the Effort Heuristic. Back in 2004, psychologist Justin Kruger showed that people rate the exact same poem significantly higher if they are told it took 18 hours to write versus 4 hours. We have a built-in 'bug' in our brains: Effort = Quality. If you didn't suffer, the result feels 'cheap' or 'undeserved.'
I do not contest that that heuristic is real or that it gets applied with extreme prejudice.
The point I'm trying to make is that this prejudice, in the current yolo climate and especially in code, is actually effective because it has a good hit rate. Most uses of AI really are lazy, and I'd dispute that the result is better in most cases. That's where the friction is: if your expectation is that your life is going to get easier because you're using AI, think again. It's going to get significantly harder, because:
you can ralph away and have IQ 0 do everything 100x and maybe you'll get lucky. Remember, though, that with your 3 layers of IQ-0 yolo, if you need 100 attempts at the top layer and each layer below retries 100x as well, your potential expense is 100³ = 1,000,000x that of doing it right one-shot. So perhaps we ought to ask the question: do we really want that perception to change? Are you ready? I'm not.
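The compounding in that 1,000,000x figure is just per-layer retry counts multiplying. A minimal sketch, with hypothetical numbers (the 100-attempt and 3-layer figures are from the comment above, not measured anywhere):

```python
def worst_case_cost(attempts_per_layer: int, layers: int) -> int:
    """Worst-case number of bottom-level executions when every one of
    `layers` nested retry loops may run up to `attempts_per_layer` times.
    Each attempt at a layer re-runs the entire layer below it, so the
    counts multiply rather than add."""
    return attempts_per_layer ** layers

# One yolo layer: at worst 100x the cost of a correct one-shot.
print(worst_case_cost(100, 1))   # 100

# Three stacked yolo layers: the retries compound.
print(worst_case_cost(100, 3))   # 1000000
```

The point of the sketch is only that the expense grows exponentially in the number of stacked trial-and-error layers, not linearly in the retry count.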
I thought laziness was a virtue in software design. Certainly, I can see how my own laziness is why I like to write things that will have more permanence and reusability -- so I don't have to do it again.
I'd say that in software, not doing any abstraction is lazy. Not writing unit and integration tests is lazy. Hard-coding stuff is lazy. That's not a desirable quality you look for in a designer or developer, unless you just got some funding and now you have to ship before the runway ends. But while that may be the more visible part of software design, it's not the norm. Many more people are employed in mature businesses than in startups.
However, low-standards software is definitely the norm right now, because it used to be too expensive to reproduce. Vendor lock-in was real, so devs could be lazy and designers could be lazy. What I'm saying is: not for long. Because with 100 lines of thoughtful markdown, any fool can break that lock-in. That's why Wall St. is selling software stonks, and if I'm honest, they're not wrong: the luxury position of obfuscation and lock-in really is over now.
You made me rethink.
Yes, I am lazy, but not that lazy, and I'm also forward looking.
That's why I spend time on abstraction: because I want to allow myself a bit more laziness in the future.
That being said, in my line of work (academic research, not production code), I've learned that for much of the abstraction, the juice is often not worth the squeeze.
Agreed on the vendor lock-in stuff. The barrier for a small business to write its own bespoke software has gone down a lot.
In production code this can be the case too; it depends on what you're building for. You have to be smart about where you spend the effort. If you abstract everything away, you're overengineering. If you just C&P everything, you're underengineering. There's a sweet spot depending on what you're after.

When there's no threat because it's expensive to replace your product, it's easy to slack off. The proprietary software space is filled with this. I know because I've worked in it for decades, always battling both the shop floor and the customer to get to maintainable software. No longer moving in that space - in favor of doing volunteer open source work - was the best choice I ever made (except for slowly becoming a pleb, lol). Because when you develop in public under a permissive license, you have to make good choices. There's no obscurity, because anyone can fork you away with the click of a button. Much less trickery.
I've had real fun in the past helping mid-size or even larger orgs deal with lock-in as gig work: just relentlessly produce software that utilizes every API it can, or even scrapes the shit out of web forms, to get stuff done right. I no longer get that type of gig, though - and if I did, it would be a couple of days of work instead of a couple of weeks. Much has changed.
Everyone who uses AI at my job is exactly like this. I'm not going to get into whether AI is the cause right now, it's more of a symptom.
Some of them have just become agents for AI workers, it's ridiculous.
I'd venture to guess it depends a lot on how people are using AI, and also on what field.
I think there's a lazy way to use AI and a non-lazy way.
That's a generational issue and pushback from older workers who didn't have the advantage of AI or tech at their fingertips. In reality, much of that sentiment will fade and become irrelevant as we move from a salary-compensation work world to a per-diem one: basically, getting paid for what you literally produce. It will be much harsher than most folks realize. More like the 1930s.