To begin with, I think it's important to recognize that we are in a hype cycle. Large language models aren't new, but the willingness to scrape every dataset that can be found and funnel it through massive cloud infrastructure has shown that some extreme applications can come out of it. Eventually the bubble will pop, and we'll be left with some things that stay and many dreams that don't.
The good:
  • GitHub Copilot's inline code suggestions are often quite helpful for "boilerplate" activities (the mundane stuff you always have to write but that doesn't do anything particularly unique or interesting).
  • Generating fluffy language where it's expected. It's better than I am at quickly producing polite text for emails and other places where some amount of bulk is the convention.
  • Helping brainstorm possibilities. Used as a sounding board when you already have knowledge and opinions of your own, it's strong at helping you draw out your own thoughts.
Copilot Workspace as described in the article sounds less like an assistant and more like a driver, and this is where things get dangerous. Selling points like "help plan, build, and test" and "biggest point of friction is knowing where to get started" are big red flags to me. I don't believe that LLMs can do a good job serving as your tech lead or mentor. They are, by definition and as the kids say, "totally mid": statistical models most familiar with whatever appears most often in their training sets. Once you get beyond trivial and demonstration applications, the façade of the responsive expert gives way to a know-it-all intern who forgets what they just said and changes opinions at the first critique. I feel really bad for the new talent entering the programming space now, potentially learning this way.
You might say that's because things are in their infancy: when we get larger models and more powerful processors behind them, things will only get better. I reserve the right to be wrong, but all the calls for these things to "grow exponentially" imagine a hockey stick graph, and I think it's more likely the stick gets flipped into a logarithmic one: more and more server power for less and less perceived improvement. GitHub Copilot allegedly loses an average of $20 a month per user already. So either they think a multiple of efficiency is coming in the near term, or we developers are actually the product. In either case, I expect the price to go up, the quality to go down, or both. Add to this that the Workspace product does not come with IP indemnity, and there's an additional stench suggesting the lawyers are concerned.
I also see this trend as a new iteration of the low code / no code trend, which has been shown to generally fail once the use case is sufficiently complex. The idea that you can speak natural language and code will come out suggests that most of the world's current software development processes are a sham. Why would BAs and requirements gathering exist at all if you could simply take the words coming out of a business user and turn them into code? The truth is that a large part of software development is figuring out what you actually want to do in the first place. We're all guilty of it: we think we know what we want, but when confronted with the details of the implementation, it's no longer obvious what the right plan is.
One theoretically slam-dunk use of LLMs had them scanning open source repositories for known security vulnerabilities, submitting pull requests, and responding to maintainer questions. While I'm sure some legitimate issues and patches came out of it, many were simply incorrect. Already busy and stressed open source maintainers had to argue with a machine for a few iterations before realizing they were actually being spammed.
After all this skepticism, I think it's important to reiterate: I do think useful things will come out of this. Right now, though, the hype is so high that we're throwing LLMs at every problem because they look like magic to most people. Eventually the powers that be pouring funding into this will flip the monetization switches, and we'll see what is viable and who gets rugged. In the meantime, I'm concerned for my profession, as it seems a lot of newcomers are giving up on the learning process and handing things over to the tool.
Thank you for the awesome write-up; I agree with a lot of your opinions, and this was super interesting!