
@nickteecee and I were discussing iOS development yesterday. I gave him the old, "back in my day we programmed iOS apps in Objective-C, before automatic reference counting, and without using the autorelease pools that never really made sense!"
I then made the unhinged claim that Objective-C was the earliest popular Object Oriented Programming language. While that's debatably true, Obj-C and C++ came into being within a few years of each other, both were preceded by Smalltalk, and Smalltalk was preceded by Simula.
I'll spare you my full rant about the overuse of OOP, which provides fantastic utility in very specific applications yet infects every programmer with a desire to use the pseudovisual object metaphor where it makes no sense. Instead, I'll leave you with this two-part talk from Alan Kay, who co-developed Smalltalk at Xerox PARC, titled How to Invent the Future1:

Footnotes

  1. This talk was given as part of Stanford CS183F: Startup School, taught/organized by none other than Sam Altman, then president of Y Combinator. The Stanford course would later become Startup School, which is sort of one part course, one part YC training camp.
Those were Johnny-come-lately programming languages. I started out with spaghetti BASIC and Fortran IV, when there were no objects. After programming with those, wrapping my head around objects was somewhat difficult. It was worth it though, reusable objects and calls are nice programming features when you need to do something quick and find the bugs even quicker.
reply
I'm still stuck with some Fortran 66 and 77 code snippets here and there. But I rewrote most of the legacy code I use into Fortran 90 syntax. Yet, I rarely think I'll be able to leverage it in industry. Also, no OOP in my codes...
reply
Is Fortran still pretty standard in physics? I heard of some Econ profs who used to use Fortran, but Matlab became the default for a while and now I'm not sure but I think most of the young'uns use Python
reply
Some of my younger colleagues are allergic to Fortran, yet they're always happy when I implement their Hamiltonians in my Fortran code. It's just so much faster. For production calculations, it is still pretty much the standard. But for everything post-processing or quick stuff, we use Python in my field. Some colleagues use Matlab too, but it became less common when numpy, scipy, etc matured in the last 10-20 years. You can get very far with Python, if you know how to use the proper libraries (that are actually written in Fortran, C and C++), but many of my colleagues don't bother learning about proper use, and thus their codes are pretty slow.
As a side note, ChatGPT is pretty good at explaining/solving bugs in Fortran, as it has likely been trained on all the old Fortran documentation. It saves me time. However, it is not good at implementing ideas that I have. Probably not enough Fortran code training data on Github, etc. Python is a much better fit for ChatGPT companionship.
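The speed point above is easy to demonstrate. A minimal sketch (illustrative numbers, not a rigorous benchmark): summing a million floats with a pure-Python loop versus one numpy call, which dispatches the whole loop to compiled C/Fortran code.

```python
import time
import numpy as np

n = 1_000_000
xs = np.random.rand(n)

# Pure-Python loop: every iteration goes through the interpreter.
t0 = time.perf_counter()
total_loop = 0.0
for x in xs:
    total_loop += x
t_loop = time.perf_counter() - t0

# Vectorized: a single call into compiled code.
t0 = time.perf_counter()
total_np = xs.sum()
t_np = time.perf_counter() - t0

print(f"loop: {t_loop:.4f}s  numpy sum: {t_np:.4f}s")
```

On a typical machine the vectorized call is orders of magnitude faster, which is the "proper use of the proper libraries" the comment describes.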
reply
From what I can tell, Python is what they are starting the new programmers out in. I had to learn it to script for networking and cybersecurity uses. Couldn’t use the old languages any more!
reply
The OOP helps out for reuse or multiple uses in one program, in my case. I do most of my scripting in BASH, C-shell, python or on MicroSloth’s new fandangle of a terminal power command line.
reply
I've gone back to shell scripting thanks to ChatGPT. At some point, I stopped bothering due to its annoying syntax and I would switch to a Python script to prepare input files and launch external codes.
reply
Oh … I haven’t gotten around to using ChatGPT, yet. Don’t really know if I want to or not, due to the training and who is doing the training. I think the results might be somewhat biased and not quite correct. Python would be a good one for input files. I am much more familiar with shell scripting than otherwise.
reply
For me, ChatGPT is quite useful, because I learned how to code before it arrived. With some of my younger colleagues, however, I believe ChatGPT is a curse. They blindly rely on what ChatGPT tells them and are unable to assess whether what is given to them is correct. And when I ask them how they implemented something, they don't know, because they don't understand what the code does; it's clear ChatGPT is a net-negative for them.
reply
Yes, the programmer should know how his program works, if he is ever to figure out how to fix problems with it. Just letting ChatGPT do all the work can be done by anyone familiar with making prompts for ChatGPT. What good is that?
reply
I'm really concerned for this generation of students who are over reliant on AI.
But on the other hand maybe that's what older generations thought of calculators.
Every generation learns to do the tasks it needs using the tools it has available, I suppose
reply
Time will tell. Can't fight it, anyhow, best is to embrace it and make the best of it.
I guess that is what many people also thought of computers, spreadsheets and accounting software. I can tell you from direct experience that accountants using accounting software sometimes have a hell of a time finding their errors. They just are not sure of their debits and credits and which goes where. Then there is the transposed number problem that is difficult to find when using software. Perhaps every profession has the same problems.
Which tool do your students use? ChatGPT?
I'm impressed your students are 'over reliant' on AI lol
I think I am under reliant on AI
I'll spare you my full rant about the overuse of OOP
Please, I would like to hear.
I recently had to slog through some super hard to read code because 90% of the code was templating out abstract objects with multiple levels of inheritance, just to perform some simple operations like an API call. I sort of understood why the abstractions were useful, but at the same time it made figuring out what was going on really hard.
And this was a Microsoft repo so I'm guessing it's good programming practice? At least I assume Microsoft engineers are considered good in the industry
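An exaggerated sketch of the pattern described above (hypothetical names, not from the actual repo in question): several layers of abstraction wrapped around what is ultimately one simple operation.

```python
from abc import ABC, abstractmethod

# Abstract strategy, concrete strategy, and a factory -- three layers
# of indirection before any work happens.
class RequestStrategy(ABC):
    @abstractmethod
    def execute(self, url: str) -> str: ...

class HttpRequestStrategy(RequestStrategy):
    def execute(self, url: str) -> str:
        return f"GET {url}"

class RequestStrategyFactory:
    def create(self) -> RequestStrategy:
        return HttpRequestStrategy()

class ApiClient:
    def __init__(self, factory: RequestStrategyFactory):
        self._strategy = factory.create()

    def fetch(self, url: str) -> str:
        return self._strategy.execute(url)

# All of the above boils down to:
def fetch(url: str) -> str:
    return f"GET {url}"

assert ApiClient(RequestStrategyFactory()).fetch("/users") == fetch("/users")
```

The abstraction buys swappable strategies, but a reader tracing a single call has to hop through four types to find one f-string.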
reply
236 sats \ 0 replies \ @rblb 29 Nov
With good OOP you shouldn't even have to read the actual code 90% of the time, just the abstractions (unless you are bugfixing it).
But for an OOP language to be good, it needs to be very strict and verbose. Java imo is (was) the best OOP language, C++ being the worst.
reply
100 sats \ 1 reply \ @k00b OP 29 Nov
I felt I may have overstated how much of a rant I have, but it looks like I didn't ...
OOP is best suited to problems where you have lots of varied and complex state and many instances of complex state. The history of OOP certainly suggests that at least - it was initially used for simulations, ie many instances of varied complex state evolving over time. Then it found market-fit in GUI frameworks which have a similar problem. Then programmers, with this hammer, treat everything like a nail. They use OOP to solve problems with primitive and singular state, tightly coupling logic to state merely out of habit.
OOP programmers spend most of their time building more and more complex, and abstract, families of logic-state chimera. As a result OOP code tends to be the hardest to read and reason about because at any point of execution there's a bunch of implicit bespokely organized state. OOP encourages state and state organization rather than state minimization. It rejects letting the simple, predictable, and obvious execution stack carry a program's state. When the execution stack carries state:
  1. the complexity of a program's ephemeral state is roughly proportional to its execution depth
  2. state is defined by simple, often primitive, explicit statements exactly where the state begins being used
I think what OOP wants to be is declarative (ie understand these classes, objects, and organization and you know what the code does), which is the way most programming languages and frameworks have evolved, but instead often ends up being the opposite - state mutates state that mutates state that mutates state, times a million, and all according to some random OOP engineer/state architect. Very few problems require ephemeral, complex, kind-numerous, interdependent state like this.
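A minimal sketch of the contrast above (illustrative names, not from any real codebase): the same running total kept as mutable object state versus carried explicitly on the execution stack.

```python
# OOP style: state lives in the object and mutates between calls.
# To know what `total` holds, you must know the object's full call
# history -- implicit, bespokely organized state.
class Accumulator:
    def __init__(self):
        self.total = 0

    def add(self, x):
        self.total += x

# Stack style: state is a local variable, defined by a simple explicit
# statement exactly where it begins being used, and it disappears when
# the stack frame does.
def accumulate(xs):
    total = 0
    for x in xs:
        total += x
    return total

acc = Accumulator()
for x in [1, 2, 3]:
    acc.add(x)

assert acc.total == accumulate([1, 2, 3]) == 6
```

Both compute the same thing; the difference is where the state lives and how much context a reader needs to reason about it.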
reply
OOP code tends to be the hardest to read and reason about because at any point of execution there's a bunch of implicit bespokely organized state
Yep that's my experience too
reply
Really? You consider MicroSloth engineers as good programmers? The only place I would ever use MicroSloth products is in a virtual box after I have acquired them for free. Just don’t trust them.
reply
I would assume that Microsoft programmers are considered good because Microsoft offers high salaries. Doesn't mean their products are good but I would have assumed the programmers are skilled and the product issues are more due to incentives at a large org. I don't work in the industry so I wouldn't really know
reply
I am not sure that high salaries equate to excellent programmers and engineers. Perhaps large monopolistic companies have too much spare cash that they have to distribute, or they get accused of monopoly profiteering. There are lots of reasons for paying people highly and not receiving commensurate services. Could that be one of the reasons for the high salaries?
reply
The best way to predict the future is to invent it lol
reply