50 sats \ 3 replies \ @orthwyrm 20 Jul 2023 \ on: RANT: Cryptography and Algorithm Papers Should Use Code, Not Math Notation tech
I don't work in cryptography or academia, but I do develop algorithms for image processing and stuff like that. I find it helpful to provide both mathematical notation and a coded example when documenting my work.
Mathematics provides the detail, but as you say it's quite hard to interpret. A coded example is more human-readable and can provide a nice overview, but it sacrifices fidelity (as you need to be very familiar with the language and functions to understand exactly what is going on under the hood).
Perhaps academic papers need to remain lean and minimize word count if they're being published in journals? I could see why coded examples would be neglected in this case.
I agree that formulas are necessary for things like signal processing systems; even the dynamic difficulty adjustment, with its derivatives and integrals, needs some calculus notation.
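To make that concrete, here's a purely hypothetical sketch (not any real coin's adjustment rule, and all names and gain values are made up): in code, the derivative becomes a difference of successive errors and the integral becomes a running sum, so the whole controller reduces to a linear set of operations.

```python
# Hypothetical PID-style difficulty controller, purely illustrative.
# The calculus disappears into arithmetic:
#   integral of the error   -> a running sum
#   derivative of the error -> a difference of successive errors
# Gains kp/ki/kd and the 600s target are made-up numbers.
def adjust_difficulty(block_times, target=600.0, kp=0.1, ki=0.01, kd=0.05):
    integral = 0.0
    prev_error = 0.0
    difficulty = 1.0
    for t in block_times:
        error = t - target               # how far off target this block was
        integral += error                # running sum stands in for the integral
        derivative = error - prev_error  # difference stands in for the derivative
        # slow blocks (positive error) push difficulty down, fast blocks push it up
        difficulty *= 1.0 - (kp * error + ki * integral + kd * derivative) / target
        prev_error = error
    return difficulty

print(adjust_difficulty([600.0] * 5))   # on-target blocks: stays at 1.0
```

Anyone with high-school arithmetic can follow that loop, even if the continuous-time notation for the same controller would lose them.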
But the set theory especially: it is not intuitive what A, B, C, F, M, or whatever letter you use refers to. You have to memorise a code to read it, which means the notation is essentially encrypted.
All of it is much more easily understood in Venn diagrams and other visuals. Words can describe it really well too, even more compactly, but without that encryption. It's always the set theory stuff that is the most sleep-inducing for me. And the more complicated calculus is beyond my high-school training, yet I can still write the algorithm for it if it's broken down into a linear set of operations.
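A quick sketch of the point (the set names and contents are invented for illustration): an expression that a paper would write as C \ (A ∪ B) becomes self-describing once the sets get real names, because the code carries the decryption key with it.

```python
# Invented example sets; the names do the work the paper's A, B, C can't.
papers_with_code = {"paper1", "paper2", "paper3"}
papers_with_proofs = {"paper2", "paper4"}
peer_reviewed = {"paper1", "paper2", "paper4", "paper5"}

# "A ∩ B": papers that ship both code and proofs
both = papers_with_code & papers_with_proofs

# "C \ (A ∪ B)": reviewed papers with neither code nor proofs
neither = peer_reviewed - (papers_with_code | papers_with_proofs)

print(both)     # {'paper2'}
print(neither)  # {'paper5'}
```

Same operations, same fidelity, but you never have to hold a symbol table in your head.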
reply
A coded example is more human-readable and can provide a nice overview, but it sacrifices fidelity (as you need to be very familiar with the language and functions to understand exactly what is going on under the hood).
The solution to that is to use a very well-specified programming language that hasn't changed for decades, either because it's been abandoned or because its specification simply hasn't changed (e.g. ANSI Common Lisp).
Of course, getting everyone to adopt a single programming language is a fool's errand. The best we can hope for is that one scientific journal will require that papers use one from an approved list.
reply
Yeah, unfortunately the majority of programmers are only interested in whatever gets them the best pay for their idiot-tolerance level.
Not in what the OG language designers actually learned over decades of coordinating teams.
I think Forth would be the best pick for precise, formal specifications. BASIC is another solid choice too. C and its descendants up until Go are a mess of assumptions and complex syntax, and frankly I HATE OOP. I also hate repeating myself. What the heck is a .h file for? The linker? Why can't the lexer generate that? (Oh yeah, it does in most languages.)
Oh yeah, just look at what languages they use in agricultural research simulators. When I was 11 I got to spend a few weeks in a lab like that; it was the first and only time I ever worked with Lisp, and I sorta just dodged the Forth, but that's what my supervising researcher worked with. I think that sorta sums up what defines a formally robust language.
SIMPLE.
reply