...I don’t, as I burned out on programming years ago.
Interesting! In 1967 I was a college sophomore and had my first programming course (FORTRAN). Up until then I was heading for a career as an electrical engineer. However, I had quickly determined that I really wasn’t grokking the concept of circuit design.
Programming, on the other hand, was a perfect match for the way my brain worked. I had only two programming courses in my whole career (FORTRAN and COBOL) but taught myself a dozen different assembler languages and a similar number of high-level languages. I memorized the detailed instruction-set behavior of an equal number of CPUs and CPU chips. I became a top expert in some of them and ended up teaching programming to space center and other government employees.
Then after about 30 years my brain just seemed to burn out. As new languages appeared, I was just not interested. I kept asking “Why?” “Why is this language necessary?” “Why do I need to spend all this effort to learn yet another subtly different language when in most cases the job could be done equally well in an existing one?”
What’s more, I became disillusioned with the inevitable change in programming culture. Back in the old days you were presented with a problem and came up with solutions that often required in-depth analysis of its fundamentals. That analysis gave you an unparalleled breadth of understanding of all the issues surrounding the problem, and challenged you to extract the salient ones so the solution would fit in a computer of the day. A task not unlike fitting 4 pounds of Crisco into a 3-pound can.
The change to high-level programming languages was intended to avoid having to rewrite program code to accommodate different CPU and operating system structures. What I saw happening was that the dependence on CPUs and OSs was minimized, but it was replaced by a dependence on browsers and existing subroutine libraries. The subroutine libraries are, by design, generalized machines. Their internal workings are a mystery because nobody actually writes documentation to that level anymore. Error reporting is reduced to “Gack, I can’t handle this so I’ll ignore it!” The poor programmer is now reduced to playing Whack-a-Mole on the ever-shifting, muddy foundation of the library, browser, or OS du jour, spending 5% of his effort on the problem algorithm and 95% on making it work, and continue to work, in a morass of unstable environments.
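To make that “Gack, ignore it” point concrete, here is a minimal C sketch of the pattern (my illustration, not taken from any real library; the function name save_config is invented): a convenience wrapper that swallows exactly the errors its caller needed to see.

    /* Hypothetical example of the "can't handle it, so ignore it" pattern. */
    #include <stdio.h>

    void save_config(const char *path, const char *data)
    {
        FILE *f = fopen(path, "w");
        if (f == NULL)
            return;         /* Disk full? No permission? The caller never learns. */
        fputs(data, f);     /* Return value ignored: write errors vanish here. */
        fclose(f);          /* A failed flush on close vanishes too. */
    }

    int main(void)
    {
        /* Writing into a nonexistent directory fails silently: no error, no clue. */
        save_config("/no/such/dir/settings.cfg", "mode=fast\n");
        puts("done (or was it?)");
        return 0;
    }

The file silently never appears, and the programmer goes off to whack the mole somewhere else entirely.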
UNIX was a great idea, and it lies at the heart of all the biggest machines and networks, but all the sub-species that have sprouted up have been a distraction.
And the biggest cultural change I observed was the opening up of computer programming to the unwashed masses. Now any idiot who can create a “Hello World” program in BASIC thinks they’re a programmer. That might have been OK when computers were strictly one-user, one-thread, and unnetworked. But I know from experience that writing a sophisticated program that behaves properly in multi-CPU, multi-thread, multi-user, virtual, networked environments requires thinking in 5 dimensions. Sure, there are subroutine libraries that are supposed to make that task easy, but as I noted above they are generalized solutions and don’t always exactly fit the job. AND you’re dependent on how well they were written and how prescient their creator was in predicting how they would be used. Ever wonder why unpredictable, mysterious “blue screens” and “hangs” happen? I know why, and it’s a dirty little secret the average programmer hasn’t got a clue about and cannot solve by themselves.
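Here is the smallest C illustration I can give of why that 5-dimensional thinking matters (my own sketch, not from any real product): two threads increment a shared counter with no lock, and because counter++ is a read-modify-write sequence, updates are silently lost.

    /* Compile with: cc -pthread race.c  -- a classic unsynchronized data race. */
    #include <pthread.h>
    #include <stdio.h>

    #define ITERATIONS 1000000

    static long counter = 0;            /* shared by both threads, no lock */

    static void *worker(void *arg)
    {
        (void)arg;
        for (int i = 0; i < ITERATIONS; i++)
            counter++;                  /* load, add, store: not atomic */
        return NULL;
    }

    int main(void)
    {
        pthread_t a, b;
        pthread_create(&a, NULL, worker, NULL);
        pthread_create(&b, NULL, worker, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);

        /* Should print 2000000; a real run usually prints less, and a
           different number each time. That nondeterminism is the "mystery". */
        printf("counter = %ld (expected %d)\n", counter, 2 * ITERATIONS);
        return 0;
    }

Single-user, single-thread code can be read top to bottom. This kind of bug exists only in the interleaving of two instruction streams, which is exactly what the “Hello World” programmer has never had to picture.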
Yep, I burned out about a decade ago. Blew a fuse, melted a few billion brain cells. Now I just sit back and draw pretty pictures and let the minions at DAZ scurry around like ants under the magnifying glass of a 7-year-old discovering the burning power of the sun! 8-o