TOOLS FOR INFORMATION SYSTEMS – Part I
I asked Chet a lot of things on the topic, probably only of interest to a small handful of folks.
PREFACE
I was involved with information system (IS) work for decades, until I retired maybe 15 years ago. I can hardly recognize the field now; it has changed so much. Anyway, I decided to see what ChatGPT (Chet) had to say about the tools available for IS work today. It boggles the mind!
I started off when we used Hollerith punch cards, and I think we had to punch the holes with a mallet and steel punch, but I could be mistaken. I am old, and my memory often fails me.
One thing I did not look at was the rapidly growing use of large language model AI to actually write and improve code. How much of current practice will be obsoleted in a few years, as the AI gets more and more competent? I have not begun to explore that. I did not bother to ask Chet, as I expect that the developments in that area are so recent that there will be little or nothing about them in the large language model training data.
I started learning FORTRAN programming in about 1967 or so. As far as I can remember, the computers back then were coal-fired, but I could be wrong about that. In any case, I am sure that my cell phone (cheap bugger that I am, pretty low end) has far more processing power than the first computer that I learned on.
There was no operating system; you fed in the input on Hollerith cards (punched cards) and got the output back on a line printer. I can’t really remember this well, but I think there was a typewriter keyboard attached to a card feeder and hole puncher, and for reading the cards there was a device that could sense the holes, which coded for a character set. I suspect that the machine bootstrap program had to be loaded by hand, possibly by toggle switches (I know that I did have to do that with a lab control computer a decade later). The compiler was fed in on a deck of cards, and that deck was followed by the FORTRAN source code. I suppose that the output from compilation was sent to the card punch, but as students we were not allowed to touch the machine, so I am not sure. In any case, after compilation, the object code was executed to produce the printed output.
Did I mention how we put the holes in the cards? Maybe? That was a long time ago, and I may have misremembered stuff. It could be that we had to punch the Hollerith cards with a steel punch and mallet. Again, that was a long time ago.
I did not touch that computer stuff for another decade. Finishing up my undergraduate degree in experimental psychology, I found that I was a half-credit short, so I enrolled in a FORTRAN II course. It came naturally to me, so I became a teaching assistant.
When I got to graduate school, I took an advanced course in statistics; that is, it was advanced for psych students. It would at best have been a second- or third-year course for an engineering student, let alone a mathematics student.
I could not understand the statistics material until I bought a book on matrix algebra and learned the basics of that. For some reason, I have found most branches of mathematics easier to grasp than probability and statistics.
In any case, I found that I could make money doing statistical data analysis for people, and drifted away from psychology. I somehow found my way to a job doing actual programming, and that became my work focus for quite a few decades.
The software we had when I started, and even when I was finishing my career, was much more limited than today’s offerings. I asked Chet to list tools for various job roles within information systems departments. My career was long enough that I held more than a few of these roles. Frequently, as the methods guru (aka terrorist), I was able to get an overview of many roles that I never performed myself.
The variety in the types of tools listed below by Chet is astonishing. I could hardly convince my managers to invest a couple of hundred bucks in a few very simple tools, and occasionally bought them out of my own purse. Now, the possibilities are both astonishing and perplexing. Clearly there are costs and benefits, threats and opportunities, associated with any software suite deployed. Such a discussion could get pretty involved; I do know that territory, though. I may well have written about it in the past, but if so, such text is lost to time. Instead, I had Chet outline the issues; see the next topic. Yeah, that is about right, Chet.
TRUE COST AND BENEFITS, THREATS AND OPPORTUNITIES, FULL LIFE CYCLE ANALYSIS, FOR INFORMATION SYSTEMS SOFTWARE
1. True Cost and Benefits: True cost and benefits analysis involves evaluating the complete costs and benefits associated with the development, implementation, and maintenance of information systems software throughout its lifecycle. It goes beyond just the initial investment and includes factors such as ongoing support, upgrades, training, and operational costs. By considering the full cost and benefits, organizations can make informed decisions about the economic feasibility and potential returns of implementing the software.
2. Threats and Opportunities Analysis: Threats and opportunities analysis involves assessing the potential risks and benefits associated with the implementation of information systems software. This analysis helps identify potential threats or challenges that may hinder the success of the software, such as security vulnerabilities, compatibility issues, or resistance to change. Simultaneously, it also identifies potential opportunities for improving efficiency, effectiveness, and innovation through the software's implementation.
3. Full Life Cycle Analysis: Full life cycle analysis entails evaluating the entire life cycle of information systems software, from conception to retirement. It involves assessing the software's development, deployment, operation, maintenance, and eventual retirement or replacement. This analysis helps organizations understand the costs, benefits, and risks associated with each phase of the software's life cycle, enabling better decision-making throughout the software's lifespan.
The purpose of these analyses is to provide a holistic perspective on the economic, operational, and strategic implications of implementing and managing information systems software. They help organizations make informed decisions about software investments, mitigate risks, capitalize on opportunities, and optimize the overall value derived from the software over its lifetime.
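The three analyses Chet describes above can be rolled into a simple calculation. As a minimal sketch (every figure, the five-year horizon, and the 8% discount rate are hypothetical, chosen only for illustration), here is how full-life-cycle costs and benefits might be combined into a net present value, the usual yardstick for "is this software worth it over its lifetime":

```python
# Hypothetical full-life-cycle cost/benefit sketch for a software purchase.
# All names and figures are made up for illustration, not real data.

def net_present_value(cash_flows, discount_rate):
    """Discount a list of yearly net cash flows (year 0 first) to today."""
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(cash_flows))

# Year 0: licence + implementation + training (costs only, no benefit yet).
# Years 1-4: ongoing support/upgrade costs versus productivity benefits.
yearly_costs    = [120_000, 25_000, 25_000, 30_000, 30_000]
yearly_benefits = [0,       60_000, 70_000, 70_000, 70_000]

# Net cash flow per year of the software's life cycle.
net_flows = [b - c for b, c in zip(yearly_benefits, yearly_costs)]
npv = net_present_value(net_flows, discount_rate=0.08)

print(f"Net cash flows by year: {net_flows}")
print(f"NPV at 8% discount rate: {npv:,.0f}")
```

The point of the discounting step is exactly the "true cost" idea in item 1: a benefit four years out is worth less than the same dollar amount spent on day one, so a package that looks profitable on raw totals can still come out negative over its full life cycle.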

Very interesting, Mike.
Even though I was glued to "Tomorrow's World" as a kid, I don't think I could have imagined back then in the 70s what we have to play with today. I just played with DALL-E 2 (managed to get 2 pieces done before running out of the free credits - check out my Substack for the results). Here's the link to DALL-E 2: https://openai.com/product/dall-e-2. To create my images I uploaded an image (a famous painting) and cropped it to a square in DALL-E. I then added to the painting, one block at a time. Basically, you add a block (making sure you have overlap with the original image so it understands the style to copy) and then give it a text prompt, e.g. UFOs from the sky, or more sky, or an alien, etc., using my example.
I went to one of the new comprehensive schools in the 70s. Looking back, although I had my issues, I got to work with some good teachers. In the science department at my school we had four PhDs. Can you imagine a state school in England like that today? Anyway, we had a computer where you fed it punch cards. Quite a few kids got into computing that way.