Algorithms – Computer Programming’s Foundation

To many, computer programming seems an arcane topic, particularly when first introduced. Students learn variables, constants, conditionals, control flow, and numerous other topics. What is often lost – especially in these "neophyte to guru in twelve weeks or less" boot camps – is what actually underlies all the arcane symbols comprising a computer language.* (see the note at the end of this post)

“Computer science is no more about computers than astronomy is about telescopes.”
Edsger W. Dijkstra


Computer science, and by extension computer programming, is about problem solving. By itself, a computer is a particularly dumb conglomeration of silicon, wires, and aluminum. It does nothing but store electrical charges and transport them from one internal location to another. It is not until a human, through his or her intellectual labors, harnesses the flow of those charges to perform some task that a computer does anything particularly interesting. But getting computers to do something interesting requires a systematic approach: every step must be specified in detail, and no step may be overlooked. You must devise a detailed procedure for the computer to follow. That is computer programming's essence: writing procedures that dictate how a computer performs a particular task.

So if computer science is not simply computer programming, then what exactly is it? Computer programming is certainly an element of computer science, but it is not the whole. The following video by "Art of the Problem" on YouTube explains.

Computer science is the science of computation: how to get computers to solve problems. The math behind pushing the boundaries of what computers can and cannot do is staggering, but that is the theoretical side of computer science. Consider the practical side. A computer language is a set of instructions that humans can write to have a computer perform tasks. But because computers are dumb, those instructions must be explicit and accurate, and writing accurate instructions to perform a task requires an accurate underlying algorithm.


Merriam-Webster defines an algorithm as "a step-by-step procedure for solving a problem or accomplishing some end especially by a computer." Here's a humorous, yet insightful, video clip from "The Big Bang Theory" illustrating an algorithm.

A clip from "Terminator 2: Judgment Day" provides another more graphic, yet still humorous, example of an algorithm.

In the above clip, the Terminator uses one or more algorithms to obtain clothing. From visual input, he calculates attributes such as height, weight, and foot size to determine the probability that a person is a match. If not a match, he moves to the next person. If a match, he persuades the person to remove his or her clothes.

Computer programming is formalizing algorithms into a form usable by a computer. An algorithm is the solution to a task or problem; code is the set of instructions that carries out the algorithm. The following video by "Art of the Problem" on YouTube introduces algorithms.

The above video assumes some knowledge of looping and conditional logic. However, if you are reading this as a supplement to "Think Java," then this post assumes familiarity only through chapter four of the book. That chapter introduces methods that do not return a value; you have not yet been introduced to methods that return values, looping, or conditional logic. So let's consider another algorithm, purposely ignoring concepts not yet covered in "Think Java."

Consider the task of boiling water. The following steps outline a basic algorithm.

  1. Get pot from cabinet.
  2. Take pot to the sink.
  3. Turn on water faucet.
  4. Place water in pot.
  5. Turn off water faucet.
  6. Place pot on stove.
  7. Turn on stove burner.
  8. Heat water until boiling.

The steps seem easy enough, but remember, computers are very dumb. These instructions are not explicit enough. For instance, consider the sub-steps involved in obtaining a pot from the cabinet.

  1. Walk over to cabinet.
  2. Open the cabinet.
  3. Reach into the cabinet.
  4. Grasp pot handle.
  5. Lift pot and take it out of the cabinet.

But of course, computers are too dumb to even understand these instructions. For instance, consider the sub-steps involved in physically performing the first sub-step above.

  1. Turn body in direction of cabinet.
  2. Take alternating right and left steps until cabinet is reached.

But the computer is still clueless, as these sub-sub-steps remain ambiguous. We could continue breaking the steps down to the most fundamental level – for instance, instructions on how to move each muscle when walking to the cabinet.

Understanding muscular contractions obviously takes our algorithm to a level too fundamental to be useful. Thankfully, algorithms build upon previously solved algorithms, which can be reused as "black boxes." But before considering algorithm reuse, first consider how to decompose an algorithm into more manageable sub-tasks through a process called functional decomposition.

Functional Decomposition

Functional decomposition is the process of breaking a complex task into smaller sub-tasks, and breaking those sub-tasks into still smaller sub-tasks, until each piece is simple enough to solve directly. Each sub-task can then be solved on its own and combined with the others to solve the original problem.
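The boil-water outline above maps directly onto the void methods introduced in chapter four of "Think Java." Here is a minimal sketch – the class and method names are my own invention, not from the book – in which each sub-task becomes its own method and the top-level method simply calls the sub-tasks in order:

```java
import java.util.ArrayList;
import java.util.List;

// A sketch of the boil-water algorithm decomposed into methods that do
// not return a value. All names here are hypothetical.
public class BoilWater {

    // Records each step so we can see the full, flattened procedure.
    static List<String> log = new ArrayList<>();

    static void step(String description) {
        log.add(description);
        System.out.println(description);
    }

    // Step 1, decomposed into its sub-steps.
    static void getPotFromCabinet() {
        step("Walk over to cabinet.");
        step("Open the cabinet.");
        step("Reach into the cabinet.");
        step("Grasp pot handle.");
        step("Lift pot and take it out of the cabinet.");
    }

    // Steps 2 through 5 grouped as one sub-task.
    static void fillPotWithWater() {
        step("Take pot to the sink.");
        step("Turn on water faucet.");
        step("Place water in pot.");
        step("Turn off water faucet.");
    }

    // Steps 6 through 8 grouped as one sub-task.
    static void heatWater() {
        step("Place pot on stove.");
        step("Turn on stove burner.");
        step("Heat water until boiling.");
    }

    // The top-level algorithm reads like the original outline.
    static void boilWater() {
        getPotFromCabinet();
        fillPotWithWater();
        heatWater();
    }

    public static void main(String[] args) {
        boilWater();
    }
}
```

Notice that boilWater() reads almost exactly like the original eight-step outline: the messy details are hidden inside the sub-task methods, which is precisely the "black box" reuse that lets algorithms build upon previously solved algorithms.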

Methods & Java Libraries

Societal Impact

There is a downside to our rush to reduce every problem to a formal algorithm. We use computers to decide everything from product

* This post is particularly directed towards students who have completed chapters one through four of the book "Think Java" by Allen B. Downey & Chris Mayfield. Note that in this post I specifically avoid discussing repetition and conditional algorithms, preferring to keep things simple. I also avoid discussing object-oriented analysis and design. Here, I focus on simple algorithms and functions. It is my personal belief that understanding simple functional programming helps the transition to object-oriented programming.

SSH, AWS, and The Cloud for the Layman

Cloud computing is the future of computing. Amazon Web Services (AWS) currently provides the most comprehensive cloud computing solution. To the layman, cloud computing probably sounds like just more computer jargon best left to nerds. But it’s not; all educated adults should understand cloud computing – at least on a non-technical level.

I have three servers – all of which used to dramatically raise my monthly electricity bill – sitting in my closet. But now they sit there with the power switched off. Instead, I use AWS. AWS saves me money on electricity and is easily accessible over the Internet from anywhere. I couldn't access the three servers sitting in my closet from outside my own personal network; the security risks were too great and the configuration details too big of a pain in the butt. Not so with AWS: now I can access my virtual server(s) residing on AWS from anywhere in the world.

In fact, I use AWS to provide a uniform computing environment for students in my introduction to programming class. On AWS I have a virtual server, one of many residing on one of Amazon's physical computers. I access that virtual server as if it were a physical computer via the command-line on my local computer; the communication between my personal computer and the server is accomplished using SSH over the Internet. I created an account on the virtual server for each student. Each student uses SSH over the Internet from his or her personal computer to connect to the virtual server and then interacts with his or her account via the command-line. The programs the students write, the software tools used to write and compile those programs, and the file system used to store the students' work all reside on the virtual server running on one of Amazon's physical servers. This is but one use of cloud computing.

In this post I discuss the technologies behind using a virtual server on AWS, exploring each of the technologies involved in using such a server from a personal computer. This post is not a comprehensive technical tutorial on each technology, but rather a layman's introduction.

The Internet

How does one explain how the Internet functions to someone with little to no technical background?  This video does a good job explaining how the Internet works on a non-technical level.

Amazon's servers are connected to the Internet, which allows us to communicate with them. However, hackers lurk everywhere, so communication must be secure; typically you use a secure shell (SSH) to communicate with the server from your personal computer. But understanding SSH requires understanding the command-line.


The Command-Line

If you already know what the command-line is and why you might use it, skip to the next section. Otherwise, you should watch the following video before continuing. Before the fancy graphics on your computer were standard, there was the command-line: users interacted with computers via a simple text interface. Even today, that simple text interface is how most users communicate with a remote server. Let's explore the history of the user interface so you have a better understanding of the command-line.

Secure Shell

Just as a browser on your computer can connect to a remote computer through the Internet and display web pages, your computer can also connect to a remote server through the Internet using Secure Shell (SSH) and display text input and output. This video explains how SSH works on a non-technical level.
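To make this concrete, here is a sketch of what such a connection looks like in practice. The command "ssh student@server-address" opens a secure text session on the remote server, and an entry in the user's ~/.ssh/config file can shorten the command. The hostname, username, and key-file name below are placeholders, not a real server:

```
# A sample entry in ~/.ssh/config (all names here are hypothetical):
Host class
    HostName ec2-198-51-100-7.compute-1.amazonaws.com
    User student
    IdentityFile ~/.ssh/class-key.pem
```

With this entry in place, typing "ssh class" at the command-line connects to the remote server, authenticates with the private key file instead of a password, and presents the same simple text interface discussed above.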

Cloud Computing

Okay, but what about Amazon Web Services, cloud computing, and all that nonsense? Why the fuss? Let's take a step back and get a five-minute non-technical history of computing and cloud computing, narrated by none other than Stephen Fry.

Now, remote physical servers accessed from anywhere in the world are nothing new. Nor is having space for your website – for example, housing your site with a provider such as GoDaddy. But cloud computing is more than simply providing space on a server to house your webpage. Cloud computing gives you unprecedented access to that remote server's computing power. Moreover, it provides that access securely and ensures your work doesn't impact other users of that server. To understand how, you must understand something called virtualization. Virtualization provided over the Internet is what allows ubiquitous computing power for everyone…well, everyone who can pay for it (but that's a different blog post).


Virtualization

A virtual machine is a "pretend computer" that runs on a physical computer. For instance, a Mac user might use VMware Fusion to run Windows or Linux on his or her physical Macintosh computer. That same technology allows a physical server to run multiple "pretend servers" on its hardware.

A few years ago, some enterprising employee noticed that much of Amazon's vast computing power went unused, yet Amazon was still paying for it. Why not rent that power out? Enter the virtual machine. Amazon would allow folks to create their own virtual machines on Amazon's physical computers and pay only for the computing power actually used – essentially "renting" the computer. This simple idea has spawned an entirely new business for Amazon, one arguably poised to make more profit than even its online marketplace. Entire corporations can throw away their physical servers and instead "rent" space on Amazon's physical servers running virtual machines.

Amazon Web Services

Here is a video explaining Amazon Web Services for non-technical folks. Keep in mind, behind the strange lingo, cloud computing can be explained quite simply. As discussed earlier, instead of owning the computer that runs your software, you rent space on Amazon's servers and they run the software. And you are not simply renting disk space – you are running your own computer on Amazon's computer using virtualization.

And another video by Amazon on cloud computing using AWS.


Cloud computing allows a lowly adjunct instructor at a community college (me!) to offer his students the same computing power that an MIT or Carnegie Mellon could offer its students. In less than an hour, while sitting in my study on a Sunday night, I was able to accomplish what would have taken days several years ago.

And rather than relying upon the college to house and maintain costly physical servers on campus, I will pay less than five dollars for enough computing power for an entire semester of approximately twenty-five students writing and compiling homework assignments. That is powerful stuff right there. It’s game changing.

Why Study Information Technology?

The other day at a Barnes & Noble bookstore, my 15-year-old daughter, in a hushed whisper, told me, "Dad, that guy behind us, he was applying for a job." I replied, "So, what's so strange about that?" She looked at me and said, "He told them he just got a psychology degree and couldn't find a job."

It reminded me of when I came to Washington, DC, fresh with an anthropology degree in hand and expectations that surely I could find a job somewhere. I did – first as the graveyard-shift clerk at a 7-Eleven in Manassas, Virginia, and second as a banquet setup supervisor at a Marriott near Dulles airport. Even though I had defied the odds, escaped my humble trailer-park upbringing, and gone to college, I still faced no prospects of a good job. My work life was one of coffee carafes, pastry trays, and disgruntled meeting attendees.


Once, I was late to my banquet setup job, probably by no more than ten minutes or so – but it might as well have been an hour as far as the hotel manager was concerned. He chastised me, threatened to fire me, and told me the most important quality in my position was punctuality. And to be fair, he was probably right. But still…it was then that I decided to study computer programming at night.

I am so thankful for that fateful decision, for I stumbled upon a career where you think for a living, enjoy (well, mostly enjoy) what you do, and never want for a job (usually). And if you are good, nobody chastises you for arriving ten minutes late to work. After two or three years of experience, the dilemma is never whether you can find a job, but rather whether it's a job you want at the rate you want. Sure, you might stumble upon the odd story of the IT professional who works at Walmart because he or she cannot find gainful employment. But I assure you, dig a little deeper and you will find extenuating circumstances: perhaps they stopped learning and stagnated, or perhaps they were never very good to begin with. IT changes rapidly; people who find a job in one technology and then fail to learn new technologies quickly find themselves outdated and expendable. Simply put, if you are not willing to devote time to lifelong learning, then IT is not for you. But if you devote a modicum of time to staying current in one or more technologies, you will find a job – usually within a couple of months. Even more importantly, if you want a raise and your current employer won't grant it, you can easily find the rate you desire elsewhere.


It is hard to impress upon folks just how lucrative a career IT is. While every field is taking massive hits, and the educated, well-paid career professional is rapidly becoming extinct, one career choice is growing dramatically: anything related to Information Technology. So how do you explain this to the average person, someone who has worked in a "job" his or her whole life? How do you explain the difference IT can make?

Information Technology offers you a career rather than a job.  Unsure of the difference?  Chris Rock explains it best.

Computer Science Career Prospects

Before I discuss the gloomy prospects for most humans on earth in the twenty-first century, let's start by exploring a particularly lucrative career: computer programming. Then, as we review the grim prospects, remember this career – we will return to it, and to the takeaway you should have from this post.

Not all computing jobs are programming jobs. Networks require administration. Software must be tested. Networks, hardware, and software must all be secured. There is a spectrum of tasks supporting the software revolution, so not liking programming doesn't mean you cannot go into Information Technology.

Technological Unemployment

Now for the doom and gloom: the career prospects are grim for most humans. If you have a college degree and find yourself working at Starbucks, you are not alone. You did not throw away your life on a "worthless degree." In the past, there was large demand for well-educated college graduates regardless of major. But things are changing: it is harder to find a job, and a college degree just isn't the employment guarantee it once was. If even engineering graduates have difficulty finding entry-level jobs, how are graduates in a subject like Classical Civilization going to find them?

I'm not trying to make this political, but clearly the demand for human labor – both skilled and unskilled – is decreasing while the population is increasing. Blame the Democrats, the Republicans, or even the Easter Bunny; it doesn't matter who's at fault. As computers continue to increase efficiency, the demand for labor will continue to decrease. And you can take that statement as fact.

Superstars and Jobs

Maybe you are a snowflake. Maybe you are the next Justin Bieber, The Weeknd, or Skrillex, rising from humble beginnings on YouTube to international stardom. Perhaps, but consider this: even the music industry has seen many jobs simply vanish. Gone are the days of the well-paid backup musician, the rock band, or the symphony playing movie scores. Instead you have the single DJ/producer or electronic musician. And the one or two solo artists who manage to rise to the top and reap all the rewards are the exception, not the norm.

But maybe you are going to write the next Angry Birds and sell it on the App Store. Or perhaps, just perhaps, you will be the next YouTube image-consultant star with your own podcast, like Dante Nero, Alpha M, or Bradley Martyn. You will tape yourself buying a Ferrari and post it for the world to see, thus ensuring YouTube stardom. But for every successful YouTube star there are countless more who never make it. A lofty goal like YouTube stardom is by all means impressive – go for it – but just be certain to have a backup plan. Or better yet, learn how to make your backup plan part of your dream (see the link to a Harvard Business Review article below).

Here's a good video explaining the economics behind the winner-take-all economy. But it's not that difficult to noodle through: if 200 years ago you had to physically go to a theatre to watch actors perform, while today you can watch actors perform from the comfort of your living room, then the demand for actors today must be less than it was then, as not every town needs its own theatre and acting troupe.

Wow, so that's pretty grim. Not only are computers taking our jobs, but mass communication means that consumers can select the best and leave the second-best to mop floors. So perhaps a strategy of being the CEO, the star, or the professional athlete isn't the wisest career plan.

Working for a Living

Finally, we need to talk about work – old-fashioned American hard work. Perhaps, just perhaps, you were misled by our recent popular culture regarding jobs and work. Sure, Steve Jobs tells you to "follow your passion," but is it practical? Earlier I presented a video clip of Chris Rock extolling the virtues of a career compared to a job, but he was talking about unskilled labor. Skilled labor isn't necessarily a j.o.b. that makes one loath to rise in the morning – and many white-collar careers are a j.o.b. to the person in the position. In America, over the last few decades, we started equating anything involving labor with a job and anything pushing paper with a career. I find that strange, as I feel a managerial position where I "plan stuff" all day would be about as appealing as scraping shrimp off plates at Red Lobster.

IT is hard work. I find it more akin to the blue-collar work of yesteryear than to white-collar paper pushing. In fact, industry hype aside, you will find managers have about as much respect for IT folk as they do for a kid working a drive-through window. Maybe IT wasn't your passion, nor how you envisioned your life…but it's an opportunity. Isn't it possible to shape your passion to the situation you find yourself in? Listen to Mike Rowe talk about passion and opportunity.

There are jobs, albeit not enough to counteract technological unemployment. Somebody has to work the computerized mechanical monsters – though where fifty people were once needed, now only one person might be. If you can be that one person…

If you are in IT, and you commit yourself to constant, lifelong learning, then you will be needed in the new economy. Still not convinced?

The Age of Abundance

Here is a video explaining the fourth industrial revolution. While watching it, take particular note of the people in the video. Without exception, regardless of career, each person incorporates IT into his or her work.

Information Technology Career

A perfect storm is brewing. Computers are rapidly replacing many traditional jobs – and not just menial jobs, but jobs that traditionally required substantial education. At the same time, our culture denigrates hard work. IT is more akin to the blue-collar worker of yesterday than to the stereotypical manager: system administrators, cybersecurity analysts, and programmers all do something beyond going to meetings, scheduling things, and drinking coffee.

But forget all the highfalutin nonsense; let us keep it simple.

  • Careers deemed “menial” or “trade” or “blue collar” somehow became undesirable over the past several decades, even rewarding ones that require considerable skill and pay handsomely.
  • In the United States, Information Technology was taught as a science rather than a learnable skill, leading large sectors of our population to believe it was a career beyond their reach.
  • Computers are taking jobs, even “smart” jobs, at an alarming rate.
  • Hundreds of thousands of Information Technology jobs will go unfilled.
  • If you are willing to become skilled in this career, all of the above means you have a job – a good job – even if the economy goes south for most other college graduates.

It really doesn't matter what you consider the source of the world's current woes, or what you consider the solution to them. Rest assured, information technology will be part of the solution (unless we bomb ourselves back to the Stone Age, in which case discount my previous pontification). If you even suspect you might like a career in computing, you owe it to yourself to explore all facets of the industry: computer programming, system administration, testing, cybersecurity analysis, and more.

You might not be the next Steve Jobs, Mark Zuckerberg, or IT billionaire, but you can earn a six-figure salary for a relatively low upfront educational investment. And if you dedicate yourself to lifelong learning, or find that coding is your passion? Well, the sky's the limit.

As an aside, I'm getting up there in years…it took me a long time to learn the lesson in this Harvard Business Review article. Although the article is directed at those in midlife, it is equally applicable to those just starting out and trying to balance their dreams with the realities of their current situation. Yes, it requires reading (gasp!) rather than watching a YouTube video…but there is so much truth in this article. It is exactly what Mike Rowe is saying, on a higher level. In fact, I plan on sending the link to him and/or his Facebook page.

The Existential Necessity of Midlife Change