Hardware and Software Theory


What is programming?

A software program is a set of instructions issued to a computer via source code. Source code is data that usually resembles a text document which is typed in a specific programming language that is somewhere between a language that computers can process efficiently and humans can understand. The task of creating source code is known as programming.

Human-readable source code on the left and machine code on the right. Although source code may not make sense at first glance to someone new to programming, compared to its equivalent in machine code it certainly looks more approachable.

 

How does programming relate to hardware and software?

Before we get into the process of creating software programs, let's first examine the relationship between a software program and a computer.

One of the fundamental functions of a computer is its ability to control electrical energy and ultimately transform it into another form of energy, such as light, sound or motion. We use software to communicate with a computer, determining the various paths that the electrical energy will take in order to achieve the goal we have stipulated through programming.

Computers are designed to react to electrical energy. This energy can be of a high or low voltage, and it forms the basic premise of all the complex interactions that exist between humans and computers. These high and low voltages are relayed within the circuitry of a computer in very short intervals that allow us to distinguish one from the other, and they are eventually sequenced together to form a protocol for communicating with computers.

We represent this protocol of communication with two numerical digits, a 1 and a 0, which to a computer equate to a high and a low voltage. We refer to this form of representation as binary code. Of course, this representation means nothing to a computer, which is only reacting to the electrical energy being relayed within its circuitry. But to us, using a 1 and a 0 as a form of representation creates a means for every interaction, simple or complex, that we have with computers through software.

By combining various sequences of 1s and 0s (and ultimately sequences of high and low voltages), followed by more sequences arranged in similar or different patterns, the effect of a continuous flow of energy is created that results in light, sound or one of the many other capabilities of a computer. Because this flow of electrical energy is broken up into smaller chunks, each chunk can carry a different sequence of 1s and 0s. That variation between chunks is what creates the impression of change: an image moving in an animation, or the diaphragm of a speaker vibrating at different amplitudes, which we ultimately perceive as the sound of our favorite track.
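As a toy illustration of this idea (not real audio code, and the bit patterns here are made up for the example), we can take a stream of bits, split it into fixed-size chunks, and read each chunk as a number. Varying those numbers over time is exactly the kind of "change between chunks" described above, such as a speaker diaphragm moving to different amplitudes:

```python
# A stream of bits, broken into 8-bit chunks. Each chunk is read as a
# number; imagine each number as a momentary speaker amplitude.
raw_bits = "00000000 01000000 10000000 11000000 10000000 01000000"

# int(chunk, 2) interprets a string of 1s and 0s as a base-2 number.
amplitudes = [int(chunk, 2) for chunk in raw_bits.split()]

print(amplitudes)  # -> [0, 64, 128, 192, 128, 64]
```

The rising and falling numbers form one cycle of a wave: the same bits, viewed chunk by chunk, become a changing signal rather than a static pattern.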
Although useful, representing data this way can become quite cumbersome and error prone. If developing software meant memorizing endless sequences of binary code, becoming a programmer would be a very daunting task. Fortunately, this is not the case: just as we use a 1 and a 0 to represent a high and a low voltage, we can use a sequence of 1s and 0s to represent more complex ideas, such as a larger number like 155, which translates into binary as 10011011. This form of representation can be extended further to include typographic characters, which can be strung together into words, which in turn can be strung together to issue commands to a computer through programming. We refer to this process as abstraction.
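These layers of abstraction are easy to see in practice. A minimal sketch in Python, showing the same values as numbers, bit patterns and characters (the specific values are just examples):

```python
# The number 155 written as eight binary digits, as in the text.
n = 155
bits = format(n, "08b")
print(bits)            # -> 10011011

# The bit pattern can be read back as the original number.
print(int(bits, 2))    # -> 155

# Characters are numbers underneath too: ord() gives the character's
# numeric code, which can itself be shown as a bit pattern.
for ch in "Hi":
    print(ch, ord(ch), format(ord(ch), "08b"))
```

Each line climbs one rung of the ladder: voltages become bits, bits become numbers, numbers become characters, and characters become the words of a programming language.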

Programming, in its simplest form, is the process of creating and modifying a series of 1s and 0s that influences the path of the electrical current, by means of abstraction.
At some point, everything a computer processes (including our own interactions with it) has to exist as binary code. However, it's important to remember that binary code is simply a human representation of a sequence of high and low voltages, one that we use to make the complexities occurring within a computer more understandable to us.
As I mentioned earlier, a computer does not really care which form of representation we use; this process of representing data, however, certainly makes it a lot easier to communicate with computers.

