How did it come to be?
Imagine trying to code when the computer can't read what an if statement is, what a while loop is, or how to even read the code itself!
(If you're a fan of creationism, God did it, of course!)
How could it have been done? I just don't understand!
Holes.
In response to Kaiochao
http://xkcd.com/505/

George Gough

[Edit]
Can't forget my trusty signature huh?
Just as humans weren't plopped down on this planet to start with, and just as the first music wasn't Beethoven's Ninth Symphony, the very first bits of computer code didn't involve a programmer plodding away, writing out if statements and while loops.
In response to KodeNerd
Still forgot it!
In response to KodeNerd
KodeNerd wrote:
http://xkcd.com/505/

George Gough

[Edit]
Can't forget my trusty signature huh?

It says, with the right set of rules and regulations


But how does the computer know what these rules are?!
In response to Rushnut
You make them.

George Gough
In response to KodeNerd
How do you make a rule? The computer wouldn't know what to do with the rule once you gave it to it.
In response to Rushnut
It involves boolean statements, binary, and transistors. Given the question, explaining exactly how it works is probably going to confuse you, but note that the first programs were written in binary and hexadecimal, nothing more.
In response to Popisfizzy
Popisfizzy wrote:
...the very first bits of computer code didn't involve a programmer plodding away, writing out if statements and while loops.

Gears, tape, and punch cards.
(Only slightly) Fun fact: the first programmer was a gurrrl.
In response to TheMonkeyDidIt
I'm well aware of both, but didn't include them for a specific reason: The analytical engine was never actually constructed. Ada Lovelace was the first programmer, but her programs were never executed.
In response to Popisfizzy
I figured you would be, Pop. I really wasn't playing quiz show, believe it or not. Just thought I'd bring a little history into the thread considering the post-b***-hit nature of the original question. :)
All computer CPUs are set up to perform different operations when different instructions are presented to them; those instructions form the machine language of the computer. The first programmers worked with machine language directly.

Eventually programmers discovered that they could write a program in machine code that could interpret text and convert it to machine language. That text was in the form of assembly language, which varies from computer to computer because different computers have different forms of machine language. Assembly is really just a more readable form of machine language, but it's considered a second-generation language because it's a step up from machine code.

Using assembly, programmers discovered they could parse more complex constructs that were easier to read, and they could design languages that had true structure. C is one of the best examples of an early third-generation language. Code written in C is converted into the machine code for whatever system it is compiled for.

In the beginning, for and while loops didn't exist--such things are really just constructs within languages like C, Pascal, BASIC, etc. At the raw end of things, pure machine code, the only way to do looping is with a jump instruction (i.e., a goto) or a conditional jump. A C compiler knows that when you use a while() loop, internally it should look like this:
(start of loop)
test condition
if false, goto end of loop
do stuff inside loop
goto start of loop
(end of loop)

So C is just taking a format you can easily read and converting it to one that the computer understands. The very first programmers, though, didn't have that luxury and had to do it all at the very low computer level.

Lummox JR
In response to Lummox JR
Lummox JR wrote:
All computer CPUs are set up to perform different operations when different instructions are presented to them; those instructions form the machine language of the computer. The first programmers worked with machine language directly.

Lummox JR


But here is where I'm lost!
How could the computer understand this?! How could a big box of microchips and drivers, with absolutely NOTHING on them, understand ANYTHING?


In response to Rushnut
Rushnut wrote:
But here is where I'm lost!
How could the computer understand this?! How could a big box of microchips and drivers, with absolutely NOTHING on them, understand ANYTHING?

Computers don't understand, as such; they just do what they're told. A CPU knows that when a "high" voltage is applied to some input pins and a "low" is applied to others, it has been given a certain instruction based on the pattern of high and low values. That instruction is basically run through a massive number of transistors that signal other parts of the CPU to hold a given voltage or release it onto outgoing wires into still other parts of the CPU. The computer is controlled by an internal clock that tells it when to execute the next step of its instructions. It is essentially just a complex step up from a mechanical clock.

Lummox JR
In response to Lummox JR
A big clock...

I see now, yet the first code must have been (wasn't it B? or V... I forget) incredibly messy
In response to Rushnut
Consider that I've built a see-saw out of wood. On one end of the see-saw is a cup into which I can drop marbles.

If I drop one marble into the cup then that end of the see-saw will go down, and the other end will come up. As that end swings up, it may hit a marble on a ledge somewhere, causing that marble to fall into a cup somewhere else. This only happens if there is a marble on that ledge, though. If there isn't a marble on that ledge then nothing happens. As the other end continues to fall down under the weight of the marble, eventually the marble is dumped out and falls to the floor (or perhaps into another cup). The see-saw then returns to its original position.

If I drop two marbles into the cup, then the arm swings faster, and it moves further before both marbles are dropped. Now the other arm not only pushes off any marbles sitting on the first ledge, but it may even go higher and push off marbles sitting on the ledge above it. These marbles fall into other cups, or fall onto ledges to wait for an arm to swing around and hit them.

This may seem extremely contrived, but wooden 'computers' have actually been built using wooden dowels and marbles. I can't seem to find the right Google search to turn up images of said computers, though.

The next step up from wood is the gears and such that have been mentioned before. After those came mechanical relays, vacuum tubes, and all sorts of interesting analog mechanical devices. Transistors, what computers are based on now, are no different, fundamentally, from those see-saws and marbles, though. The important difference is that transistors are really really small, small enough that we can fit billions of them into a chip - making a modern computer with billions of marbles and dowels just isn't logistically possible.

But transistors really are the same thing. A transistor is a mechanical device which, in concert with other transistors and assorted electrical components, performs mechanical calculation. Instead of levers and marbles, they just use electrons and really tiny bits of silicon.

Oh. And Magic/More Magic.
In response to Rushnut
How a computer works isn't programming; it's Electrical Engineering. Microchips and circuit boards are designed and engineered to behave in a certain way. How electrical engineering works is something that can't be easily explained on this forum.

If you want your mind boggled, just try and understand a very basic ALU: http://en.wikipedia.org/wiki/Arithmetic_logic_unit

Trust me, it's easier just to believe in magic.
In response to IainPeregrine
I know that it isn't the computer that you were looking for, but you reminded me of this.

First Computer
In response to Rushnut
Or like I said, I recall an early form of computer involving cards with holes punched into them. But that's a little more recent than wooden see-saws. It would be a while before typed code was invented.