How did people invent computer languages? In fact... how did they get the first computers to do what they did... it's kinda strange to me.... inventing a computer

BTW, the guy who invented computers was gay.... (not in a bad way)

I think it's about return 1, return 0 or something like that... it's just been boggling my mind for a while... thanks
Sariat wrote:
BTW, the guy who invented computers was gay.... (not in a bad way)

I don't think you can credit any one person with inventing computers. It's a wide field and it took a lot of input (pun not intended) to get the first computers running.

And please, don't pull an SSGX.

Lummox JR
Just remember that computers didn't just suddenly spring into their current state of complexity ^_^ Computers evolved from simple mechanical devices that at first could do nothing but add. Then as time wore on they managed (still using that addition functionality) to get the machine to multiply, divide, and subtract ^^

Then electronics hopped onto the scene, and suddenly we could add functionality that we couldn't add to the mechanical version ^_^

The novel 'War and Peice' starts with a single word ^_~

Of course you can barely make out that word in my version ;_; If someone ever tries to get you to take a pocket version of it, laugh at them @.@ *LOL* My printer can't even type that small ;_;

El
I doubt there was a novel called 'War and Peice' written... unless it was by the famous and radically misspelled author Yotslot...

History lesson time, boys and girls - now gather 'round the campfire...

If you want to be historically accurate, 'computer' was first used as a term for people who 'computed' things - usually tabulating charts of numbers for accounting purposes, census counts, whatnot...

Quoting from the Charles Babbage Institute: "The calculating engines of English mathematician Charles Babbage (1791-1871) are among the most celebrated icons in the prehistory of computing. Babbage’s Difference Engine No.1 was the first successful automatic calculator and remains one of the finest examples of precision engineering of the time. Babbage is sometimes referred to as the 'father of computing.'

In 1821 Babbage invented the Difference Engine to compile mathematical tables. On completing it in 1832, he conceived the idea of a better machine that could perform not just one mathematical task but any kind of calculation. This was the Analytical Engine (1856), which was intended as a general symbol manipulator, and had some of the characteristics of today’s computers.

Unfortunately, little remains of Babbage's prototype computing machines. Critical tolerances required by his machines exceeded the level of technology available at the time. And, though Babbage’s work was formally recognized by respected scientific institutions, the British government suspended funding for his Difference Engine in 1832, and after an agonizing waiting period, ended the project in 1842. There remain only fragments of Babbage's prototype Difference Engine, and though he devoted most of his time and large fortune towards construction of his Analytical Engine after 1856, he never succeeded in completing any of his several designs for it. George Scheutz, a Swedish printer, successfully constructed a machine based on the designs for Babbage's Difference Engine in 1854. This machine printed mathematical, astronomical and actuarial tables with unprecedented accuracy, and was used by the British and American governments. Though Babbage's work was continued by his son, Henry Prevost Babbage, after his death in 1871, the Analytical Engine was never successfully completed, and ran only a few "programs" with embarrassingly obvious errors..." (http://www.cbi.umn.edu/exhibits/cb.html)

"The Atanasoff-Berry Computer was the first electronic digital computer.

Built in 1937-1942 at Iowa State University by John V. Atanasoff and Clifford Berry, it introduced the ideas of binary arithmetic, regenerative memory, and logic circuits. These ideas were communicated from Atanasoff to Mauchly, who used them in the design of the better-known ENIAC built several years later.

The original ABC was dismantled decades ago. Ames Laboratory, using private funding, is building a working replica of this historically important invention..." (http://www.scl.ameslab.gov/ABC/)

"On June 5, 1943, the Ordnance Department (who needed a machine to compute ballistic trajectories for new weaponry to be used when the US entered the war in Europe -digitalmouse) signed a new contract with the Moore School of Electrical Engineering to research, design, and build an electronic numerical integrator and computer -- ENIAC. It was to be supervised by Professor Brainard, with Dr. Eckert as chief engineer, and Dr. Mauchly as principal consultant.

The machine designed by Drs. Eckert and Mauchly was a monstrosity. When it was finished, the ENIAC filled an entire room, weighed thirty tons, and consumed two hundred kilowatts of power. It generated so much heat that it had to be placed in one of the few rooms at the University with a forced air cooling system. Vacuum tubes, over 19,000 of them, were the principal elements in the computer's circuitry. It also had fifteen hundred relays and hundreds of thousands of resistors, capacitors, and inductors. All of this electronics were held in forty-two panels nine feet tall, two feet wide, and one foot thick. They were arranged in a "U" shape, with three panels on wheels so they could be moved around. An IBM card reader and card punch were used respectively for input and output.

The ENIAC was programmed by wiring cable connections and setting three thousand switches on the function tables. This had to be done for every problem and made using the machine very tedious. However, the speed of the computation made up for this. Ballistic trajectories can take someone with a hand calculator twenty hours to compute. The Bush differential analyzer (a mechanical analogue machine invented in 1925 at M.I.T -digitalmouse) reduced this time down to fifteen minutes. The ENIAC could do it in thirty seconds.

Construction of the ENIAC was completed in the fall of 1945. On February 15, 1946, the Electronic Numerical Integrator and Computer was dedicated by the University of Pennsylvania. Its very first application was to solve atomic energy problems for the Manhattan Project. During its first year at the University of Pennsylvania, it computed ballistic trajectories for the Ordnance Department, as well as problems for weather prediction, cosmic ray studies in astronomy, random number studies, and designing wind tunnels.

In January of 1947 the ENIAC began shipment to Aberdeen, and by August of that year it was put into operation for trajectory tables. The technical staff at the Ballistic Research Laboratory had a difficult time with the large assembly of electronics, mostly due to the difficulty of programming it. But in 1948, by the advice of Dr. John von Neumann (later credited with the concept now known as 'von Neumann machines': self-replicating machines -digitalmouse), alterations were made that turned it into a stored-program computer. This change greatly reduced the amount of manual re-wiring that had to be done for each program.

The ENIAC was a revolutionary machine. During its development, the design had to be frozen to ensure that the working prototype could be completed in time for the Ordnance Department to get the trajectory tables finished. But the team who worked on it quickly found many areas in which it could be improved. These were later manifested in the successor to the ENIAC, the Electronic Discrete Variable Computer, or EDVAC. Other computers followed, including the Ordnance Variable Automatic Computer (ORDVAC), the Standards Eastern Automatic Computer (SEAC), and the UNIVAC -- the machine built by Dr. Eckert and Dr. Mauchly for the U.S. Census Bureau when they left the Moore School to start their own business." (http://ei.cs.vt.edu/~history/ENIAC.Richey.HTML)

Of course, then came the invention of the Integrated Circuit (IC) in 1958 at Texas Instruments by Jack Kilby. "At the time, nobody was making capacitors or resistors out of semiconductors. If it could be done then the entire circuit could be built out of a single crystal -- making it smaller and much easier to produce. Kilby's boss liked the idea, and told him to get to work. By September 12, Kilby had built a working model, and on February 6, Texas Instruments filed a patent. Their first "Solid Circuit", the size of a pencil point, was shown off for the first time in March.

But over in California, another man had similar ideas. In January of 1959, Robert Noyce was working at the small Fairchild Semiconductor startup company. He also realized a whole circuit could be made on a single chip. While Kilby had hammered out the details of making individual components, Noyce thought of a much better way to connect the parts. That spring, Fairchild began a push to build what they called "unitary circuits" and they also applied for a patent on the idea. Knowing that TI had already filed a patent on something similar, Fairchild wrote out a highly detailed application, hoping that it wouldn't infringe on TI's similar device.

All that detail paid off. On April 25, 1961, the patent office awarded the first patent for an integrated circuit to Robert Noyce while Kilby's application was still being analyzed. Today, both men are acknowledged as having independently conceived of the idea." (http://www.pbs.org/transistor/background1/events/icinv.html)

Such a revolution in circuit design led to smaller and faster computers, which needed stronger and more powerful languages to take advantage of them. Early 'languages' were not much more than a series of binary switches (usually in groups of 4, 8, or 16) that represented a single instruction per switch combination - tedious and time-consuming, but doable...
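
Just to make that switch-panel idea concrete, here's a quick sketch in Python of how one 8-bit 'switch combination' could encode an instruction. The 4-bit opcode table here is completely made up for illustration - real front-panel machines each had their own encodings:

    # A hypothetical 8-bit instruction word: the top 4 bits select the
    # operation, the bottom 4 bits are an operand -- the same idea as a
    # row of toggle switches on an early front panel. This opcode table
    # is invented purely for illustration.
    OPS = {0b0001: "LOAD", 0b0010: "ADD", 0b0011: "STORE", 0b1111: "HALT"}

    def decode(word):
        opcode = word >> 4           # the upper four "switches"
        operand = word & 0b1111      # the lower four "switches"
        return f"{word:08b} -> {OPS.get(opcode, '???')} {operand}"

    for word in (0b00010101, 0b00100110, 0b00110111, 0b11110000):
        print(decode(word))
    # 00010101 -> LOAD 5
    # 00100110 -> ADD 6
    # 00110111 -> STORE 7
    # 11110000 -> HALT 0

Now imagine flipping one instruction's worth of switches for every single step of a program, and you'll see why it was tedious...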

Programming languages grew a rich and varied history of their own. I won't repeat it here, but there is a good timeline of the history of programming languages, from 1946 to 1996, at this link in BYTE Magazine: http://www.byte.com/art/9509/sec7/art19.htm

Some notable tidbits:

"Grace Murray Hopper, working in a temporary World War I building at Harvard University on the Mark II computer, found the first computer bug beaten to death in the jaws of a relay. She glued it into the logbook of the computer and thereafter when the machine stops (frequently) they tell Howard Aiken that they are "debugging" the computer. The very first bug still exists in the National Museum of American History of the Smithsonian Institution. Edison had used the word bug and the concept of debugging previously but this was probably the first verification that the concept applied to computers." (http://www.lewhill.com/firstcomputerbug.html)

"The Small-Scale Experimental Machine, known as SSEM, or the "Baby", was designed and built at the University of Manchester, and made its first successful run of a program on June 21st 1948. It was the first machine that had all the components now classically regarded as characteristic of the basic computer. Most importantly it was the first computer that could store not only data but any (short!) user program in electronic memory and process it at electronic speed." (http://www.computer50.org/)

"First conceived and implemented by Urban Mueller on the Amiga computer system, Brainf*ck is a computer programming language with only eight commands and a really tiny compiler." (http://www.catseye.mb.ca/esoteric/bf/)

Sariat wrote:
BTW, the guy who invented computers was gay.... (not in a bad way)

Alan Turing didn't invent computers per se, but he did invent Turing machines, which are theoretical representations of computers.
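
If you've never seen one, a Turing machine is really just a state table driving a read/write head back and forth along an unbounded tape - and you can simulate one in a handful of lines. A minimal sketch in Python (the 'flip' machine below is a made-up toy, not anything from Turing's paper):

    def run_tm(tape, rules, state="start", pos=0, halt="halt"):
        # Simulate a one-tape Turing machine. `rules` maps
        # (state, symbol) -> (symbol to write, move, next state),
        # where move is -1 for left and +1 for right.
        tape = dict(enumerate(tape))      # sparse tape, unbounded in both directions
        while state != halt:
            symbol = tape.get(pos, "_")   # "_" stands for a blank cell
            write, move, state = rules[(state, symbol)]
            tape[pos] = write
            pos += move
        return "".join(tape[i] for i in sorted(tape))

    # A toy machine that flips every bit, then halts at the first blank.
    flip = {
        ("start", "0"): ("1", +1, "start"),
        ("start", "1"): ("0", +1, "start"),
        ("start", "_"): ("_", +1, "halt"),
    }
    print(run_tm("1011", flip))   # -> "0100_"

The remarkable part is that this trivial machinery, given the right rule table, can compute anything your desktop can.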

-AbyssDragon
In response to Lummox JR
Lummox JR wrote:
Sariat wrote:
BTW, the guy who invented computers was gay.... (not in a bad way)

And please, don't pull an SSGX.

Lummox JR

Heh...
In response to SuperSaiyanGokuX
SuperSaiyanGokuX wrote:
Lummox JR wrote:
Sariat wrote:
BTW, the guy who invented computers was gay.... (not in a bad way)

And please, don't pull an SSGX.

Lummox JR

Heh...


Yeah.....Heh......
In response to AbyssDragon
And one of Turing's big claims to fame (and it was not being a closeted homosexual) was the methods and techniques he devised for gauging machine intelligence. He proposed a test where you would type at a terminal, having a conversation with an unseen party - if you couldn't tell whether you were chatting with a person or a machine, then the machine showed intelligence. (This spawned countless Eliza and Doctor programs written in many languages...) :)
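
The Eliza trick, by the way, is surprisingly simple: match the input against a handful of patterns and echo pieces of it back as a question. A toy sketch in Python (these rules are invented for illustration - Weizenbaum's actual script was larger and cleverer):

    import re

    # A few Eliza-style rules: regex pattern -> canned reflection.
    RULES = [
        (re.compile(r"i feel (.*)", re.I),  "Why do you feel {0}?"),
        (re.compile(r"i am (.*)", re.I),    "How long have you been {0}?"),
        (re.compile(r"because (.*)", re.I), "Is that the real reason?"),
    ]

    def reply(line):
        for pattern, template in RULES:
            m = pattern.search(line)
            if m:
                return template.format(*m.groups())
        return "Tell me more."   # fallback when nothing matches

    print(reply("I feel boggled"))   # -> "Why do you feel boggled?"
    print(reply("I am confused"))    # -> "How long have you been confused?"

Which says something about how low the bar for 'seeming human' in a typed conversation can be... :)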