
Claude Shannon Genius Comparable To Einstein Philosophy Essay


The world is filled with all types of technologies and information. There are computers that make normally tedious tasks simpler, phones that allow instant communication across the world, and CDs that can store large amounts of information in a small space. In modern times, we take for granted how much easier our daily lives are with these innovations. It truly is a large step from the candles of centuries ago to flicking a switch to brighten up a room. Of course, this step could not have been taken without the work and studies of Claude Shannon. Those who describe Claude Shannon speak of an everlasting curiosity and a thirst for knowledge. Shannon is even compared to Albert Einstein for his ability to come up with strange yet groundbreaking ideas. He came up with the idea of digital circuitry, found a way to represent human genetics with algebra, developed an important theory about information, and did extensive work in cryptology. Even then, that is just scratching the surface of what he accomplished. Any one of these accomplishments alone could have written him into the textbooks forever. It is important to note that Shannon, while intelligent, was by no means the most knowledgeable person alive. Like Einstein, Shannon's genius came from his drive to satisfy his curiosity and his ability to think outside the box. Claude Shannon's pursuit of knowledge led to large technological advancements that shaped America and the rest of the world.


Throughout his life, Claude Shannon achieved many accomplishments that ensure he will be remembered throughout America and, to a lesser degree, the world. Shannon was born in Petoskey, Michigan on April 30, 1916. While growing up, he was talented in the fields of science and mathematics and looked up to a famous distant relative, Thomas Edison. While his father made mathematics a hobby, it was his grandfather who helped influence Shannon's love of science. As a child, Shannon built many small devices in his free time, such as a telegraph and a remote-controlled model boat. This childlike urge to invent and discover lasted nearly his entire life.

In 1932, Shannon graduated from high school and enrolled at the University of Michigan a short time later. Four years later, in 1936, Shannon graduated with a B.S. degree in Electrical Engineering as well as a B.S. degree in Mathematics. After his graduation from the University of Michigan, he entered the Massachusetts Institute of Technology as a research assistant, both to further his studies towards a higher degree and to work part time. It was during this time that he wrote his master's thesis on digital circuitry as well as his doctoral thesis on representing genetics with algebra. In 1940 he earned his master's degree in electrical engineering and his doctorate in mathematics.

After his graduation, Shannon went on to work at Bell Telephone Laboratories. For over a year, he carried out numerous pieces of work, such as creating a new design for switching circuits. In 1941, a committee was formed to design anti-aircraft fire-control systems in order to improve the war effort. Shannon was invited to join and helped finish the completed design. It was partly due to this work that the bombing campaign against England ended with fewer casualties than there might have been. For the next fifteen years, Shannon spent his time among many successful and important mathematicians and engineers. This period is also when Shannon developed his information theory, which was published in 1948.

Claude Shannon's contributions to America received a great deal of recognition. For his work on digital circuits, he received the Alfred Noble Prize in 1939. President Lyndon B. Johnson presented Shannon with the National Medal of Science in 1966, and in the same year he was given the Institute of Electrical and Electronics Engineers Medal of Honor. In 1985, Shannon also received the Kyoto Prize, which is commonly compared to the Nobel Prize. Among many other awards, Shannon also held nearly a dozen honorary doctorates from various universities and was inducted into the National Inventors Hall of Fame.

Even as an adult, Shannon maintained his childlike personality, his passion for inventing, and his drive to broaden his knowledge. Over his life, he made numerous small 'toys', many of which he felt were just as important, if not more so, than the theories that changed the world. John Horgan described his experience interviewing Shannon: "I'm trying to get him to recall how he came up with the theory of information. But Shannon…is tired of expounding on his past. Wouldn't I rather see his toys?" (**) Shannon most likely felt that each of his toys was just as important as anything else he thought of. He wasn't content merely to come up with revolutionary ideas; he went above and beyond to discover everything he could. This is why Shannon is remembered as one of the greatest thinkers. It was never a matter of finding fame and fortune: he was just as content creating robots that could juggle as he was creating his famous information theory.

It is the fate of every living creature to eventually die, and although Shannon's ideas will be forever immortalized, he too could not escape this fate. He died on February 24, 2001, after losing a battle with Alzheimer's disease. Shannon was survived by his wife, Mary Elizabeth Shannon, along with three children.

In the 19th century, a brilliant mathematician named George Boole created a method for solving and modelling logical statements using algebraic expressions. Boole's system is now more commonly known as Boolean logic. This logic revolves around ones and zeroes along with 'logic gates' that take one or more inputs and give out an output. Another way to think of these ones and zeroes is true and false, or on and off. At the time of its creation, George Boole received little to no praise for developing this system; after all, there did not seem to be any practical use for it. While George Boole died without his system going anywhere, Claude Shannon discovered the enormous potential Boole's idea had when applied to circuits.

While at MIT, Claude Shannon worked with Vannevar Bush on an analog computer called the differential analyzer. This computer used wheel-and-disk mechanisms to solve the equations typically encountered in calculus. Shannon noticed that the circuits used in the computer had only two states of being, namely on and off. Reminded of Boole's system from his mathematics courses, Shannon considered the possibility of applying that logic to circuits and realized it could open up a wide range of new possibilities and uses. Shannon used this discovery for his master's thesis at MIT, titled A Symbolic Analysis of Relay and Switching Circuits. H. H. Goldstine, in his book The Computer from Pascal to Von Neumann, called Shannon's thesis one of the most important master's theses ever written (6).

Digital circuitry is based heavily on Boole's system. Yet, instead of theoretical ones and zeroes, it uses two physical states: on and off. Think of a button that, when pressed, sends electricity to a light, lighting it up. On its own, this does not offer much variety: press the button and the light goes on, release it and the light goes off. If a logic gate from Boolean logic is borrowed, say the AND gate, then a large number of possibilities open up. An AND gate sends electricity to the light if and only if it has two sources of electricity flowing into it. If two buttons each lead into the AND gate, then the light goes on only when both buttons are pushed, or rather when both buttons send electricity to the AND gate. While this is a simple example of how digital circuitry works, there are plenty of other, more useful applications, including lightning-quick mathematical calculations and permanent data storage that can be read and edited. Of course, those are rather complex designs to complete.
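To make the two-button example concrete, here is a minimal sketch in Python of the AND-gate behaviour described above; the names button_a, button_b, and and_gate are illustrative inventions, not taken from any source.

    def and_gate(a: bool, b: bool) -> bool:
        # The output is on (True) only when both inputs are on.
        return a and b

    # The light turns on only when both buttons are pressed.
    for button_a in (False, True):
        for button_b in (False, True):
            light = and_gate(button_a, button_b)
            print(f"button A={button_a!s:5} button B={button_b!s:5} -> light={'on' if light else 'off'}")

Running the loop prints all four input combinations and shows the light coming on only in the final case, where both buttons supply electricity to the gate.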

No matter where one looks, digital circuitry is prevalent. It forms the core of every digital device, ranging from something as simple as a bedside lamp to the digital computers used to browse the internet. Without the idea of digital circuitry, the world would be a vastly different place than it currently is. Electrical engineers quickly adapted Shannon's ideas on digital circuitry for use in World War II. Pocket calculators were created, removing the necessity for slide rules in many jobs. The home computer came into existence a couple of decades afterwards. Just about everything invented in the following decades that used electricity relied on Shannon's thesis. Even technology invented before Shannon's thesis was published could be redesigned into devices far more efficient, accurate, and of higher overall quality.


Communication across distances did not always mean crystal-clear messages, nor was communication properly understood. Before the mid-20th century, ideas about telegraphs, telephones, television, and similar devices were rigid and unadaptable. It was thought that telephone lines could carry only signals representing voices, and nothing else. A situation such as sending a video over a phone line would have been dismissed as fantasy back then, even though it is done routinely today. This all changed with the publication of Shannon's information theory in his paper, A Mathematical Theory of Communication.

Claude Shannon did not simply produce his theory after thinking for a little while. Although he can claim to have developed information theory fully, he built on research from a few decades earlier. Harry Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, contained several important ideas. For instance, Nyquist began to stray from focusing on the content of signals and instead focused on the fact that the signals carry information. Nyquist also developed a formula to determine the maximum amount of 'intelligence' that could be carried in a message. Of course, it still had its flaws, for it only worked on a telegraph wire. Four years later, in 1928, another engineer named R. V. L. Hartley wrote a paper that extended Nyquist's rule to more systems of transmission. This paper emphasised that transmitting information should depend only on making sure the transmission can be distinguished from start to end, without outside signals intruding and without worrying about the meaning of the information. Shannon cited the works of both Nyquist and Hartley in his paper, and when interviewed decades later, he mentioned the importance of their ideas to his own (**).

Now what is this 'all-important' information theory that keeps being mentioned? In its most basic form, it contains two parts. First, it gives a general definition and measurement of information. Information is based on the logarithm of the number of possible symbols available. Shannon used a logarithm of base 2, which means that the smallest unit of information is represented by either a zero or a one, which he called a bit. Sound familiar? The second part of the information theory details the limits on how much information can be sent, as well as the effect of outside interference, also called noise, on the information. In the past, engineers were limited in how much information could be sent, often thinking it depended on factors like frequency. Shannon used his theory to show that, by using the concept of entropy, or randomness, along with statistical probability, one can determine the maximum amount of information a channel can carry. Shannon also proved that information can be transmitted essentially error-free, no matter how much noise there may be, provided the channel's limits are respected.
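As a rough illustration of the first part, here is a short Python sketch of Shannon's measure of information in bits; the probability values are invented for illustration and the function name entropy_bits is not from any source.

    import math

    # Average information per symbol, in bits (logarithm base 2),
    # given the probabilities of the symbols a source can emit.
    def entropy_bits(probabilities):
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy_bits([0.5, 0.5]))    # a fair coin: exactly 1.0 bit per toss
    print(entropy_bits([0.99, 0.01]))  # a heavily biased coin: roughly 0.08 bits per toss

The biased coin carries far less information per toss because its outcome is almost never a surprise, which is exactly the sense in which Shannon tied information to randomness.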

The information theory itself can be complex to understand, yet its many benefits are easy to appreciate. The benefits that exist solely because of information theory are diverse. Not only is information theory used in communication and computing, but also in psychology, linguistics, and even thermal physics. Many plagiarism-detection programs use information theory to measure shared information. There is also coding theory, which in its simplest terms is error detection and correction. Computer systems, for example, rely on coding theory to catch and repair corrupted data. More visibly, it is the reason why CDs can still function properly even when scratched. Information theory led to data-compression techniques, which in turn led to new, useful file types such as ZIP and MP3. The theory is also crucial to the functioning of the internet. Even the success of space exploration programs depended on information theory to reduce the problems of noise and static caused by the enormous distances between planets.
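To give a feel for the error-correction idea, here is a toy Python sketch of a three-fold repetition code with majority-vote decoding; it is an illustration only, far simpler than the codes actually used on CDs or deep-space links, and the function names are invented for this example.

    # Each bit is sent three times; the receiver takes a majority vote per group.
    def encode(bits):
        return [b for b in bits for _ in range(3)]

    def decode(received):
        return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

    message = [1, 0, 1, 1]
    sent = encode(message)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
    sent[1] ^= 1                    # noise flips one bit in transit
    sent[6] ^= 1                    # ...and another bit in a different group
    print(decode(sent) == message)  # True: both single-bit errors are corrected

Each group of three can absorb one flipped bit and still be decoded correctly, which is the basic trade-off coding theory studies: extra redundancy in exchange for resistance to noise.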

The accomplishments of Claude Shannon had a large impact on how the United States developed, as well as on how the world lives today. Just about anything in the world that uses information exists thanks to Claude Shannon's work. Electronics, ranging from simple lamps to supercomputers, are all based upon digital circuitry. While it is arguable that someone else would have come up with a similar idea of digital circuitry within the next decade, the United States benefited most from its immediate discovery.

Shannon also worked on cryptology during World War II, helping to decode enemy transmissions and playing a large role in the encryption of US messages. Even today Claude Shannon's influence is felt, as new inventions are created that depend on his ideas. It is much like a tree, with Shannon as the trunk and new innovations as the branches that continue to grow outward.

Much of Shannon's success is due to his insatiable hunger for knowledge. Most of the world's population would be content to have done even a tenth of what Shannon did, yet Shannon himself never was. This is likely because of Shannon's childlike personality. In his spare time, he developed numerous small trinkets, not because his goal was fame and fortune, but simply because he wanted to. Why talk about his groundbreaking information theory when his juggling robots were just as important to him? Shannon simply did not want to stop thinking, even in old age, because inventing and theorizing was fun for him. In an interview he stated, "I am more interested in the elegance of a problem. Is it a good problem, an interesting problem?" [66] Near the end of his career, he worked on artificial intelligence. Computers that could match the best chess players were an intriguing idea, just like his mouse that could adapt to solve any maze, or his Rubik's Cube solver. While others his age were relaxing in retirement, Shannon enjoyed thinking of new ideas, theories, and discoveries. Talent or genius is not determined merely by the effort or intelligence of a person, but rather by their ability to pursue their curiosity wholeheartedly.

 
