Difference Between Neural Networks And Conventional Computers Information Technology Essay
The first artificial neuron was proposed in 1943 by the neurophysiologist Warren McCulloch and the logician Walter Pitts.
The main applications of artificial neural networks are pattern recognition and deriving meaningful information from complicated or imprecise data. They can be used to extract patterns and detect trends that are too complex for either humans or conventional computers to notice.
Conventional computers can only process information they have been explicitly given; they use an algorithmic approach and follow instructions in order to solve a problem. As a result, they can only solve problems that we already understand and know how to solve.
Neural networks, in contrast to algorithmic computers, learn from previously processed data. They can extract patterns from data that a conventional computer or human evaluation cannot, and they can solve a variety of problems including pattern recognition, optimization, and associative memory.
Biological Neural networks:
‘Neurons are the basic signalling units of the nervous system and each neuron is a discrete cell whose several processes arise from its cell body’. These are responsible for processing information and also for transmitting and receiving information. Each part of the neuron plays a role in the communication of information throughout the body.
A neuron has four major regions:
The Cell Body (Soma)
The dendrites
The axon
Presynaptic terminal buttons
Cell body: This is where the dendrites join and where the signal is processed and passed on. The soma and nucleus do not play an active role in the transmission of neural signals.
Dendrites: These are treelike extensions at the beginning of a neuron that increase the surface area of the cell body and are covered with synapses. They receive information from other neurons and transmit electrical stimulation to the soma.
The axon: This is the elongated fibre that extends from the cell body to the terminal endings and transmits the neural signal. The larger the diameter of the axon, the faster it transmits information. Some axons are covered with a fatty substance called myelin that acts as an insulator; these myelinated axons transmit information much faster than axons without the myelin insulation.
The terminal buttons: These are located at the end of the neuron and are responsible for transmitting the signal on to other neurons. At the end of each terminal button is a gap known as a synapse; neurotransmitters carry the signal across the synapse to the other neurons.
Structure of a Neuron
[diagram ref:www.bcb.uwc.ac.za/…/mammal/images/neuron.gif]
Millions of such neurons are interconnected to process and store data: some take inputs, some send outputs, and the rest are responsible for processing and storing information.
Processing and Transmission of information in BNN:
The inside of the cell body is filled with intracellular fluid and the outside with extracellular fluid. When a neuron is excited above a certain threshold, it transmits an electrical signal along the axon. The end of the axon connects to the dendrites of other neurons, and this junction is called a synapse. A single neuron may form 10^3 to 10^4 such connections, and these neurons may in turn be connected to around 1,000 other neurons. There is a specific pattern in which they are excited, and when the threshold potential is reached, chemical messengers (neurotransmitters) are released. This induces the transmission of information to the other neurons that are connected to the source neuron in a three-dimensional manner.
Synapse
[img ref: http://rstb.royalsocietypublishing.org/content/362/1479/473.full.pdf]
Mathematical Model of Neural Network:
An artificial neural network is an abstraction of mathematical models of biological nervous systems. After the introduction of the simplified neuron by McCulloch and Pitts, a first wave of interest in neural networks emerged.
Artificial neurons, often simply called neurons, are the fundamental processing elements of neural networks. The properties of the synapses are represented by connection weights that modulate the effect of the associated input signals, and the non-linear characteristic of the neuron is represented by a transfer function. The neuron impulse is then computed as the weighted sum of the input signals, transformed by the transfer function. The learning capability of an artificial neuron is achieved by adjusting the weights according to the chosen learning algorithm.
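To make this concrete, here is a minimal sketch of such a neuron in Python (the input values, weights, and bias below are arbitrary assumptions, and the logistic sigmoid is used as the transfer function):

import math

def neuron(inputs, weights, bias):
    # neuron impulse: weighted sum of the input signals ...
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    # ... transformed by a non-linear transfer function (logistic sigmoid)
    return 1.0 / (1.0 + math.exp(-s))

# example: three inputs with assumed weights; learning would adjust these weights
print(neuron([0.5, 1.0, 0.2], weights=[0.4, -0.6, 0.9], bias=0.1))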
Artificial Neuron network
A simple neuron:
An artificial neuron has several inputs and one output, and it has two modes of operation: the training mode and the using mode. In the training mode, the neuron is trained to respond to particular input patterns, whereas in the using mode, when a trained input pattern is detected at the input, its associated output becomes the current output.
A Simple Neuron
Why use Neural Networks:
Neural networks have the capability to tackle problems which do not have an algorithmic solution, or for which the solution is too complex to be found explicitly. Problems that people are good at solving, such as prediction and pattern recognition, can also be dealt with using neural networks. Neural networks have been applied within the medical field to clinical diagnosis, signal analysis and interpretation, image analysis and interpretation, and drug development.
Neural networks are well suited to multivariate non-linear problems. They can be used to detect all possible interactions between predictor variables, and several training algorithms are readily available. Unlike conventional computers, neural networks take a distinctive approach to solving problems: a computer follows a fixed set of instructions, which gives it a more limited problem-solving capability than a neural network. Neural networks process information in much the same manner as the human brain. They can be used to carry out statistical modelling and offer an innovative alternative to logistic regression, as a collection of interconnected neurons working together on a complex problem. Disadvantages include the large computational effort required to reduce overfitting, the "black box" nature of the model, the greater computational burden, the empirical nature of model development, and the need for large samples.
Parallel distributed computing can overcome the difficulty that serial computation has with processing massive amounts of data. Biological systems achieve very high effective processing rates even though their individual components are far slower than digital computers; this contrast is evidence of the power of massive parallelism.
A typical neural network is composed of input units X1, X2, … corresponding to independent variables, a hidden layer (known as the first layer), and an output layer (the second layer) whose output units Y1, … correspond to dependent variables.
where X1, X2 are the input units,
H1, H2 are the hidden units, and
Y1 is the output unit.
Weight matrices W(1) and W(2) hold the adjustable connection weights between the layers. The values of the hidden and output units are obtained from formulas of the form

H[j] = f( Sum( W(1)[j][i]*X[i] ) over the inputs i )
Y[k] = f( Sum( W(2)[k][j]*H[j] ) over the hidden units j )
The activation function f is usually of sigmoid form and may be, for example, the logistic function f(x) = 1 / (1 + exp(-x)) or the hyperbolic tangent.
The approximation produced by the network is improved with an iterative learning procedure in which the outputs for a variety of input vectors are compared with their targets T and an average error term E is computed, for example the mean sum-of-squares error

E = (1/2) * average over training vectors of Sum( (Y[k] - T[k])^2 ) over the outputs k
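As a small illustration of these formulas, the following Python sketch (using NumPy, with arbitrary assumed weight matrices rather than trained ones) computes the hidden units, the output, and the error term for a single input vector:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X  = np.array([0.2, 0.7])          # input units X1, X2
W1 = np.array([[0.5, -0.3],        # W(1): weights from inputs to hidden units H1, H2
               [0.8,  0.1]])
W2 = np.array([[0.4, -0.9]])       # W(2): weights from hidden units to output Y1

H = sigmoid(W1 @ X)                # hidden unit values H = f(W(1) X)
Y = sigmoid(W2 @ H)                # output value       Y = f(W(2) H)

T = np.array([1.0])                # target for this input vector
E = 0.5 * np.sum((Y - T) ** 2)     # sum-of-squares error for this single pattern
print(H, Y, E)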
Pattern Recognition:
Pattern recognition is an important application of neural networks and can be carried out using a trained feed-forward neural network. In order to associate outputs with input patterns, the network must first be trained; it then recognizes an input pattern and attempts to produce the associated output pattern. The real power of a neural network appears when a pattern that has no associated output is given as an input: in that case, the network produces the output that corresponds to the trained input pattern least different from the given pattern.
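A minimal sketch of this behaviour using scikit-learn's MLPClassifier (the 4-element patterns, class labels, and network size are illustrative assumptions): the network is trained on two patterns and then shown a corrupted pattern, for which it returns the class of the trained pattern it differs from the least.

import numpy as np
from sklearn.neural_network import MLPClassifier

# two trained input patterns (4-element vectors) and their associated outputs
X_train = np.array([[1, 1, 0, 0],
                    [0, 0, 1, 1]])
y_train = np.array([0, 1])

net = MLPClassifier(hidden_layer_sizes=(4,), max_iter=5000, random_state=0)
net.fit(X_train, y_train)

# a pattern never seen in training; the network answers with the class of
# the trained pattern that is least different from it
print(net.predict(np.array([[1, 0, 0, 0]])))   # expected: class 0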
Types of Neural Network:
Neural networks can be classified on the basis of:
Feedforward / Feedback
Global /Local Learning
Supervised / Unsupervised learning
Feedforward networks:
These networks permit signals to travel one way only, from input to output. There is no feedback (loops); the output of any layer does not affect that same layer. Feed-forward ANNs are widely used in pattern recognition. This kind of organisation is also referred to as bottom-up or top-down.
Simple Feedforward Networks
Feedback Networks
These networks permit signals to travel in both directions by introducing loops into the network. They are very powerful but can become exceptionally complicated. They have connections from outputs back to inputs, so a neuron may use its own output as an input.
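A very small sketch of such a feedback connection in Python (the weights below are arbitrary assumptions): the new state of each unit depends on the current external input and on the previous outputs fed back into the network.

import numpy as np

def feedback_step(x_t, h_prev, W_in, W_back):
    # the output at time t depends on the current input and on the
    # previous outputs, which are fed back as additional inputs
    return np.tanh(W_in @ x_t + W_back @ h_prev)

W_in   = np.array([[0.6], [0.2]])      # external input -> units
W_back = np.array([[0.0, 0.5],         # feedback: unit outputs -> unit inputs
                   [0.3, 0.0]])

h = np.zeros(2)                        # initial outputs
for x in [np.array([1.0]), np.array([0.0]), np.array([1.0])]:
    h = feedback_step(x, h, W_in, W_back)
    print(h)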
Global/Local Learning
In global learning, the adjustment of each weight affects all outputs of the network; the backpropagation network is an example.
In local learning, a weight adjustment affects only one or a few outputs.
Global learning is used when the training examples are all available in advance.
Local learning is useful for real-time applications.
Supervised/Unsupervised:
Supervised learning requires the learning process to be supervised: training examples, together with their target outputs, are supplied.
Unsupervised learning (competitive, self-organising) – neurons compete to respond to input patterns; no explicit guidance is provided from the external environment.
Perceptron:
The perceptron was introduced in the late 1950s by Frank Rosenblatt, who coined the term, in an attempt to understand human memory, learning, and cognitive processes. With some additional, fixed pre-processing, the perceptron becomes an MCP model, i.e. a neuron with weighted inputs. The components labelled A1, A2, …, Aj, …, Ap are known as association units; their function is to extract specific, localised features from the input images. Perceptrons imitate the essential idea behind the mammalian visual system. Although they were capable of much more, they were mainly used for pattern recognition. In 1969, Minsky and Papert published a book describing the limitations of single-layer perceptrons; it showed precisely that a single-layer perceptron cannot carry out certain pattern-recognition operations, such as deciding whether a shape is connected or not. In the 1980s it was realised that, given suitable training, multilayer perceptrons can perform these operations.
[img ref: http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.perceptron.jpg]
Model of a Perceptron:
         +----------------+
A[1] o---| W[1]       T   |
A[2] o---| W[2]           |     +----------------------+
  .      |  .        Sum  |-----| Squarewave Generator |-----> Output
  .      |  .             |     +----------------------+
  .      |  .             |
A[n] o---| W[n]           |
         +----------------+
S = T + Sum( W[i]*A[i] ) as i goes from 1 -> n
Output = 1 if S > 0; else -1
where A[i] are the perceptron's inputs,
W[i] are the weights applied to the corresponding inputs, and
T is the threshold.
The squarewave generator simply turns the sum into a positive or negative number: whenever we present an input to the perceptron, we get a positive or negative output depending on the inputs, the weights, and the threshold.
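This rule translates directly into code; the following sketch (the inputs, weights, and threshold are arbitrary assumptions) computes S and the squarewave output exactly as described above:

def perceptron(A, W, T):
    # S = T + Sum( W[i]*A[i] ) as i goes from 1 -> n
    S = T + sum(w * a for w, a in zip(W, A))
    # squarewave generator: positive sum -> +1, otherwise -1
    return 1 if S > 0 else -1

# example with three inputs and assumed weights / threshold
print(perceptron(A=[1, 0, 1], W=[0.5, -0.4, 0.3], T=-0.6))   # S = 0.2 -> output 1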
Perceptrons are of two types:
Single layer network
Multilayer network
Single layer network:
A single-layer perceptron network is composed of one or more artificial neurons in parallel. Each neuron provides one network output and is usually connected to all of the external inputs.
Multilayer network:
A multilayer perceptron is a feedforward artificial neural network model that maps sets of input data onto a set of appropriate outputs. Simple perceptrons are restricted in their representational capabilities; for example, a single layer cannot represent the XOR function. However, XOR can easily be represented by a multilayer perceptron, since it can be written in terms of the basic functions AND, OR, and NOT, as shown in the sketch below.
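A minimal sketch of this construction (the particular weights and thresholds are illustrative choices): the hidden layer computes OR(a, b) and AND(a, b), and the output unit combines them as AND(OR(a, b), NOT(AND(a, b))), which is exactly XOR.

def step(x):
    # threshold unit: fires (1) when its net input is positive
    return 1 if x > 0 else 0

def xor_mlp(a, b):
    h_or  = step(a + b - 0.5)          # hidden unit 1: OR(a, b)
    h_and = step(a + b - 1.5)          # hidden unit 2: AND(a, b)
    return step(h_or - h_and - 0.5)    # output: OR AND NOT(AND)  ->  XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_mlp(a, b))     # prints the XOR truth table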
Characteristics of Neural Networks:
Autonomy
Modelling capabilities
Understandability
Adaptability
Robustness
Complexity
Processing units
Each unit performs a relatively simple job: receive input from neighbours or external sources and use this to compute an output signal which is propagated to other units. Apart from this processing, a second task is the adjustment of the weights. The system is inherently parallel in the sense that many units can carry out their computations at the same time. Within neural systems it is useful to distinguish three types of units: input units (indicated by an index i) which receive data from outside the neural network, output units (indicated by an index o) which send data out of the neural network, and hidden units (indicated by an index h) whose input and output signals remain within the neural network. During operation, units can be updated either synchronously or asynchronously. With synchronous updating, all units update their activation simultaneously; with asynchronous updating, each unit has a (usually fixed) probability of updating its activation at a time t, and usually only one unit will be able to do this at a time. In some cases the latter model has some advantages.
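The difference between the two update schemes can be sketched on a toy recurrent network (the symmetric weights below are arbitrary assumptions): synchronous updating recomputes every unit from the same previous state, while asynchronous updating changes one randomly chosen unit at a time.

import numpy as np

rng = np.random.default_rng(0)
W = np.array([[ 0.0, 0.8, -0.5],       # assumed symmetric connection weights
              [ 0.8, 0.0,  0.4],
              [-0.5, 0.4,  0.0]])

def synchronous_update(s):
    # every unit is updated simultaneously from the same old state
    return np.where(W @ s >= 0, 1, -1)

def asynchronous_update(s):
    # one randomly chosen unit is updated; the others keep their activation
    s = s.copy()
    i = rng.integers(len(s))
    s[i] = 1 if W[i] @ s >= 0 else -1
    return s

state = np.array([1, -1, 1])
print(synchronous_update(state))
print(asynchronous_update(state))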