History Of The Blue Brain Technology

1.INTRODUCTION

Blue Brain is essentially a virtual brain: an attempt to create a synthetic brain by reverse-engineering the mammalian brain down to the molecular level, so that a machine can function like a human brain. The human brain is extraordinarily complex, more complex than any circuitry in the world. Is it possible to create a virtual brain? Scientists are now researching how to create an artificial brain that can think, respond, take decisions and keep everything in memory. The ultimate aim is to upload a human brain into a machine, so that a person can think and take decisions without any effort, and so that after the death of the body the knowledge, intelligence, feelings and memory of that person are not lost and can be used for the welfare of human society. Some proponents predict that within about 30 years we will be able to scan ourselves into computers.

2.BACKGROUND

Alan Turing (1912-1954) started off by wanting to “build the brain” and ended up with a computer. In the 60 years that have followed, computation speed has gone from 1 floating point operation per second (FLOPS) to over 250 trillion – by far the largest man-made growth rate of any kind in the ~10,000 years of human civilization. This is a mere blink of an eye, a single generation, in the 5 million years of human evolution and billions of years of organic life. What will the future hold – in the next 10 years, 100 years, 1,000 years? These immense calculation speeds have revolutionized science, technology and medicine in numerous and profound ways. In particular, it is becoming increasingly possible to simulate some of nature’s most intimate processes with exquisite accuracy, from atomic reactions to the folding of a single protein, gene networks, molecular interactions, the opening of an ion channel on the surface of a cell, and the detailed activity of a single neuron. As calculation speeds approach and go beyond the petaFLOPS range, it is becoming feasible to make the next series of quantum leaps to simulating networks of neurons, brain regions and, eventually, the whole brain. Turing may, after all, have provided the means by which to build the brain.

On 1 July 2005, the Brain Mind Institute and IBM (International Business Machines) launched the Blue Brain Project. Using the enormous computing power of IBM’s prototype Blue Gene/L supercomputer, the aims of this ambitious initiative are to simulate the brains of mammals with a high level of biological accuracy and, ultimately, to study the steps involved in the emergence of biological intelligence.

Machine intelligence has a longer history: in 1997, IBM’s Deep Blue computer defeated the world chess champion Garry Kasparov. This defeat of a human master by a computer on such a complex cognitive task posed the question of whether the relevant world of an organism could simply be described by enough if-then conditions. Adaptation and learning algorithms have massively enhanced the power of such systems, but it could also be claimed that these approaches merely enable the system to automatically acquire more if-then rules. Regardless of the complexity of such an operation, the quality of the operation is much the same during any stage of the computation, and this form of intelligence could therefore be considered as ‘linear intelligence’.

3.BLUE COLUMN

The template – the Blue Column – will be composed of ~10,000 neocortical neurons within the dimensions of a neocortical column (~0.5 mm in diameter and ~1.5 mm in height). The Blue Column will include the different types of neuron in layer 1, multiple subtypes of pyramidal neuron in layers 2-6, spiny stellate neurons in layer 4, and more than 30 anatomical-electrical types of interneuron with variations in each of layers 2-6. In the rat somatosensory cortex, there are ~2,000 neurons in each of layers 2-6 (~1,500 in layer 5), ~25% of which are interneurons, although the proportions of different types of interneuron differ between layers. The neurons are connected according to the fraction of neurons targeted and are mapped together using axonal and dendritic maps derived experimentally (which show the location and distribution of the presynaptic boutons on the axons of the presynaptic neuron and of the synapses on the postsynaptic neuron). Synaptic connections are modelled from the physiological recordings, which provide the synaptic biophysics (conductances and kinetics) and dynamics (probability of release, depression and facilitation time constants). Synaptic plasticity rules are implemented locally and globally to allow adaptation of the NCC (neocortical column). These constraints provide the initial conditions for the NCC, and iterations between simulations and experiments are expected to provide further constraints on the model.

The Blue Brain workflow depends on a large-scale research infrastructure, providing:

1. State-of-the-art technology for the acquisition of data on different levels of brain organization.

2. An IBM Blue Gene/P supercomputer with 16,384 cores for modeling and simulation.

3. A data center providing networked servers for use in data archiving and neuroinformatics.

In Blue Brain's cellular-level models, the representation of the detailed electrophysiology and communication of a single neuron can require as many as 20,000 differential equations. No modern workstation is capable of solving this number of equations in biological real time. In other words, the only way for the project to achieve its goals is to use High Performance Computing (HPC).
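To give a concrete sense of what these differential equations look like, the sketch below integrates a single Hodgkin-Huxley compartment with simple forward-Euler time stepping. It is a minimal illustration under stated assumptions: the membrane parameters are the classic textbook squid-axon values, not the Blue Brain Project's detailed models, and a morphologically detailed Blue Brain neuron couples thousands of such compartments, which is where figures like 20,000 equations per cell come from.

    // Minimal single-compartment Hodgkin-Huxley neuron, integrated with
    // forward Euler. A detailed neuron couples thousands of such
    // compartments; the parameter values here are classic squid-axon ones.
    #include <cmath>
    #include <cstdio>

    struct HH {
        double v = -65.0;                    // membrane potential (mV)
        double m = 0.05, h = 0.6, n = 0.32;  // gating variables
        double gNa = 120.0, gK = 36.0, gL = 0.3;    // conductances (mS/cm^2)
        double eNa = 50.0, eK = -77.0, eL = -54.4;  // reversal potentials (mV)
        double cm = 1.0;                             // capacitance (uF/cm^2)

        static double alpha_m(double v) { return 0.1 * (v + 40.0) / (1.0 - std::exp(-(v + 40.0) / 10.0)); }
        static double beta_m (double v) { return 4.0 * std::exp(-(v + 65.0) / 18.0); }
        static double alpha_h(double v) { return 0.07 * std::exp(-(v + 65.0) / 20.0); }
        static double beta_h (double v) { return 1.0 / (1.0 + std::exp(-(v + 35.0) / 10.0)); }
        static double alpha_n(double v) { return 0.01 * (v + 55.0) / (1.0 - std::exp(-(v + 55.0) / 10.0)); }
        static double beta_n (double v) { return 0.125 * std::exp(-(v + 65.0) / 80.0); }

        void step(double i_ext, double dt) {
            double iNa = gNa * m * m * m * h * (v - eNa);
            double iK  = gK  * n * n * n * n * (v - eK);
            double iL  = gL  * (v - eL);
            double dv  = (i_ext - iNa - iK - iL) / cm;
            m += dt * (alpha_m(v) * (1.0 - m) - beta_m(v) * m);
            h += dt * (alpha_h(v) * (1.0 - h) - beta_h(v) * h);
            n += dt * (alpha_n(v) * (1.0 - n) - beta_n(v) * n);
            v += dt * dv;
        }
    };

    int main() {
        HH cell;
        const double dt = 0.01;              // ms
        for (int i = 0; i < 5000; ++i) {     // 50 ms of biological time
            cell.step(10.0, dt);             // constant 10 uA/cm^2 stimulus
            if (i % 100 == 0) std::printf("t=%5.2f ms  v=%7.2f mV\n", i * dt, cell.v);
        }
    }

Even this toy model requires a hundred or more small time steps per millisecond of biological time, which makes clear why 10,000 morphologically detailed neurons demand a supercomputer rather than a workstation.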

4.WORKING

A) INPUT:

Just as natural neurons receive signals in the brain, scientists have already created artificial neurons on silicon chips that can take their place. It has also been demonstrated that these artificial neurons can receive input from sensory cells. The electric impulses from the sensory cells can therefore be received through these artificial neurons and sent to a supercomputer for interpretation.

B) INTERPRETATION:

The interpretation of the electric impulses received by the artificial neurons can be done by means of a set of registers. The different values in these registers will represent different states of the brain.

C) OUTPUT:

Similarly, based on the states of the registers, output signals can be sent to the artificial neurons in the body, which will then be received by the sensory cells.

D) MEMORY:

Data can be stored permanently in secondary memory. In the same way, the required states of the registers can be stored permanently, and this information can be retrieved and used whenever it is needed.

E) PROCESSING:

In a similar way, decision making can be performed by the computer using the stored states and the received input, by carrying out arithmetic and logical operations. A simplified sketch tying these five steps together is given below.
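The following sketch ties subsections A-E together in code. It is purely illustrative: the impulse values, register width, quantization rule, decision threshold and file name are all invented for the example, and a real brain-machine interface would of course be enormously more complex.

    // Illustrative sketch of the register-based pipeline described in A-E:
    // input impulses -> register states -> decision -> output, with the
    // register states persisted to secondary storage. All values are invented.
    #include <array>
    #include <cstdint>
    #include <cstdlib>
    #include <fstream>
    #include <iostream>
    #include <string>

    using Registers = std::array<std::uint16_t, 4>;  // hypothetical "brain state" registers

    // B) Interpretation: map raw impulse amplitudes onto register states.
    Registers interpret(const std::array<double, 4>& impulses) {
        Registers r{};
        for (std::size_t i = 0; i < impulses.size(); ++i)
            r[i] = static_cast<std::uint16_t>(impulses[i] * 1000.0);  // quantize
        return r;
    }

    // E) Processing: a toy decision rule comparing stored and current states.
    bool decide(const Registers& current, const Registers& stored) {
        long diff = 0;
        for (std::size_t i = 0; i < current.size(); ++i)
            diff += std::abs(long(current[i]) - long(stored[i]));
        return diff > 500;  // arbitrary threshold for the example
    }

    // D) Memory: persist register states to secondary storage.
    void save(const Registers& r, const std::string& path) {
        std::ofstream(path, std::ios::binary)
            .write(reinterpret_cast<const char*>(r.data()), sizeof(r));
    }

    int main() {
        std::array<double, 4> impulses{0.12, 0.40, 0.05, 0.93};  // A) input (invented)
        Registers state = interpret(impulses);                   // B) interpretation
        Registers previous{100, 420, 60, 900};                   // earlier stored state
        bool act = decide(state, previous);                      // E) processing
        save(state, "brain_state.bin");                          // D) memory
        std::cout << (act ? "send output impulse\n" : "no action\n");  // C) output
    }

The point of the sketch is only the flow of data: sensory input is quantized into register states, compared against stored states, and the result of a simple logical test determines the output.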

5.ARCHITECTURE OF BLUE GENE

Blue Gene is built using system-on-a-chip technology, in which all the functions of a node are integrated onto a single Application-Specific Integrated Circuit (ASIC). This ASIC includes two PowerPC 440 cores running at 700 MHz. Associated with each core is a 64-bit “double” floating-point unit (FPU). This leads to a peak performance of 5.6 billion floating-point operations per second (5.6 GFLOPS) per node. The two cores can be used in “coprocessor” mode, in which one core computes while the other handles I/O, or in “virtual node” mode, in which both cores compute. The aggregate peak performance of the machine then follows from these per-node figures and the number of nodes. The Blue Brain Project’s Blue Gene is a 4-rack system with 4,096 nodes, equal to 8,192 CPUs, and a peak performance of 22.4 TFLOPS. A full 64-rack machine should provide about 180 TFLOPS sustained, or 360 TFLOPS at peak performance.
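As a back-of-the-envelope check on these figures, the short calculation below reproduces the quoted peak rates, assuming that each PowerPC 440 core can retire four floating-point operations per cycle (two fused multiply-adds on the “double” FPU); sustained performance on real workloads is considerably lower.

    // Rough reconstruction of the quoted Blue Gene/L peak figures.
    // Assumption: each core retires 4 FLOP per cycle (2 fused multiply-adds
    // on the "double" FPU); real sustained rates are lower than peak.
    #include <cstdio>

    int main() {
        const double clock_hz       = 700e6;  // 700 MHz
        const double flop_per_cycle = 4.0;    // assumed: 2 FMAs per cycle per core
        const int    cores_per_node = 2;
        const int    nodes          = 4096;   // 4-rack system

        double node_peak    = cores_per_node * clock_hz * flop_per_cycle;  // ~5.6 GFLOPS
        double machine_peak = node_peak * nodes;                           // ~22.9 TFLOPS

        std::printf("per-node peak: %.1f GFLOPS\n", node_peak / 1e9);
        std::printf("4-rack peak:   %.1f TFLOPS\n", machine_peak / 1e12);
    }

This gives about 5.6 GFLOPS per node and roughly 22.9 TFLOPS for 4,096 nodes, in the same range as the 22.4-22.8 TFLOPS figures quoted in this essay.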

6.MODELLING THE MICROCIRCUIT

The scheme shows the minimal essential building blocks required to reconstruct a neural microcircuit. Microcircuits are composed of neurons and synaptic connections. To connect any two types of neuron, statistics are required on which part of the axonal arborization is used to contact which regions of the target neuron, how many synapses are involved in forming connections, and the connectivity between any two types of neuron. There is therefore a minimal size of microcircuit and a minimal complexity of neuronal morphology that can fully sustain a neuron. A massive increase in computational power is required to make this quantum leap – an increase that is provided by IBM’s Blue Gene supercomputer.

The first phase of the project is to build a cellular-level model of the two-week-old rat somatosensory neocortex. The combination of infrared differential interference contrast microscopy in brain slices and multi-neuron patch-clamping has allowed the systematic quantification of the molecular, morphological and electrical properties of the different neurons and their synaptic pathways, in a manner that allows an accurate reconstruction of the column. Over the past 10 years, the laboratory has prepared for this reconstruction by developing the multi-neuron patch-clamp approach, recording from thousands of neocortical neurons and their synaptic connections, and developing quantitative approaches to allow a complete numerical breakdown of the elementary building blocks of the NCC. The recordings have mainly been in the 14-16-day-old rat somatosensory cortex, which is a highly accessible region on which many researchers have converged following a series of pioneering studies driven by Bert Sakmann. Much of the raw data is located in the project’s databases, and a major initiative is under way to make all these data freely available in a publicly accessible database. The so-called ‘blueprint’ of the circuit, although not entirely complete, has reached a sufficient level of refinement to begin the reconstruction at the cellular level. The NCC should not be overly specialized, because this could make generalization to other neocortical regions difficult, but areas such as the barrel cortex do offer the advantage of highly controlled in vivo data for comparison.
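The release probability and the depression and facilitation time constants mentioned in the Blue Column section are exactly the parameters of dynamic-synapse models in the style of Tsodyks and Markram. The sketch below implements one common per-spike form of such a model; the parameter values are placeholders chosen for illustration, not measured Blue Brain constants.

    // One common per-spike form of the Tsodyks-Markram dynamic-synapse model,
    // parameterized by the quantities named in the text: baseline release
    // probability U, depression (recovery) time constant tau_rec, and
    // facilitation time constant tau_facil. Values are placeholders.
    #include <cmath>
    #include <cstdio>

    struct TMSynapse {
        double U = 0.5;          // baseline release probability (utilization)
        double tau_rec   = 800;  // ms, recovery from depression
        double tau_facil = 50;   // ms, decay of facilitation
        double A = 1.0;          // absolute synaptic efficacy (arbitrary units)
        double R = 1.0;          // fraction of synaptic resources available
        double u = 0.0;          // running utilization (facilitation variable)

        // Response amplitude for a spike arriving dt ms after the previous one.
        double spike(double dt) {
            R = 1.0 - (1.0 - R) * std::exp(-dt / tau_rec);  // resources recover
            u = u * std::exp(-dt / tau_facil);               // facilitation decays
            u = u + U * (1.0 - u);                           // facilitation jump at spike
            double amplitude = A * R * u;                    // transmitted response
            R = R * (1.0 - u);                               // resources consumed
            return amplitude;
        }
    };

    int main() {
        TMSynapse syn;
        // A regular 20 Hz presynaptic train: successive responses depress.
        for (int n = 0; n < 8; ++n)
            std::printf("spike %d  amplitude %.3f\n", n + 1, syn.spike(50.0));
    }

Running it for a regular 20 Hz train shows the successive responses depressing, because synaptic resources are consumed faster than they recover; with other parameter choices the same equations produce facilitating synapses.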

7.UPLOADING THE HUMAN BRAIN

The uploading of a human brain may be possible through the use of tiny robots known as nanobots. These robots would be small enough to travel throughout our circulatory system. Travelling into the spine and brain, they would be able to monitor the activity and structure of our central nervous system. They would be able to provide an interface with the computer while we still reside in our biological form. Nanobots could also carefully scan the structure of our brain, providing a complete readout of its connections. A computer loaded with this information could then continue to function as we do. In this way, the data stored in the entire brain could be loaded into the computer.

8.SIMULATING THE MICROCIRCUIT

Once the microcircuit is built, the exciting work of making the circuit function can begin. All 8,192 processors of the Blue Gene are pressed into service in a massively parallel computation, solving the complex mathematical equations that govern the electrical activity in each neuron when a stimulus is applied. As the electrical impulse travels from neuron to neuron, the results are communicated between processors using the Message Passing Interface (MPI). Currently, the time required to simulate the circuit is about two orders of magnitude larger than the biological time being simulated. The Blue Brain team is working to streamline the computation so that the circuit can function in real time, meaning that one second of activity can be modeled in one second of computing time.
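The paragraph above describes the basic parallel pattern: in each time step every processor integrates its own subset of neurons and then exchanges spike events with the others over MPI. The sketch below shows that pattern in miniature; the neuron update is reduced to a trivial placeholder rule, and the round-robin partitioning of neurons over ranks is far cruder than the load-balanced mapping a real simulator would use.

    // Minimal illustration of the simulate-then-exchange loop: each MPI rank
    // advances its share of (placeholder) neurons by one time step, then all
    // ranks exchange the ids of neurons that spiked. Compile with mpicxx.
    #include <mpi.h>
    #include <cstdio>
    #include <vector>

    int main(int argc, char** argv) {
        MPI_Init(&argc, &argv);
        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        const int total_neurons = 10000;  // column-sized toy network
        const int steps = 100;

        for (int step = 0; step < steps; ++step) {
            // 1) Local update: integrate this rank's neurons for one time step.
            //    (Placeholder rule: a neuron "spikes" on a fixed schedule.)
            std::vector<int> local_spikes;
            for (int n = rank; n < total_neurons; n += size)
                if ((n + step) % 1000 == 0) local_spikes.push_back(n);

            // 2) Exchange: gather everyone's spike counts, then the spike ids.
            int local_count = static_cast<int>(local_spikes.size());
            std::vector<int> counts(size), displs(size);
            MPI_Allgather(&local_count, 1, MPI_INT, counts.data(), 1, MPI_INT,
                          MPI_COMM_WORLD);
            int total = 0;
            for (int r = 0; r < size; ++r) { displs[r] = total; total += counts[r]; }
            std::vector<int> all_spikes(total);
            MPI_Allgatherv(local_spikes.data(), local_count, MPI_INT,
                           all_spikes.data(), counts.data(), displs.data(), MPI_INT,
                           MPI_COMM_WORLD);

            // 3) The gathered spikes would now be delivered to local synapses.
            if (rank == 0 && step % 20 == 0)
                std::printf("step %3d: %d spikes network-wide\n", step, total);
        }
        MPI_Finalize();
    }

In the real simulation the local update step is by far the most expensive part, which is why bringing the circuit to biological real time is chiefly a matter of streamlining that computation and its communication.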

9.INTERPRETING THE DATA

Running the Blue Brain simulation generates huge amounts of data. Analyses of individual neurons must be repeated thousands of times, and analyses dealing with network activity must handle data that easily reach hundreds of gigabytes per second of simulation. Using massively parallel computers, the data can be analyzed where they are created.

Given the geometric complexity of the column, a visual exploration of the circuit is an important part of the analysis. Mapping the simulation data onto the morphology is invaluable for immediate verification of single-cell activity as well as network phenomena. Architects at EPFL have worked with the Blue Brain developers to design a visualization interface that translates the Blue Gene data into a 3D visual representation of the column. A separate supercomputer is used for this computationally intensive task. Visualizing the neurons’ shapes is a challenging task: a column of 10,000 neurons rendered as a high-quality mesh amounts to roughly 1 billion triangles, for which about 100 GB of management data is required. Simulation data with the resolution of individual electrical compartments for each neuron account for another 150 GB. As the electrical impulse travels through the column, neurons light up and change color as they become electrically active. A visual interface makes it possible to quickly identify areas of interest that can then be studied more extensively using further simulations. A visual representation can also be used to compare the simulation results with experiments that show electrical activity in the brain.
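The quoted mesh figure is easy to sanity-check. Assuming a high-quality mesh of roughly 100,000 triangles per neuron and roughly 100 bytes of vertex, index and management data per triangle (both figures are assumptions made only for this estimate), a 10,000-neuron column does indeed come to about a billion triangles and about 100 GB:

    // Back-of-the-envelope check on the visualization data volume.
    // Assumptions: ~100,000 triangles per high-quality neuron mesh and
    // ~100 bytes of vertex/index/management data per triangle.
    #include <cstdio>

    int main() {
        const double neurons            = 10000;
        const double triangles_per_cell = 1e5;    // assumed mesh resolution
        const double bytes_per_triangle = 100.0;  // assumed storage cost

        double triangles = neurons * triangles_per_cell;    // ~1e9 triangles
        double bytes     = triangles * bytes_per_triangle;  // ~1e11 bytes
        std::printf("triangles: %.1e, data: %.0f GB\n", triangles, bytes / 1e9);
    }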

10.HARDWARE & SOFTWARE REQUIREMENT

22.8 TFLOPS peak processing speed.

8,192 CPUs at 700 MHz.

256-512 MB of memory per processor.

Linux and C++ software.

100 kW power consumption.

Very powerful nanobots to act as the interface between the natural brain and the computer.

11.APPLICATIONS

A) Gathering and Testing 100 Years of Data:

The most immediate benefit is to provide a working model into which the past 100 years’ knowledge about the microstructure and workings of the neocortical column can be gathered and tested. The Blue Column will therefore also produce a virtual library for exploring in 3D the micro-architecture of the neocortex and for accessing all key research relating to its structure and function.

B) Cracking the Neural Code:

The Neural Code refers to how the brain builds objects using electrical patterns.

The NCC is the elementary network for computing in the neocortex. Creating an accurate replica of the NCC that faithfully reproduces the emergent electrical dynamics of the real microcircuit is an absolute requirement for revealing how the neocortex processes, stores and retrieves information.

C) Understanding Neocortical Information Processing:

The power of an accurate simulation lies in the predictions that can be generated about the neocortex. Indeed, iterations between simulations and experiments are essential to build an accurate copy of the NCC.

D) A Novel Tool for Drug Discovery for Brain Disorders:

Understanding the functions of the different elements and pathways of the NCC will provide a concrete foundation for exploring the cellular and synaptic bases of a wide spectrum of neurological and psychiatric diseases. The impact of receptor, ion-channel, cellular and synaptic deficits could be tested in simulations, and the optimal experimental tests could then be determined.

E) A Global Facility:

A software replica of an NCC will allow researchers to explore hypotheses of brain function and dysfunction, accelerating research. Simulation runs could determine which parameters should be used and measured in experiments. An advanced 2D, 3D and immersive 3D visualization system will allow “imaging” of many aspects of neural dynamics during the processing, storage and retrieval of information.

F) A Foundation for Whole Brain Simulations:

With current and foreseeable future computer technology it seems unlikely that a mammalian brain can be simulated with full cellular and synaptic complexity. However, knowledge of the NCC architecture can be transferred to facilitate the reconstruction of subcortical brain regions.

G) A Foundation for Molecular Modeling of Brain Function:

An accurate cellular replica of the neocortical column will provide the first and essential step towards a gradual increase in model complexity, moving towards a molecular-level description of the neocortex in which biochemical pathways are simulated. The NCC lies at the interface between the genes and complex cognitive functions. This level of simulation will become a reality with the most advanced phase of Blue Gene development.

12.ADVANTAGES

We can remember things without any effort.

Decisions can be made without the person being physically present.

Even after a person’s death, his or her intelligence could still be used.

The activity of different animals could be understood: by interpreting the electric impulses from an animal’s brain, its thinking could be understood more easily.

It would allow the deaf to hear via direct nerve stimulation, and it could also help with many psychological diseases: by downloading the contents of a brain that had previously been uploaded into the computer, a person might recover from mental illness.

13.LIMITATIONS

We would become dependent upon computer systems.

Others may use technical knowledge against us.

Computer viruses will pose an increasingly critical threat.

The real threat, however, is the fear that people will have of new technologies.

That fear may culminate in widespread resistance. Clear evidence of this type of fear is found today with respect to human cloning.

14.CONCLUSION

In conclusion, we will, at some point, be able to transfer ourselves into computers. Most arguments against this outcome are comparatively easy to circumvent: they are either simple-minded, or they simply require more time for the technology to mature. The only serious threats raised can also be overcome by combining biological and digital technologies.

 
