NEURAL COMPUTER AND UNLIMITED MEMORY

IDEA PROPOSED BY SREEKANTH PM, KASARGOD (DT), KERALA. PH: +919747629216. HTTP://crazydiode.blogspot.com

NOTE: THIS DOCUMENT DOES NOT CONTAIN A DETAILED REPORT ON UNLIMITED MEMORY AND THE NEURAL COMPUTER. THE WORK IS STILL UNDER RESEARCH. THOSE INTERESTED IN PARTICIPATING IN THE RESEARCH PROGRAMME MAY CALL US AT +919747629216 OR MAIL US AT [email protected]

Mail your resume here; only talented persons will get a chance.

NEURAL COMPUTER AND UNLIMITED MEMORY

INTRODUCTION

The 21st century is an era of new inventions and discoveries, and the main experiments of this era are being conducted on neural networks, IT and robotic technology. IT and robotics are already widely used in many areas, and in neural networks more inventions are on the way. We have developed many supercomputers, such as PARAM 1, 2, etc. These supercomputers were built to store large volumes of data and to provide high processing speed, but they have limitations, as seen in the LARGE HADRON COLLIDER, an experiment being conducted in Europe that is also known as the largest and most dangerous experiment of this century. That experiment studies the condition of the earth and the universe before the Big Bang. Every piece of the resulting data is important and cannot be discarded, and it arrives within fractions of a second, because such experiments work at speeds close to the speed of light, so a highly efficient computer is needed to save that data. If the results coming out of the hadron collider were stored on compact discs and the discs were lined up, they would stretch about 20 km in a single day. Here we need a large memory with high processing speed. For this experiment they use nodes of supercomputers, and we also know that secondary memory executes far more slowly than a processor. Therefore we can propose an idea: an unlimited memory, with processing speed beyond our imagination. That is my topic: the neural computer, innovative thoughts and dreams, and an unlimited memory.

NEURAL COMPUTER

ABSTRACT

This is a completely new idea: the aim is to make ourselves into a computer and, given the resources, to build an unlimited storage memory through biological and electronic concepts. The concept works on the basic principles of neural networks. Neural networks cannot produce miracles, but used sensibly they can produce some amazing results. Here we are trying to make an artificial neural network. An Artificial Neural Network (ANN) is an information processing paradigm that is inspired by the way biological nervous systems, such as the brain, process information. The key element of this paradigm is the novel structure of the information processing system. It is composed of a large number of highly interconnected processing elements (neurones) working in unison to solve specific problems. ANNs, like people, learn by example. An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process. Learning in biological systems involves adjustments to the synaptic connections that exist between the neurones. This is true of ANNs as well.

In this topic our aim is to store data in the brain or in genomes, in a way that helps us remember, through an electronic device. We do not think this is a one-year project. For this project we first consider how a neuron is activated while we learn something and which type of pattern it generates while we are learning; then we couple the neuron to an electronic system, because we know that a neuron produces an electrical signal corresponding to its input. For this we have to know how the human brain learns something. Much is still unknown about how the brain trains itself to process information, so theories abound. In the human brain, a typical neuron collects signals from others through a host of fine structures called dendrites. The neuron sends out spikes of electrical activity through a long, thin strand known as an axon, which splits into thousands of branches. At the end of each branch, a structure called a synapse converts the activity from the axon into electrical effects that inhibit or excite activity in the connected neurones. When a neuron receives excitatory input that is sufficiently large compared with its inhibitory input, it sends a spike of electrical activity down its axon. Learning occurs by changing the effectiveness of the synapses so that the influence of one neuron on another changes.

The next step is to go from the human neuron to an artificial neuron. For this, some scientists have made a simple programming model of the neuron, which works roughly as follows.
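A minimal sketch in Python (assumed here for illustration, not the author's own model) of such a simple artificial neuron: the inputs are weighted like synaptic strengths, summed, and compared against a threshold, in line with the firing behaviour described above.

```python
# A simple artificial neuron (weighted sum plus threshold), written only to
# illustrate the kind of model referred to above; the weights, inputs and
# threshold values are arbitrary examples, not the author's.

def artificial_neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of the inputs reaches the threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Three inputs with different "synaptic" strengths.
print(artificial_neuron([1, 0, 1], [0.5, 0.9, 0.4], threshold=0.8))  # prints 1 (0.9 >= 0.8)
print(artificial_neuron([0, 1, 0], [0.5, 0.9, 0.4], threshold=1.0))  # prints 0 (0.9 < 1.0)
```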

ENGINEERING APPROACH TO THIS PROJECT

As said above, we first learn how the neuron works and what type of pulses it produces while we are learning. We then make an electronic system that converts the messages to be learned into a form that can stimulate the neuron and the brain. We also consider each rule of electronics in this field; to describe these we need the help of the FIRING RULE. The firing rule is an important concept in neural networks and accounts for their high flexibility. A firing rule determines how one calculates whether a neuron should fire for any input pattern. It relates to all the input patterns, not only the ones on which the node was trained.

COURSE OF THOUGHT

This is a new concept; the idea has only just been proposed, further research is progressing rapidly, and the research results are being kept secret because of some ethical problems that will be discussed later. The idea arose as a way of making an unlimited memory with high processing speed, in which the use of cache memory can be avoided. The basis of this concept is the neural network. These days there are innovative inventions in artificial neural networks, and this idea mainly uses the biological neuron together with some basic principles of artificial neural networks.

Concept behind this idea

The concept behind this idea is to choose an element for making the memory, and the element chosen here is the neuron. The special reasons for choosing the neuron as the element are:

1. Adaptive learning. Adaptive learning is the property by which a neural network can perform a task by adjusting to its input. In a computer network there must be a programmed interface or corresponding software to identify an input, or we must point that input to the software manually; a neuron does not need this, and that property is known as adaptive learning (see the sketch after this list).

2. Self organization. Self organization means a neural network can create its own organization or representation of the information it is given.

3. Real time operation. It is a special property of a neuron that it can perform multitasking jobs; in the modern world we achieve this only with the help of PLC and SCADA systems.

4. Fault tolerance. Fault tolerance means a neural network has a high capability to avoid errors or faults. In modern devices such as processors this is one of the main challenges, and designers work hard to improve it, but for a neuron it is a common property.

Because of these properties the neuron can be considered as the element for making an unlimited memory. In this memory system biological neurons are used as the processors and data buses. To get a feel for the processing speed of a neuron, just analyse a reflex action: the response seems almost instantaneous, as if the signal travelled at the speed of light (3*10^8 m/s).
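As a minimal illustration of adaptive learning, here is a sketch in Python of a perceptron-style neuron that adjusts its weights from examples rather than being explicitly programmed; the training data (logical OR), learning rate and epoch count are assumed for the example and are not taken from the document.

```python
# Adaptive learning sketch: a perceptron-style neuron learns logical OR
# purely from input/output examples by repeatedly correcting its weights.

def predict(xs, w, bias):
    """Fire (1) if the weighted sum plus bias is non-negative, else 0."""
    return 1 if sum(wi * xi for wi, xi in zip(w, xs)) + bias >= 0 else 0

def train(samples, epochs=10, rate=0.1):
    """Perceptron learning rule: nudge weights in proportion to the error."""
    w, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for xs, target in samples:
            error = target - predict(xs, w, bias)
            w = [wi + rate * error * xi for wi, xi in zip(w, xs)]
            bias += rate * error
    return w, bias

samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # logical OR
w, bias = train(samples)
print([predict(xs, w, bias) for xs, _ in samples])  # expected: [0, 1, 1, 1]
```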

BUILDING CONCEPTS OF A NEURAL COMPUTER

1. How a neural network performs an operation

i) FIRING RULE

The performance of a neural network is mainly based on the FIRING RULE. A simple firing rule can be implemented using the Hamming distance technique. The rule goes as follows: take a collection of training patterns for a node, some of which cause it to fire (the 1-taught set of patterns) and others which prevent it from doing so (the 0-taught set). A pattern not in the collection then causes the node to fire if, on comparison, it has more input elements in common with the 'nearest' pattern in the 1-taught set than with the 'nearest' pattern in the 0-taught set. If there is a tie, the pattern remains in the undefined state.

For example, a 3-input neuron is taught to output 1 when the input (X1, X2, X3) is 111 or 101, and to output 0 when the input is 000 or 001. Then, before applying the firing rule, the truth table is:

X1:  0   0   0   0   1   1   1   1
X2:  0   0   1   1   0   0   1   1
X3:  0   1   0   1   0   1   0   1
OUT: 0   0   0/1 0/1 0/1 1   0/1 1

As an example of how the firing rule is applied, take the pattern 010. It differs from 000 in 1 element, from 001 in 2 elements, from 101 in 3 elements and from 111 in 2 elements. Therefore, the 'nearest' pattern is 000, which belongs to the 0-taught set, and so the firing rule requires that the neuron should not fire when the input is 010. On the other hand, 011 is equally distant from two taught patterns that have different outputs, so its output stays undefined (0/1). Applying the firing rule to every column gives the following truth table:

X1:  0   0   0   0   1   1   1   1
X2:  0   0   1   1   0   0   1   1
X3:  0   1   0   1   0   1   0   1
OUT: 0   0   0   0/1 0/1 1   1   1

The difference between the two truth tables is called the generalization of the neuron. Therefore the firing rule gives the neuron a sense of similarity and enables it to respond 'sensibly' to patterns not seen during training.
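A minimal sketch of this Hamming-distance firing rule in Python (the helper names are illustrative, not from the document); it reproduces the generalised truth table above.

```python
# Hamming-distance firing rule: fire if the input pattern is closer to the
# 1-taught set than to the 0-taught set; stay undefined ("0/1") on a tie.

def hamming_distance(a, b):
    """Number of positions at which two binary patterns differ."""
    return sum(x != y for x, y in zip(a, b))

def firing_rule(pattern, taught_1, taught_0):
    if pattern in taught_1:
        return "1"
    if pattern in taught_0:
        return "0"
    d1 = min(hamming_distance(pattern, p) for p in taught_1)
    d0 = min(hamming_distance(pattern, p) for p in taught_0)
    if d1 < d0:
        return "1"
    if d0 < d1:
        return "0"
    return "0/1"  # tie: output remains undefined

taught_1 = ["111", "101"]   # patterns taught to fire
taught_0 = ["000", "001"]   # patterns taught not to fire
for x in ["000", "001", "010", "011", "100", "101", "110", "111"]:
    print(x, firing_rule(x, taught_1, taught_0))
# Output matches the generalised table: 0 0 0 0/1 0/1 1 1 1
```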

ii) DATA TRANSMISSION IN A BIOLOGICAL NEURON

A neuron is essentially a transducer that converts physical energy into electrical signals; the procedure by which it does so (resting potential, threshold and action potential) is described in the next section.

2. THRESHOLD NEURON

When a neuron is not sending a signal, it is "at rest." When a neuron is at rest, the inside of the neuron is negative relative to the outside. Although the concentrations of the different ions attempt to balance out on both sides of the membrane, they cannot, because the cell membrane allows only some ions to pass through channels (ion channels). At rest, potassium ions (K+) can cross through the membrane easily, while chloride ions (Cl-) and sodium ions (Na+) have a more difficult time crossing, and the negatively charged protein molecules (A-) inside the neuron cannot cross the membrane at all. In addition to these selective ion channels, there is a pump that uses energy to move three sodium ions out of the neuron for every two potassium ions it puts in. Finally, when all these forces balance out and the difference in voltage between the inside and outside of the neuron is measured, you have the resting potential. The resting membrane potential of a neuron is about -70 mV (mV = millivolt); this means that the inside of the neuron is 70 mV less than the outside. At rest, there are relatively more sodium ions outside the neuron and more potassium ions inside it.

Action Potential

The resting potential describes what happens when a neuron is at rest. An action potential occurs when a neuron sends information down an axon, away from the cell body. Neuroscientists also use other words, such as "spike" or "impulse", for the action potential. The action potential is an explosion of electrical activity created by a depolarizing current: some event (a stimulus) causes the resting potential to move toward 0 mV. When the depolarization reaches about -55 mV, the neuron fires an action potential; this is the threshold. If the neuron does not reach this critical threshold level, no action potential fires. When the threshold level is reached, an action potential of a fixed size always fires: for any given neuron, the size of the action potential is always the same. There are no big or small action potentials in one nerve cell; all action potentials are the same size. Therefore the neuron either does not reach the threshold or a full action potential is fired. This is the "ALL OR NONE" principle.

Action potentials are caused by an exchange of ions across the neuron membrane. A stimulus first causes sodium channels to open. Because there are many more sodium ions on the outside, and the inside of the neuron is negative relative to the outside, sodium ions rush into the neuron. Sodium has a positive charge, so the neuron becomes more positive and becomes depolarized. It takes longer for potassium channels to open; when they do, potassium rushes out of the cell, reversing the depolarization. At about this time, sodium channels start to close, causing the action potential to go back toward -70 mV (a repolarization). The action potential actually goes past -70 mV (a hyperpolarization) because the potassium channels stay open a bit too long. Gradually, the ion concentrations return to resting levels and the cell returns to -70 mV.
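A toy sketch of the "all or none" behaviour described above, using the resting potential (-70 mV) and threshold (about -55 mV) from the text; the stimulus values and function name are illustrative assumptions, not a physiological model.

```python
# All-or-none principle: a stimulus depolarises the membrane from rest, and a
# full, fixed-size action potential fires only if the threshold is reached.

RESTING_POTENTIAL_MV = -70.0
THRESHOLD_MV = -55.0

def neuron_response(depolarisation_mv):
    """Return the spike outcome for a depolarising stimulus (in mV)."""
    membrane_potential = RESTING_POTENTIAL_MV + depolarisation_mv
    if membrane_potential >= THRESHOLD_MV:
        return "full action potential (always the same size)"
    return "no action potential"

print(neuron_response(10.0))   # reaches -60 mV: below threshold, no spike
print(neuron_response(20.0))   # reaches -50 mV: threshold crossed, spike
```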

i) LOGICAL STATUS OF A THRESHOLD NEURON

A threshold neuron works with bipolar values: the Boolean value 0 is represented by 1 and the Boolean value 1 by -1, so for y in {0, 1} and x in {1, -1},

x = 1 - 2y    (1)

ii) PERCEPTRON (THRESHOLD) ACTIVATION FUNCTIONS

iii) THRESHOLD BOOLEAN FUNCTIONS

A Boolean function f(x1, ..., xn) is a threshold function if there is a weighting vector W = (w0, w1, ..., wn) such that

f(x1, ..., xn) = sign(w0 + w1*x1 + ... + wn*xn)
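A minimal sketch in Python of the threshold (perceptron) function defined above, using the bipolar coding of equation (1); the weight vector W = (1, 1, 1) is an illustrative choice, not taken from the document.

```python
# Threshold Boolean function f(x1, ..., xn) = sign(w0 + w1*x1 + ... + wn*xn),
# with Boolean y in {0, 1} coded as bipolar x = 1 - 2y (equation (1)).

def sign(v):
    # Treat 0 as positive so the output is always in {1, -1}.
    return 1 if v >= 0 else -1

def threshold_function(xs, w):
    """xs: bipolar inputs in {1, -1}; w = (w0, w1, ..., wn), w0 is the free weight."""
    return sign(w[0] + sum(wi * xi for wi, xi in zip(w[1:], xs)))

# Logical AND is a threshold Boolean function: with W = (1, 1, 1) the output is
# -1 (Boolean 1) only when both bipolar inputs are -1 (Boolean 1).
W = (1, 1, 1)
for y1 in (0, 1):
    for y2 in (0, 1):
        x1, x2 = 1 - 2 * y1, 1 - 2 * y2   # equation (1)
        print(f"y=({y1},{y2}) -> x=({x1},{x2}) -> f = {threshold_function((x1, x2), W)}")
```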

iv) THRESHOLD BOOLEAN FUNCTIONS AND THRESHOLD NEURONS

A single threshold neuron can implement exactly the threshold Boolean functions. Not every Boolean function of n variables is a threshold function: of the 2^(2^n) Boolean functions of n variables, the proportion that are threshold functions falls toward 0 as n grows, and already for n > 3 the threshold functions are a small minority.

v) GEOMETRICAL INTERPRETATION OF THRESHOLD BOOLEAN FUNCTIONS

ESSENCE OF THIS IDEA

The objective of this topic is to make an unlimited memory by using the neuron as an element. A neuron is not only an element; technically it is also a data bus. For an unlimited memory we also need a store, and for this we use the human brain itself, because the human brain has two kinds of memory: 1. short-term memory and 2. long-term memory. Short-term memory acts like RAM: it works only while we are in a conscious state. Long-term memory acts like the ROM in a computer memory. All data ends up stored in our long-term memory, and the capacity of long-term memory is unpredictable without deep study; that is why this idea proposes using long-term memory as the memory element. According to this idea, all data is sent directly to the brain's long-term memory, without reading a lesson, without seeing any visuals and without hearing any audio. This can be made possible by stimulating the neurons externally, and external stimulation of a neuron can be done using an electric field.

How we can stimulate a neuron externally

External stimulation done in a bird

This is the set-up used for external stimulation in a bird, the zebra finch: by stimulating its neurons the bird produces its song, and the song is decoded directly from its brain by a brain-mapping technique.

Transmission of data to the brain

Data can be transferred to the brain by external stimulation, by generating a signal corresponding to the data that has to be stored in the brain. To make this possible we have to study each brain-wave pattern produced while reading or viewing visuals; these patterns can be captured by scans such as MRI, EEG, etc. After studying each pattern we can produce that type of signal externally, and by using a transmitter with a computer interface we can send the data directly to the brain.

APPLICATIONS

1. By using this we can make an unlimited memory.
2. It can be used in hadron-collider systems.
3. If this is a success, we can cure diseases such as Parkinson's disease and paralysis.
4. We may also be able to overcome blindness, deafness and muteness.
