Part 1 of 'All About Neural Networks'





A neural network is a complex adaptive model inspired by biological neural networks. A typical neural network consists of thousands or even millions of artificial neurons called units. Some of these are input units.

These units are fed information from the outside world so that the network can learn to recognize a certain pattern or carry out a specific process that yields a desired output. They form the first layer of a neural network.

The second part is what is often referred to as the “black box”. It contains one or more layers of hidden units, which together form the bulk of the artificial brain. Most neural networks are fully connected, meaning each hidden unit and each output unit is connected to every unit in the layers on either side.

Each of these connections carries a weight, which can be either positive (stimulating) or negative (inhibiting). The higher the weight, the more influence one unit has on another.

A simple neural network might be composed of just three layers to tackle a simple problem, but it can also have many layers between the input and the output.
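To make this layered structure concrete, here is a minimal sketch in Python with NumPy. It is only an illustration, not something from the article itself: the layer sizes are arbitrary, and the random numbers simply show how fully connected layers can be represented as weight matrices with positive and negative entries.

```python
import numpy as np

# A minimal sketch of the three-layer structure described above.
# The layer sizes (3 inputs, 4 hidden units, 2 outputs) are arbitrary examples.
rng = np.random.default_rng(0)

n_input, n_hidden, n_output = 3, 4, 2

# In a fully connected network, every unit in one layer is linked to every
# unit in the next layer, so each set of connections forms a weight matrix.
# Weights can be positive (stimulating) or negative (inhibiting).
w_input_hidden = rng.normal(size=(n_input, n_hidden))    # input  -> hidden
w_hidden_output = rng.normal(size=(n_hidden, n_output))  # hidden -> output

print(w_input_hidden.shape, w_hidden_output.shape)  # (3, 4) (4, 2)
```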

A more complex structure like this is called a deep neural network, and it is typically used to tackle much harder problems. In principle, a deep neural network can map any kind of input to any kind of output. However, there is a drawback.

It needs to be fed millions or billions of examples, compared with the hundreds or thousands a simple neural network might need - in other words, it needs much more training.


How Does a Neural Network Learn?

A neural network processes information by following a design called a feed-forward network. Information flows through the network in the same forward direction in two situations: while it is learning (being trained) and while it is operating normally (after being trained).

In either case, patterns of information are fed to the input units, which trigger the hidden units, which in turn reach the output units. Each unit receives the values of the units to its left, multiplied by the weights of the connections they travel along.

Every unit adds up all the inputs it receives in this way, and if the sum is more than a certain threshold value, or bias, the unit “fires” and stimulates the units it is connected to on its right.
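As a rough illustration of this feed-forward step, here is a short Python sketch. The layer sizes, weights, and the simple “fire if above the threshold” rule are assumptions based on the description above; real networks usually use smooth activation functions instead of a hard step.

```python
import numpy as np

def feed_forward(x, w_input_hidden, w_hidden_output, hidden_bias, output_bias):
    # Each hidden unit sums the values of the units to its left,
    # weighted by the connections they travel along.
    hidden_sum = x @ w_input_hidden
    # The unit "fires" (outputs 1) only if the sum exceeds its threshold (bias).
    hidden_out = (hidden_sum > hidden_bias).astype(float)

    # The same rule is applied again from the hidden layer to the output layer.
    output_sum = hidden_out @ w_hidden_output
    return (output_sum > output_bias).astype(float)

# Example run with a small, randomly weighted network (values are arbitrary).
rng = np.random.default_rng(0)
x = np.array([1.0, 0.0, 1.0])
w_ih = rng.normal(size=(3, 4))
w_ho = rng.normal(size=(4, 2))
print(feed_forward(x, w_ih, w_ho, hidden_bias=0.0, output_bias=0.0))
```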

Imagine yourself as a kid trying to learn something new - say soccer, and specifically, improving your penalty shots. Each time you try to score and miss, you observe what you are doing wrong and treat it as “feedback”.

You analyze this feedback and spot what you did wrong so you can avoid it the next time you shoot. You might do this 10 or 100 times a day until your shot matches the “desired result”.

This is essentially how a neural network learns through back-propagation. It is a feedback system that improves the network's output until it matches the desired one: the two are compared, and the difference between them is used to modify the weights of the connections between the units, working from the output connections back toward the input connections - in a backward manner, hence the name back-propagation.

This process continues until the difference between the actual output and the desired one is as small as possible. Only then will the neural network behave exactly as it should.
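To give a flavour of how this looks in code, below is a minimal back-propagation sketch in Python with NumPy. It is only an illustration: the sigmoid activation (a smooth stand-in for the firing threshold, since back-propagation needs differentiable units), the XOR-style example data, the layer sizes, and the learning rate are all assumptions for the sketch, not details from the article.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # example inputs
Y = np.array([[0], [1], [1], [0]], dtype=float)              # desired outputs

w1 = rng.normal(size=(2, 4))   # input  -> hidden weights
b1 = np.zeros((1, 4))          # hidden thresholds (biases)
w2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))          # output threshold (bias)
lr = 0.5                       # learning rate

for epoch in range(10000):
    # Forward pass: compute the network's current output.
    hidden = sigmoid(X @ w1 + b1)
    output = sigmoid(hidden @ w2 + b2)

    # Compare the output with the desired output; the difference is the error.
    error = Y - output

    # Backward pass: send the error from the output layer toward the input
    # layer and nudge each weight to shrink the difference.
    d_output = error * output * (1 - output)
    d_hidden = (d_output @ w2.T) * hidden * (1 - hidden)

    w2 += lr * hidden.T @ d_output
    b2 += lr * d_output.sum(axis=0, keepdims=True)
    w1 += lr * X.T @ d_hidden
    b1 += lr * d_hidden.sum(axis=0, keepdims=True)

print(np.round(output, 2))  # outputs move toward the desired 0/1 targets
```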


Conclusion

There is no doubt that the applications of neural networks, such as image recognition and medical diagnosis, are abundant and often make our lives easier.

A technology that, in a simplified way, acts as a simulation of the brain and can be trained to carry out various functions and yield specific desired outputs - this is fascinating, and I don't know how you could describe it as anything other than revolutionary, or immensely astounding.



- Written by Eyad Aoun (EMN Community Member From Egypt)

- Edited by Mridul Goyal (EMN Community Member From New Delhi, India)