Finalizing the Process of Creation

The encoding process, as well as the working process, is very similar to that of a mind-control robot. Follow the link below to learn about the brain-control robot…



Now comes the concept of deep learning. So, what is it?

Deep learning might cluster raw text such as emails or news articles. Emails full of angry complaints might cluster in one corner of the vector space, while spambot messages might cluster in another. Deep-learning networks perform automatic feature extraction without human intervention, unlike most traditional machine-learning algorithms.
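The clustering picture above can be sketched with a toy example. Assume each document has already been embedded as a point in a 2-D vector space (real embeddings have hundreds of dimensions); the two synthetic groups below stand in for "angry complaints" and "spam", and the `kmeans` helper is a minimal hand-rolled version for illustration, not a library call:

```python
import random

random.seed(0)
# Pretend embeddings: complaints near (0, 0), spam near (5, 5).
complaints = [(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(20)]
spam = [(random.gauss(5, 0.3), random.gauss(5, 0.3)) for _ in range(20)]
points = complaints + spam

def kmeans(points, k=2, iters=10):
    # Deterministic init for the sketch: one seed point from each end.
    centroids = [points[0], points[-1]]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest centroid (squared distance).
            i = min(range(k),
                    key=lambda i: (p[0] - centroids[i][0]) ** 2
                                + (p[1] - centroids[i][1]) ** 2)
            clusters[i].append(p)
        # Move each centroid to the mean of its assigned points.
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

centroids, clusters = kmeans(points)
```

After a few iterations the two centroids settle near the two groups, with no labels ever provided: that is the unsupervised clustering the paragraph describes.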

Each step for a neural network involves a guess, an error measurement and a slight update in its weights, an incremental adjustment to the coefficients.

A collection of weights, whether they are in their start or end state, is also called a model, because it is an attempt to model data’s relationship to ground-truth labels, to grasp the data’s structure. Models normally start out bad and end up less bad, changing over time as the neural network updates its parameters.

This is because a neural network is born in ignorance. It does not know which weights and biases will translate the input best to make the correct guesses. It has to start out with a guess, and then try to make better guesses sequentially as it learns from its mistakes.

Here is a simple explanation of what happens during learning with a feedforward neural network, the simplest architecture to explain.

Input enters the network. The coefficients, or weights, map that input to a set of guesses the network makes at the end.

input * weight = guess

Weighted input results in a guess about what that input is. The network then takes its guess and compares it to a ground truth about the data, effectively asking an expert “Did I get this right?”

ground truth - guess = error

The difference between the network’s guess and the ground truth is its error. The network measures that error, and walks the error back over its model, adjusting weights to the extent that they contributed to the error.

error * weight's contribution to error = adjustment

The three pseudo-mathematical formulas above account for the three key functions of neural networks: scoring input, calculating loss and applying an update to the model – to begin the three-step process over again. A neural network is a corrective feedback loop, rewarding weights that support its correct guesses, and punishing weights that lead it to error.
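The guess → error → adjustment loop above can be sketched for a single-weight "network". All the numbers here are made up for illustration; the target relationship is simply output = 2 × input, and the update rule is the three formulas applied in order:

```python
# Toy training data: the hidden rule is output = 2 * input.
inputs = [1.0, 2.0, 3.0, 4.0]
truths = [2.0, 4.0, 6.0, 8.0]

weight = 0.0          # the network is "born in ignorance"
learning_rate = 0.05  # size of each slight update

for epoch in range(100):
    for x, truth in zip(inputs, truths):
        guess = x * weight                    # input * weight = guess
        error = truth - guess                 # ground truth - guess = error
        weight += learning_rate * error * x   # error * contribution = adjustment

print(round(weight, 3))  # converges toward 2.0
```

Each pass scores the input, measures the loss, and applies an update, exactly the corrective feedback loop described: weights that reduce the error are reinforced, and the model "ends up less bad" over time.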


The interface is now complete…time to run the setup...

A hard disk stores all the data in binary form, while the RAM continuously processes the system's data. The sensors take in external conditions, which the CPU processes and sends out to the output devices…

Now we have a BOT that is capable of thinking, collecting data from our mind, and working as we want it to. Our memory stays intact in the huge database until we delete it ourselves: the bot has the same name as ours, the same way of thinking, and the same feelings. If we want, we can even make it permanent, so that it continues with our mind map and all our memories after our death, making us digitally immortal as well.

With this, I am ending my article…Thank you.

Ankan Sinha
