
Possibilities of using Blockchain technology in the Unique Information System of the MOI of the Republic of Serbia


Vojkan Nikolic1, Nikola Stranjina2, Stefan Tosic3
1 Ministry of Interior of the Republic of Serbia, Kneza Miloša 101, 11000 Belgrade, Serbia, and University of Criminal Investigation and Police Studies, Cara Dušana 196, 11080 Zemun, Serbia
2 University of Criminal Investigation and Police Studies, Cara Dušana 196, 11080 Zemun, Serbia
[email protected]

Abstract: Inspired by the brain, deep neural networks (DNN) are thought to learn abstract representations through
their hierarchical architecture. However, at present, how this happens is not well understood. Here, we demonstrate
that DNN learn abstract representations by a process of demodulation. We introduce a biased sigmoid activation
function and use it to show that DNN learn and perform better when optimized for demodulation. Our findings
constitute the first unambiguous evidence that DNN perform abstract learning in practical use. Our findings may
also explain abstract learning in the human brain.

Index terms—Deep learning, abstract representation, neural networks, demodulation.


Key words: Neural Networks, Deep Learning, Neural Network Training

1. INTRODUCTION
Deep neural networks (DNN) are state of the art for many machine learning problems [1], [2],
[3], [4], [5], [6]. The architecture of deep neural networks is inspired by the hierarchical structure of the
brain [7]. Deep neural networks feature a hierarchical, layer-wise arrangement of nonlinear activation
functions (neurons) fed by inputs scaled by linear weights (synapses). Since the brain is adept at
abstraction, it is anticipated that deep neural architecture might somehow capture abstract representations.
However, it is not presently known how (or even if) abstract learning occurs in a DNN. One possible way
to engineer the process of abstraction is known as demodulation. Consider a 1 kHz sinusoidal carrier
signal that is multiplied with a 10 Hz sinusoidal modulation signal. This shapes the envelope of the carrier
signal into a representation of the modulation signal, and this allows the slower modulation signal to pass
through a band-pass circuit that would not otherwise support the low frequency information. In this case,
the abstract information (i.e., about the 1 kHz carrier) is that the carrier is modulated at 10 Hz.
Recovery of the modulation signal is achieved via a nonlinear demodulation operation.
In this paper, we take a sampling theory perspective [8] and interpret the nonlinear activation function
of the DNN as a demodulation device within the context of the well-known MNIST [4] handwritten
character classification problem.
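
To make the carrier/modulation example above concrete, the following short sketch (our own illustration, not code from the cited works) builds a 1 kHz carrier modulated at 10 Hz and recovers the envelope with a simple nonlinear demodulator (rectification followed by smoothing); the signal parameters and the choice of envelope detector are assumptions made only for illustration.

```python
# Illustrative sketch of the modulation/demodulation example above.
# The sampling rate, duration and the simple rectify-and-smooth envelope
# detector are assumptions chosen only to make the idea concrete.
import numpy as np

fs = 100_000                                 # sampling rate in Hz
t = np.arange(0, 0.5, 1 / fs)                # 0.5 s of samples
carrier = np.sin(2 * np.pi * 1000 * t)       # 1 kHz carrier signal
modulation = np.sin(2 * np.pi * 10 * t)      # 10 Hz modulation signal
am = (1 + 0.5 * modulation) * carrier        # amplitude-modulated signal

# Nonlinear demodulation: rectification followed by a moving-average
# low-pass filter recovers an estimate of the slow 10 Hz envelope.
rectified = np.abs(am)
window = int(fs / 1000)                      # average over one carrier period
envelope = np.convolve(rectified, np.ones(window) / window, mode='same')

# The recovered envelope oscillates at roughly 10 Hz, i.e. the abstract
# (modulation) information has been extracted from the fast carrier.
print(envelope[:5])
```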
2. NEURAL NETWORK APPLICATION 1

Basically, the first application is a simple example of a neural network that is not too complicated and is easy to understand. In building this application, we used two Python libraries: numpy and matplotlib.

Step 1

The code starts by defining the input data (inputs) as an array, and immediately after that the expected output data, also as an array.

Step 2

After that we defined a class called NeuralNetwork. In that class we stored the previously defined inputs and outputs, and we also defined the weights, which can be changed later depending on what we want to compute.
Step 3

After that, a plotting function was written which displays a graph at the end of the program execution.

Step 4

Next, we set the speed (learning rate) at which the neural network learns, and added the update of the neural network weights.
Step 5

Then we wrote a train function in which we set how many iterations the neural network would go through, that is, how many times we want to train it.

Step 6

After that, a function for predicting the output data was written.

Step 7

We then created a section where we write down prediction examples, which can always be modified.

Step 8

After that we created a function that prints the previous part, and finally we have the functions that plot the graph.
Finally, before running the application in Python, we need to install the two packages the program depends on: numpy and matplotlib. Both are installed in the same way, for example pip install numpy, and likewise for the other. After that the program runs, prints what we have entered, and displays the graph. A consolidated sketch of the steps above is given below.
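
Since the original source code is not reproduced in this paper, the following is a minimal sketch of the kind of network described in Steps 1-8, in the spirit of the tutorials cited in the literature; the concrete input data, weight values and function names are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of the network described in Steps 1-8 (illustrative only).
import numpy as np
import matplotlib.pyplot as plt

# Step 1: input data and expected output data as arrays
inputs = np.array([[0, 0, 1],
                   [1, 1, 1],
                   [1, 0, 1],
                   [0, 1, 1]])
outputs = np.array([[0], [1], [1], [0]])

class NeuralNetwork:
    # Step 2: store the data and define the (changeable) weights
    def __init__(self, inputs, outputs, learning_rate=1.0):
        self.inputs = inputs
        self.outputs = outputs
        self.learning_rate = learning_rate          # Step 4: learning speed
        self.weights = np.array([[0.5], [0.5], [0.5]])
        self.error_history = []

    def sigmoid(self, x, deriv=False):
        if deriv:
            return x * (1 - x)
        return 1 / (1 + np.exp(-x))

    def feed_forward(self):
        self.hidden = self.sigmoid(np.dot(self.inputs, self.weights))

    # Step 4: weight update based on the prediction error
    def backpropagation(self):
        self.error = self.outputs - self.hidden
        delta = self.error * self.sigmoid(self.hidden, deriv=True)
        self.weights += self.learning_rate * np.dot(self.inputs.T, delta)

    # Step 5: train for a chosen number of iterations
    def train(self, epochs=25000):
        for _ in range(epochs):
            self.feed_forward()
            self.backpropagation()
            self.error_history.append(np.average(np.abs(self.error)))

    # Step 6: predict the output for new input data
    def predict(self, new_input):
        return self.sigmoid(np.dot(new_input, self.weights))

# Steps 7 and 8: example predictions (always modifiable) and printing
nn = NeuralNetwork(inputs, outputs)
nn.train()
print(nn.predict(np.array([[1, 1, 0]])))
print(nn.predict(np.array([[0, 0, 0]])))

# Step 3: plot the error graph at the end of program execution
plt.plot(range(len(nn.error_history)), nn.error_history)
plt.xlabel('Iteration')
plt.ylabel('Error')
plt.show()
```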

3. NEURAL NETWORK APPLICATION 2 (GAME)

In the second application we made a Flappy Bird game; the point of the game is essentially that the birds have to pass the obstacles and reach further levels. The first thing we did was find the small images that represent the players and put them in a single folder from which the program can load them. Then we created a text document called config, and in it we placed all the data and weights that the game works with.

After that we created the file that is used to start the game. First of all, we imported several modules (pygame, random, os, time, neat, visualize, pickle), then we set the resolution and fonts with which the application would run, and we also loaded the images without which the game could not work. We created a Bird class in which we put the bird's rotation, its image and its animation. After that, a class was created for the JUMP action and its functions, and then a MOVE class that moves the bird. After that comes a draw class used for the various image animations. There are a few more classes that are not essential to the program's operation but serve the aesthetics. Before it can start, this game requires a few more packages to be installed (numpy, pygame, neat-python, graphviz, matplotlib). This example shows one trained neural network and one untrained one, so that the differences between them can be observed; a sketch of such a training loop is given below.
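
The game code itself is not reproduced here; the following is a minimal sketch of how a neat-python training loop is typically wired to such a game. The fitness logic is only a placeholder, and the config file name and its assumed contents (3 inputs, 1 output) are assumptions based on the description above, not the authors' exact implementation.

```python
# Minimal sketch of a neat-python training loop of the kind described above.
# The placeholder fitness function and the config file name are assumptions;
# the config file would need to define 3 inputs and 1 output for this sketch.
import os
import neat

def eval_genomes(genomes, config):
    for genome_id, genome in genomes:
        net = neat.nn.FeedForwardNetwork.create(genome, config)
        # Placeholder: in the Flappy Bird game the network would receive the
        # bird's position and its distance to the next obstacle, and the
        # output would decide whether to jump; fitness would grow the longer
        # the bird survives. Here we just use a dummy activation as fitness.
        output = net.activate((0.5, 0.5, 0.5))
        genome.fitness = float(output[0])

def run(config_path):
    config = neat.Config(neat.DefaultGenome, neat.DefaultReproduction,
                         neat.DefaultSpeciesSet, neat.DefaultStagnation,
                         config_path)
    population = neat.Population(config)
    population.add_reporter(neat.StdOutReporter(True))
    population.add_reporter(neat.StatisticsReporter())
    winner = population.run(eval_genomes, 50)   # evolve up to 50 generations
    print('Best genome:', winner)

if __name__ == '__main__':
    local_dir = os.path.dirname(__file__)
    run(os.path.join(local_dir, 'config'))
```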
4. CONCLUSION

In this paper we have provided evidence that deep neural networks perform abstract learning through a
process of demodulation that is sensitive to asymmetry in the data. We have introduced a biased sigmoid
activation function that is capable of improved learning and performance. We have shown that the
optimum bias point in a practical model matches well that of the idealized demodulation example and that
the optimally biased model is fundamentally superior to the traditional sigmoid activated model. These
results have broad implications for how deep neural networks are interpreted, designed and understood.
Furthermore, our findings may provide insight into the exceptional abstract learning capabilities of the
human brain.
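
The paper does not give the exact form of the biased sigmoid activation function; the sketch below assumes it simply means a sigmoid evaluated at a shifted operating point, σ(x + b), and is meant only to illustrate the idea of sweeping the bias in search of an optimum operating point.

```python
# The form of the "biased sigmoid" is not spelled out in the text; this
# sketch ASSUMES it means a sigmoid with a fixed offset b of the operating
# point, i.e. sigma(x + b), which makes the activation asymmetric around
# zero input.
import numpy as np

def biased_sigmoid(x, b=1.0):
    """Sigmoid activation evaluated at a shifted operating point x + b."""
    return 1.0 / (1.0 + np.exp(-(x + b)))

# A sweep over candidate bias values could be used to look for the
# "optimum bias point" referred to above (illustrative only).
for b in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(b, biased_sigmoid(np.array([-1.0, 0.0, 1.0]), b))
```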

LITERATURE:

[1] http://solair.eunet.yu/ilicv/neuro.html (07.03.2019)
[2] http://sr.wikipedia.org/sr-el/neuronske_mreze (07.03.2019)
[3] https://towardsdatascience.com/inroduction-to-neural-networks-in-python-7e0b422e6c24
[4] https://www.python-course.eu/neural_networks_with_python_numpy.php (07.03.2019)
[5] https://towardsdatascience.com/how-to-build-youin-python-68998a08e4f6
[6] https://stackabuse.com/creating-a-neural-network-from-scratch-in-python/ (07.03.2019)
