A New Method: An Enhancement of Neural Cryptography with Multiple Transfer Functions and Learning Rules
Author(s)
N. Prabakaran , P. Loganathan and P. Vivekanandan
Published Date
September 11, 2024
Volume / Issue
Vol. 4 / Issue 3
Abstract
The goal of any cryptographic system is the secure exchange of information among its intended users. A common secret key can be generated by combining neural networks with cryptography. Neural cryptography is based on a competition between attractive and repulsive stochastic forces; adding a feedback mechanism to neural cryptography increases the repulsive forces. The partners A and B use a cryptographic key-exchange protocol to generate a common secret key over a public channel, which can be achieved with two Tree Parity Machines (TPMs). In the proposed TPMs, the output vectors are compared and the weights are then updated: the hidden unit uses the Hebbian learning rule, the left-dynamic hidden unit uses the random walk rule, and the right-dynamic hidden unit uses the anti-Hebbian rule with the feedback mechanism. Using different learning rules for different units enhances the security of the system. A network with feedback generates a pseudorandom bit sequence that can be used to encrypt and decrypt a secret message. The most successful attack on neural cryptography, the majority flipping attack, is also examined in this paper.
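The key-exchange protocol described above can be illustrated with a minimal sketch of two synchronizing Tree Parity Machines. This is the standard TPM setup from the neural-cryptography literature, not the paper's full proposal: the feedback mechanism and the per-unit mixing of Hebbian, anti-Hebbian, and random-walk rules are simplified to a single Hebbian update, and all sizes (K, N, L) are illustrative choices.

```python
# Sketch of TPM synchronization (assumption: standard Kinzel/Kanter-style
# TPM; the paper's feedback mechanism and mixed per-unit rules are omitted).
import numpy as np

K, N, L = 3, 4, 3  # hidden units, inputs per unit, weight bound (illustrative)

def tpm_output(w, x):
    """sigma_k = sign(w_k . x_k); output tau is the product of the sigma_k."""
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1  # convention: treat a zero local field as -1
    return sigma, int(np.prod(sigma))

def update(w, x, sigma, tau, rule):
    """Update only the hidden units whose sigma_k agrees with tau."""
    for k in range(K):
        if sigma[k] == tau:
            if rule == "hebbian":
                w[k] += sigma[k] * x[k]
            elif rule == "anti-hebbian":
                w[k] -= sigma[k] * x[k]
            elif rule == "random-walk":
                w[k] += x[k]
    np.clip(w, -L, L, out=w)  # keep weights in the bounded range [-L, L]

rng = np.random.default_rng(0)
wA = rng.integers(-L, L + 1, size=(K, N))  # partner A's secret weights
wB = rng.integers(-L, L + 1, size=(K, N))  # partner B's secret weights

steps = 0
while not np.array_equal(wA, wB) and steps < 100_000:
    x = rng.choice([-1, 1], size=(K, N))   # public random input vectors
    sA, tA = tpm_output(wA, x)
    sB, tB = tpm_output(wB, x)
    if tA == tB:                           # both partners update only on agreement
        update(wA, x, sA, tA, "hebbian")
        update(wB, x, sB, tB, "hebbian")
    steps += 1

key = wA.tobytes()  # once synchronized, the identical weights form the shared key
```

After synchronization both parties hold identical weight matrices, from which the common secret key is derived; an attacker observing only the public inputs and outputs must synchronize a third TPM, which is the setting the majority flipping attack targets.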