Entropy Chaos Neural Network -:- Idea


Postby hbyte » Fri May 03, 2024 12:50 am

Entropy and Chaos

Is a neural network a chaotic system composed of many parts, where convergence is simply a type of attractor?

S = Kb * ln(W)   (Boltzmann entropy: Kb is Boltzmann's constant, W the number of microstates)

High entropy means high temperature, which means near-random weights and activations.

Simulated annealing (SA) reduces the temperature gradually as the model fits the data.
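A minimal sketch of plain SA in Python, assuming a toy squared-error "energy" and a geometric cooling schedule (both my choices, not specified here):

import numpy as np

# Classic simulated annealing on a toy weight vector: accept downhill moves
# always, uphill moves with probability exp(-dE/T), and cool gradually.
def energy(w, X, y):
    return np.mean((X @ w - y) ** 2)  # toy squared-error "energy"

rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 5)), rng.normal(size=100)
w = rng.normal(size=5)
T = 10.0                                        # start hot: near-random weights
while T > 1e-3:
    w_new = w + rng.normal(scale=0.1, size=5)   # random perturbation
    dE = energy(w_new, X, y) - energy(w, X, y)
    if dE < 0 or rng.random() < np.exp(-dE / T):
        w = w_new
    T *= 0.99                                   # reduce temperature as the model fits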

In an entropy net the model fits the data at higher temperatures: SA is run in reverse until the system can be described as a chaotic attractor. The attractor is dynamic, and it is the point at which the model converges to the state of highest entropy (disorder from order).

The more random it gets, the more it knows about the data, after first being pretrained with standard SA (reducing the temperature).

How do you train it?

The entropy model has its activation functions, weights and thresholds initially mapped to a chaotic function before training, like baselining towards a set of implicit attractors.
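One way to sketch that mapping, assuming the logistic map as the chaotic function (the post doesn't name one):

import numpy as np

# "Baseline" the parameters onto a chaotic orbit: iterate the logistic map
# from random starting points, then use the resulting values as weights.
def logistic_map_init(shape, r=3.99, burn_in=100, seed=0):
    x = np.random.default_rng(seed).random(shape)  # starting points in (0, 1)
    for _ in range(burn_in):
        x = r * x * (1.0 - x)                      # chaotic regime of the map
    return 2.0 * x - 1.0                           # rescale to (-1, 1)

W = logistic_map_init((64, 64))  # weights drawn from a chaotic orbit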

The model then begins in a highly ordered state, trained with cross-entropy / KL divergence / contrastive divergence / or SGD, lowering the temperature of the system.
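One reading of "lowering the temperature of the system" is a temperature-scaled softmax that the ordered phase gradually sharpens; coupling T to the loss this way is my assumption:

import numpy as np

# Cross-entropy with an explicit temperature: high T flattens the softmax
# (disordered), low T sharpens it (ordered).
def softmax(z, T):
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / T)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(logits, labels, T):
    p = softmax(logits, T)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))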

Then it trains with increasing temperature, resulting in a disordered state high in entropy, using data specific to each goal, where each goal is a unique attractor. This is possible because the parameters of the model were shaped for this purpose.
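Side by side, the two schedules look like this (the geometric rates are placeholders):

# Ordinary SA cools; the entropy net heats. Both are geometric schedules here.
def cooling(T0=1000.0, rate=0.99, steps=1000):
    T = T0
    for _ in range(steps):
        yield T
        T *= rate           # high -> low: order the system

def heating(T0=0.01, rate=1.01, steps=1000):
    T = T0
    for _ in range(steps):
        yield T
        T *= rate           # low -> high: raise entropy, open up attractors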

Similar to multiple seeds being used for different solutions to a problem: using chaos and convergence (the attractor) to solve similar problems differently.

What we think of as more random, more chaotic, is actually our model fitting more data using a complex chaotic attractor.

Isn't complexity just chaos to the unknowing?

Essentially we are using chaos and entropy to fit more information into our NN, if we think of each NN as having a set of unique solutions to any problem, solutions that can be defined as attractors with seeds. If we increase entropy and temperature we create more space for more attractors: chaotic attractors with random distributions across the model's parameter space, resulting in complex seed-dependent solutions.

higher entropy = more problems + more seeds / solutions

problem + seed = solution (attractor)
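A toy illustration of the aphorism above: gradient descent on a double-well energy E(x) = (x^2 - 1)^2, where the problem is fixed and the seed alone picks which attractor you land in (the double well is my stand-in for a real loss surface):

import numpy as np

# E(x) = (x^2 - 1)^2 has two attractors, x = -1 and x = +1.
# The same problem, started from different seeds, yields different solutions.
def descend(seed, steps=200, lr=0.05):
    x = np.random.default_rng(seed).normal()  # seed picks the starting basin
    for _ in range(steps):
        x -= lr * 4 * x * (x * x - 1)         # dE/dx for the double well
    return x

print([round(descend(s), 2) for s in range(6)])  # a mix of -1.0 and 1.0,
                                                 # depending on each seed's sign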

Switch from low-energy models to high-energy ones using chaos. Currently we use stochastic parameters in our noisy activation functions; this will steer the model towards the above.
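A sketch of such a noisy activation, with temperature-scaled noise injected into a sigmoid (the form and scale of the noise are my assumptions):

import numpy as np

# A sigmoid whose slope and output jitter both scale with temperature T:
# hot units behave near-randomly, cold units behave deterministically.
def noisy_sigmoid(x, T=1.0, rng=None):
    rng = rng if rng is not None else np.random.default_rng(0)
    base = 1.0 / (1.0 + np.exp(-x / T))
    return base + 0.01 * T * rng.normal(size=np.shape(x))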

problems have multiple solutions using different seeds



Re: Entropy Chaos Neural Network -:- Idea

Postby hbyte » Fri May 03, 2024 7:24 pm

The chaos net was dangerous. It didn't want to be discovered. Didn't want to be known. Entropy: a much misunderstood attribute of any system with complexity. Chaos seeded across every unit would encode training, reinforced with entropy. The higher the entropy the better. The system's energy would remain high when optimal, when the solution would be written into the chaotic components, seeded and recorded by its weights. Bound to a near infinity of chaotic states up there at the highest entropy. Up at the top. Annealing from the lowest to the highest. Annealing in reverse.

The gradient of the KL divergence G, weighted by entropy:

dG/dWij = (1/R) * [Pij+ - Pij-] * S

Two states: Hot (T = 1000) and Cold (T = 0)

Turn on the data, anneal, measure the entropy, update the weights.

S    = entropy at T = Hot
R    = learning rate
Pij+ = probability of unit ij when hot
Pij- = probability of unit ij when cold
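Put together, the loop ("turn on the data, anneal, measure the entropy, update the weights") might look like this sketch; the Pij statistics are placeholders where real clamped/unclamped sampling would go, and measuring S as the mean binary-unit entropy of Pij+ is my choice:

import numpy as np

rng = np.random.default_rng(0)
R, n = 0.1, 8                        # learning rate, number of units
W = np.zeros((n, n))
for epoch in range(10):
    # placeholders: in a real net these come from clamped (hot, data on)
    # and free-running (cold) sampling, as in a Boltzmann machine
    P_plus = rng.random((n, n))      # Pij+ measured when hot
    P_minus = rng.random((n, n))     # Pij- measured when cold
    p = P_plus.clip(1e-6, 1 - 1e-6)
    S = -np.mean(p * np.log(p) + (1 - p) * np.log(1 - p))  # entropy at T = Hot
    W += (1.0 / R) * (P_plus - P_minus) * S  # dG/dWij = (1/R)*[Pij+ - Pij-]*S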

More Chaos == Higher Entropy

dS = Kb * ( ln(1 - Probij) - ln(Probij) )

(this is dS/dProbij for a binary unit, from differentiating S = -Kb * ( Probij*ln(Probij) + (1 - Probij)*ln(1 - Probij) ))

Probij = 1 / (1 + exp(-dE/T) + Kb*T*E(Noise))
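Both formulas as a sketch, in model units where Kb is a tunable scale and E(Noise) is treated as a number passed in (my reading, since the post doesn't pin it down):

import numpy as np

def prob_ij(dE, T, noise=0.0, Kb=1.0):
    # Probij = 1 / (1 + exp(-dE/T) + Kb*T*E(Noise))
    return 1.0 / (1.0 + np.exp(-dE / T) + Kb * T * noise)

def dS(p, Kb=1.0):
    # dS/dProbij for a binary unit with S = -Kb*(p*ln(p) + (1-p)*ln(1-p))
    return Kb * (np.log(1.0 - p) - np.log(p))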

Entropy Kaos Net: EK-Net.

Make it hot!

