Thursday, 16 February 2023

System Energy

A bit overdue, and I am sure well studied in the literature, though I am not sure where.

And the relevance to this blog and to poverty? In essence poverty stems from "self." Once you are contained in a self then you are poor. Part of the investigation of self concerns iterative systems, which are trapped in chains of results produced by feeding output back into input.

In AIME, a trivial AI pattern-matching machine of 1996, the idea was to generate self-consciousness by pattern matching an array of data generated from the state of AIME herself. The premise was that consciousness is linked to self-consciousness. Apart from that flawed idea, it became apparent that the result was going to be meaningless: it would simply be patterns built upon patterns, what else could it be. But then it became apparent that there would be some meaning to the "outside" world in the form of fixed points. The system would trace out paths through its solution space that conformed to "objective" fixed points. So indeed AIME would have some meaning, but, interestingly, meaning unknown to AIME. Ironically, attention to "self" creates meta properties that are necessarily outside self. Self maps into self in a way that is beyond self, and this is SRH. So we discover that "self" is incomplete and always exists in a world that includes not-self. Once self realises that it is linked fundamentally to not-self, we are no longer poor. How can we possess something when we ourselves are no longer self-contained entities? How can we not possess something when we ourselves are no longer self-contained entities?

This is reminiscent of Greek tragedy. The protagonist seems locked in an irony that they cannot escape. It seems that to "be" them comes with an unfortunate destiny. Whatever they do remains in the locus of this destiny, and even contributes to it. This is a great example of "self." The self is never privy to the dramatic irony of itself. Self always generates an objective reality that is necessarily beyond the self. This is why most religions are unequivocal that pursuit of selfish interest leads to a hellish prison. The only way to escape this spiral into the chains of self is through abandoning self in favour of Other. All serious, authentic religions preach the path away from self. As a reaction some try to overturn this and preach the gospel of self, Capitalism being one, but as unhappiness and mental illness spiral in the West it is only a matter of time before the error of this becomes apparent.

So we got as far as noting that iterative systems seem to have some "force" within them that either attracts them towards or repels them from fixed points. A quick exploration of a simple case:

x = Sin(x) slowly approaches 0.

x = Cos(x) very quickly approaches 0.739085133... sometimes called the Dottie number.

Many people will have noticed that repeated application of the Cos() on a calculator always arrives at this value.
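The calculator experiment is easy to reproduce; a minimal sketch in Python:

```python
import math

def iterate(f, x, n):
    """Apply f to x repeatedly, n times, feeding output back into input."""
    for _ in range(n):
        x = f(x)
    return x

# Starting from any value, repeated Cos lands on the Dottie number.
dottie = iterate(math.cos, 5.0, 100)
print(dottie)  # ~0.7390851332...
```

By construction the result satisfies x = Cos(x) to machine precision, whatever the starting value.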

So why does Cos have such a strong universal attractor while Sin's is so weak and non-universal? Why is the "energy" in the Cos system higher than in the Sin one?

When the x = Cos(x) system is plotted it looks like this. The green square spiral, plotted for the first 3 iterations, can be seen spiralling in on the intersection point between y=x and y=Cos(x). By contrast Sin(x) only approaches y=x very slowly.


Can we hypothesise that the gradient at the fixed point is important for system "energy"?

dCos(x)/dx @ fp =  -0.6736

dSin(x)/dx @ fp =  1
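These gradients can be checked directly (a quick sketch; the Cos fixed point is found by iteration first):

```python
import math

# Fixed point of Cos: iterate down to the Dottie number.
x = 1.0
for _ in range(200):
    x = math.cos(x)

# d/dx Cos(x) = -Sin(x); evaluate at the fixed point.
grad_cos = -math.sin(x)
print(grad_cos)  # ~ -0.6736

# Fixed point of Sin is 0; d/dx Sin(x) = Cos(x), so the gradient there is 1.
grad_sin = math.cos(0.0)
print(grad_sin)  # 1.0
```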

That is important for the region around the fixed point, but we also need to take into account the entire range and domain, as system values may range through them.

Cos(x) has a real domain and maps each real number into the range [-1, 1]. So after one iteration the whole system has contracted into the domain [-1, 1]. For Cos(x) this domain maps to the range [0, 1], so in two iterations the system has contracted to the domain [0, 1]. Over the entire domain (0, 1) the gradient is -ve, which means the iteration will contract. And the gradient of the continuous function is relatively large near the fixed point, at -0.6736, so the contraction will continue with good "energy." If we have more complex functions this may not be true. How do we measure the total energy? Essentially the "speed" at which the system contracts. Well, that surely is just the relative "length" of the domain after each cycle: we take the limits and look at how they map.
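The "length" idea can be sketched by pushing the interval endpoints through Cos and watching the ratio of successive lengths (Cos is decreasing on [0, 1], so the interval [lo, hi] maps to [Cos(hi), Cos(lo)]):

```python
import math

lo, hi = 0.0, 1.0
prev_len = hi - lo
ratio = None
for i in range(12):
    # Cos is decreasing here, so the endpoints swap on each pass.
    lo, hi = math.cos(hi), math.cos(lo)
    length = hi - lo
    ratio = length / prev_len
    prev_len = length
print(ratio)  # ~0.6736
```

The ratio settles on the magnitude of the gradient at the fixed point, which supports using it as the contraction "speed."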

#TODO

Sin(x) looks the same as Cos but it's an "odd" function, symmetrical about the origin rather than about x=0. The first iteration maps everything to [-1, 1] like Cos. But that is it: being odd there is no symmetry or redundancy to exploit, no folding of the domain, and no rapid contraction.

NOTE: symmetry, like fixed points, is also a meta feature that the system cannot encode BECAUSE it's a redundancy. Only half of a shape with bilateral symmetry needs to be saved, with a note to reflect it**. Like finding collisions in hash tables: when a system is larger than its set of states it must visit the same states in more than one way (by the pigeonhole principle), and that creates redundancy which the system cannot encode!

Must all systems have redundancy? A system with self-reference must, because a particular state can both be itself and ALSO encode for itself.

The MAGIC of self in fact is this redundancy. When we are attracted to "self" as a concept, what we like is that it appears to be a "black hole" with unknown hidden features, because it maps to the "same" element. This sameness indeed does create a meta quality, but ironically this is no longer in the system. It expands the system. In philosophy and religion that is of interest because we expand the "self" this way.

** And this is interesting, and noted before: "compression" is linked to information and system "size." Indeed AIME really was just a compression engine for removing redundancy and encoding current state in a more compact, higher-level form.

RE: Hofstadter's observation that self-reference is so often diminishing. Is this because it always creates symmetry and so redundancy? Symmetry in the sense that fixed points like x == f(x) are a symmetry. Symmetry creates redundancy and so limits. For example, exploiting symmetry we can compress an object: a matrix can be recreated from its eigenvectors and eigenvalues. This is why the 100 prisoners problem is so diminished by chaining results. It's why feeding back the output of the Bombe at Bletchley Park enabled Enigma to be cracked.
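The matrix example can be sketched with NumPy; a minimal illustration for a symmetric matrix (where the eigendecomposition is guaranteed real with orthonormal eigenvectors):

```python
import numpy as np

# A symmetric matrix: its redundancy lets us rebuild it entirely
# from its eigenvalues and eigenvectors.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
w, V = np.linalg.eigh(A)          # eigenvalues w, orthonormal eigenvectors V
A_rebuilt = V @ np.diag(w) @ V.T  # reconstruct A from the decomposition
print(np.allclose(A, A_rebuilt))  # True
```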

Back to Sin. In the domain (-1,1) the gradient of Sin(x) is +ve.

Thinking this out as we go... will clean up.

So both systems converge. Cos oscillates around the fixed point, which means it closes in on the equilibrium from both sides.

Sin is trapped on one side of the solution and slowly creeps in. This appears to make progress slow. How to formalise this?
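One crude way to formalise the difference is to count iterations until the state is within some tolerance of the fixed point (the tolerance of 0.01 here is an arbitrary choice):

```python
import math

def iterations_to_converge(f, x, fp, tol=1e-2, max_iter=10_000_000):
    """Count applications of f until x is within tol of the fixed point fp."""
    n = 0
    while abs(x - fp) > tol and n < max_iter:
        x = f(x)
        n += 1
    return n

DOTTIE = 0.7390851332151607  # fixed point of Cos
n_cos = iterations_to_converge(math.cos, 1.0, DOTTIE)
n_sin = iterations_to_converge(math.sin, 1.0, 0.0)
print(n_cos, n_sin)  # Cos takes a handful of steps; Sin takes tens of thousands
```

The gap is dramatic because Cos's error shrinks by a constant factor each step, while Sin's shrinks ever more slowly as the gradient at its fixed point is exactly 1.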

A quick look at length:

So mapping [0, 1] through Cos(x) we end up with a constant system contraction of 0.673611997, which is the magnitude of the gradient at the fixed point.

#TODO check this.
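A quick check of this: near the fixed point the error shrinks each step by a factor that settles on the gradient's magnitude.

```python
import math

DOTTIE = 0.7390851332151607  # fixed point of Cos
x = 1.0
factor = None
for _ in range(20):
    err_before = abs(x - DOTTIE)
    x = math.cos(x)
    factor = abs(x - DOTTIE) / err_before
print(factor)  # ~0.6736, the magnitude of the gradient at the fixed point
```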

This represents what I was calling the "energy" of the system: the smaller the contraction factor, the larger the energy.

Now let's try and make a function which does not converge.

x = Cos (a x)

[Plot: fixed point vs 'a']

What we get is a linear decline in the fixed point as 'a' increases, until a sudden discontinuity at around 1.3. This represents a change in the stability of the fixed point.

Now, there is a fixed point for every value of 'a'.


But it starts to get so unstable that it's very hard to find by iteration.

dy/dx = -a Sin(a x)

when a = 1.319140625 then |f'(x)| = 1 at the fixed point

But here the fixed point is unstable and the iteration bifurcates.

Numerically, a = 1.2000000000 is not bifurcated, while 1.2000000001 starts oscillating. What is the significance of this?

Bifurcation starts very slowly making it hard to find the exact point.
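Rather than hunting by trial iteration, the onset can be pinned down numerically: for each 'a', find the fixed point of x = Cos(a x) by bisection, then bisect on 'a' for where the gradient magnitude |−a·Sin(a·x*)| crosses 1. A sketch, using plain bisection throughout:

```python
import math

def fixed_point(a, lo=0.0, hi=1.0):
    """Bisection root of g(x) = x - cos(a*x); g(0) < 0 < g(1) for a in [1, 1.5]."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if mid - math.cos(a * mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def gradient_excess(a):
    """|f'(x*)| - 1 where f(x) = cos(a*x) and x* is its fixed point."""
    x = fixed_point(a)
    return abs(-a * math.sin(a * x)) - 1

# Bisect on a: stable (excess < 0) at a = 1.0, unstable at a = 1.5.
lo, hi = 1.0, 1.5
for _ in range(60):
    mid = (lo + hi) / 2
    if gradient_excess(mid) < 0:
        lo = mid
    else:
        hi = mid
print((lo + hi) / 2)  # ~1.319, where the fixed point loses stability
```

This agrees with the value found above by iteration, without the slow-bifurcation problem.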

f(x) = 0.682740857926806
f'(x) = -0.876792727

Can't you do this algebraically?

Anyway got to leave this now...

TODO
>find the reason why Cos(a x) is unstable for some values of a
>find the reason why Sin(x) slowly contracts to the fp
>find the reason why Cos(a x) bifurcates as 'a' increases
>Does Sin(a x) bifurcate?

Overall look for a measure of "energy" in the system. This is yet another meta feature of the system that it cannot determine "within itself." Why?

Proof sketch: a fixed point is left unchanged by the function. Many functions act to move points towards these fixed points; chained together, the iteration reaches the fp. But others diverge, and some are chaotic.

Ultimately, for a contracting system, f(x) is closer to the fixed point than x. If this is true for the whole domain, or at least the range, then the iterative system will converge to the fp.

If we plot f(x) - fp we can see this.

For a chaotic system like kx(1-x), for some values of k this is not true. Look at this and see why it remains unstable.
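A quick sketch contrasting a stable and a chaotic choice of k for kx(1-x). The non-zero fixed point is 1 - 1/k, and the gradient there is k(1 - 2x):

```python
def logistic(k, x, n):
    """Iterate x -> k*x*(1-x) n times."""
    for _ in range(n):
        x = k * x * (1 - x)
    return x

# k = 2.5: fixed point at 0.6, gradient k(1-2x) = -0.5, so it contracts.
stable = logistic(2.5, 0.2, 200)
print(stable)  # converges to 0.6

# k = 3.9: gradient magnitude at the fixed point exceeds 1 -- never settles.
orbit = [logistic(3.9, 0.2, 200)]
for _ in range(50):
    orbit.append(3.9 * orbit[-1] * (1 - orbit[-1]))
print(max(orbit) - min(orbit))  # large spread: no convergence
```

Here f(x) is NOT always closer to the fixed point than x when the gradient magnitude at the fp exceeds 1, which is exactly the instability to look at.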

===

x = Cos(a x)

For a <= 2.97169387071380185: 1 fixed point
@ this a, fp x = 0.39276073779188737
dy/dx = -2.73289

For a <= 6.202395285573132: 3 fixed points
@ this a, fp = { -0.64644, -0.302866, 0.217849}
dy/dx = {-4.7322, 5.91109, -6.05343}
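The fixed-point counts can be checked by scanning x - Cos(a x) for sign changes across [-1, 1] (a simple sketch: a fine grid, assuming simple roots):

```python
import math

def count_fixed_points(a, n=100_000):
    """Count sign changes of g(x) = x - cos(a*x) over [-1, 1]."""
    count = 0
    prev = -1.0 - math.cos(-a)  # g(-1)
    for i in range(1, n + 1):
        x = -1.0 + 2.0 * i / n
        g = x - math.cos(a * x)
        if g == 0 or g * prev < 0:
            count += 1
        prev = g
    return count

print(count_fixed_points(1.0))  # 1
print(count_fixed_points(4.0))  # 3
```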

Still no idea what makes a fixed point attractive or stable.
===

  
