How Tokenization Works to Establish ‘Reality’
All of us are ‘tokens’ in (our own) ‘reality.’
Tokenization is responsible for human reality. And, also, natural (think: Nature’s) reality. Meaning, basically, there is a circular-linear relationship (mandatory tokenization) between an observer and an observation. This proves everything in Nature (and what humans label ‘reality’) is the tokenization of (a token for) a circle:
Thus, the circular-linear relationship is responsible for tokenization (and vice versa) (thus tokenization (the circular-linear relationship) is required for any (and every) observation) (turning all observations into the same observation):
Thus, we make an observation and we decide what to call it (how to tokenize it), and, also, what to do about it (again, tokenizing it). So, decisioning is, also, a circular-linear tokenization (because, technically, everything is a circular-linear tokenization):
Thus, 50–50 is the constant and the norm in decimal ‘reality:’
And (0( 1 )0) is the constant and the norm in binary ‘reality:’
Where it’s not possible to have a half without a whole (decimal without binary) (50–50) (0 (1) 0) (all realities) (any reality):
This means all of us are negotiating, constantly, between a decimal and a binary reality (two ways to tokenize, and, also, then, interpret, ‘reality’):
Where the basis for everything is the arithmetic number ‘two.’
Meaning it’s impossible to have the number ‘one’ without the number ‘two,’ and, always, vice versa (explaining arithmetic, geometry, and algebra) (any mathematical operation, topology, axiom) (everything in ‘technology’):
So, this means the diagram (two circles in a circle) is constant and the interpretation of the diagram is variable. Because the constant in reality is a variable (and the variable in reality is constant). Explaining change, redundancy, and movement, in general (and, also, then, specific) (why ambiguity is involved in every (and any) tokenization):
X and Y
Where it is impossible to move from X to Y without, also, moving from Y to X (again, the variable is constant):
So, now you are, undoubtedly, ‘confused,’ because you are programmed and taught to think in sequence where two comes after one and not the other way around.
However, the diagrams above prove differently.
So, this means you are, always, in your own ‘reality,’ which you create by interpreting your observations. Again, this is, technically, called ‘tokenization.’ Where you tokenize your observations by sorting them, always, into two groups (explaining complementary identity). Where it is impossible to have an individual without a group, and, again, always, vice versa.
This is because the number ‘two’ is constant (the only number, technically, in Nature).
So this means any observation you make is balanced by an opposing observation of some kind. Which makes life interesting, and, also, difficult. It’s half-and-half for all of us. At all times. In all places.
This is because underneath it all, we are all tokens, tokenizing, an uber-simple, and always-present, circle. Conservation of the circle to be, technically, correct.
The Number ‘Two’
Where any reality (human) (Nature) is dependent on at-least and at-most one other reality. Thus, for everything in Nature (and for everything that’s human) there are, always, two ‘realities’ (not-one). Two interpretations of ‘reality’ (for sure) (where the word ‘many’ is, always, ‘one,’ more technically known as ‘two’) (as shown above, two and two is one and one).
Where negation (whenever we say ‘not’) is an obvious form of duplication (tokenization of ‘reality’) (tokenization of a circle) (affirmation of an opposing interpretation) (X-not-X) (0–1–0) (50–50):
This explains why you can take a video of events and there will, always, be two interpretations (two tokenizations) of the events (of the video which is a tokenization of the events).
This means any tokenization (eventual reality) is, always, half-true (despite how we are programmed to think otherwise). True and false (like any X and Y) is a tokenization of a circular-linear reality. Explaining why we get along with each other half-the-time (why we understand each other, half-the-time) (why we get along with, and understand, our ‘selves,’ half-the-time).
And where, if you want to go there, ‘pi’ is the correct tokenization for, what humans label, ‘mind.’
Conservation of a Circle
Conservation of the circle is the core, and, thus, the only dynamic in Nature. Explaining tokenization. And, thus, the establishment of (any) (every) (human) ‘reality.’ All of the possible (and probable) tokenizations within any (human’s) ‘reality.’ Everything (and anything) in Nature’s ‘reality.’ (‘Tokenization’ is, technically, the correct word for ‘reality.’)
Think it through:
Music is one example of the tokenization of a circular-linear reality. Where sound is an infinite circumference (an infinite circle) and light is an infinite diameter (an infinite line) and you need both in order to have either (explaining the circle of fifths, scales, modes, notes, beats, vibrations, repetition, recapitulation, syncopation, modulation, all of these are tokenizations of movement (the conservation of a circle), in general). (The tokenization of reality.)
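The circle of fifths mentioned above is, in standard music theory, a genuinely circular structure: repeatedly ascending a perfect fifth (7 semitones) and reducing modulo 12 visits all 12 pitch classes before returning to the start. A minimal, illustrative Python sketch (the function name and layout are the author’s own choices, not from the source):

```python
# The 12 pitch classes of the chromatic scale, indexed C = 0 through B = 11.
PITCH = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def circle_of_fifths(start=0):
    """Return the 12 pitch classes visited by ascending perfect fifths.

    Each step adds 7 semitones (a perfect fifth) modulo 12. Because
    7 and 12 are coprime, all 12 pitch classes appear exactly once,
    and the 13th step would land back on the starting note: a circle.
    """
    pc = start
    sequence = []
    for _ in range(12):
        sequence.append(PITCH[pc])
        pc = (pc + 7) % 12
    return sequence

print(circle_of_fifths())
# ['C', 'G', 'D', 'A', 'E', 'B', 'F#', 'C#', 'G#', 'D#', 'A#', 'F']
```

The closure of the cycle (the step after ‘F’ returns to ‘C’) is what makes the structure circular rather than merely sequential.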
Video technology is another example:
TikTok an obvious example:
Human tokenization of a circle (conservation of the circle):
Readers are reminded that this material is copyrighted by Ilexa Yardley, all rights reserved.