Fingerprinting & Modeling of Systems Using Aditya’s Exponential Probability Trees
In this article I will explain to you…
- How do we Fingerprint a System?
- How do we Model a System?
- What is Aditya’s Exponential Probability Tree?
- Integrating AEPTs with Deep Learning
- Integrating AEPTs into Data Analysis
- A Couple of Use Cases
Phew! That's quite a lot. Let's get started!
Let's say we have a coin which is unbiased, which means the probability of Heads or Tails is 0.5.
Let there be an experiment in which we flip the coin and observe the outcome: Heads or Tails.
If I ask you the probability that the outcome of the flip is Heads, you will say 0.5, or 50%. End of story!
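As a quick sanity check, this first experiment is easy to simulate. A minimal Python sketch (the function name is mine, not from the article):

```python
import random

def estimate_heads_probability(n_flips, seed=0):
    """Estimate P(Heads) for a fair coin by simulating n_flips flips."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

p = estimate_heads_probability(100_000)  # converges towards 0.5 as n_flips grows
```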
But there is a huge problem with all this…
We started in the first place with the exact design assumptions of the system (i.e. flipping a coin). We knew it was unbiased, and we knew implicitly that the outcome would be Heads or Tails with 50% probability.
Now let's define another experiment.
We have a system (a black box): whenever we ask it for an outcome, it simply tells us Heads or Tails. We ask it n times, e.g. n = 100, so we ask it for an outcome 100 times.
We know absolutely nothing else about the system in this experiment.
But so that you, our readers, get an intuitive understanding of what's going on, we will tell you a secret: the system inside the black box is basically flipping a coin to give the outcome. The coin is unbiased, hence the probability of the outcome being Heads or Tails is 0.5, or 50%. That's our little secret. In the real world we don't have the luxury of being gods and knowing the internal details of every system we are interested in, so all this is just between you and me. Please keep it in mind as we proceed.
Now we will repeat this experiment o times, e.g. o = 1000, and call this a 'set' of experiments.
Let N be the number of times we get Heads (out of the 100 tries in each experiment); we plot N on the X-axis.
Now let's do these sets of experiments p times, e.g. p = 200.
And we will see how the probability of (N = constant) varies, for each constant from 0 to 100.
For example, for N = 40 heads we will get a curve similar to the one shown.
Let's understand it this way: in Figure 1 the probability of N being 40 heads is less than 0.5; let's say it is approximately ~0.27.
In Figure 2 we are saying: if we do the set of experiments p = 200 times, we find that we get 40 heads with the probabilities shown in Figure 2.
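The two levels of experiments above can be sketched in Python. All names are illustrative, and the second function scales o and p down from the text's 1000 and 200 so it runs quickly:

```python
import random
from collections import Counter

def heads_distribution(n=100, o=1000, seed=1):
    """Level 1: run the n-flip experiment o times and estimate P(N = k heads)."""
    rng = random.Random(seed)
    counts = Counter(sum(rng.random() < 0.5 for _ in range(n)) for _ in range(o))
    return {k: c / o for k, c in counts.items()}

def estimates_of_p40(n=100, o=200, p=20, seed=2):
    """Level 2: repeat the whole set of experiments p times and collect the
    p separate estimates of P(N = 40 heads); their spread is what varies
    from set to set. (o and p are scaled down here for speed.)"""
    rng = random.Random(seed)
    ests = []
    for _ in range(p):
        hits = sum(sum(rng.random() < 0.5 for _ in range(n)) == 40 for _ in range(o))
        ests.append(hits / o)
    return ests

dist = heads_distribution()
ests = estimates_of_p40()
```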
We are now deep into calculating probabilities of probabilities… that is, recursively calculating probabilities of real-world systems.
If we keep recursively calculating Probability of (Probability of … of (Outcome = Heads) = 0.5), the values will all converge to ~1.
Let's understand it this way.
What is the probability that the outcome of a flip is Heads? 0.5.
What is the probability that (the probability of the outcome of a flip being Heads is 0.5)? 1.
Which in plain English means: the probability that the probability of the outcome of a flip being Heads is 0.5, or 50%, is itself 1.0, or 100% (certainty).
At such a depth of recursive calculation/analysis, our system fingerprint is complete, and we don't need to proceed any further or any deeper.
This recursive tree of probabilities is called Aditya's Exponential Probability Tree (AEPT).
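The text leaves the exact construction open, but one possible reading is: each level of the tree is the empirical distribution of the previous level's values, and we stop once some value occurs with probability ~1 (certainty). A hedged sketch under that assumption, with illustrative names:

```python
from collections import Counter

def aept(samples, max_depth=5, tol=1e-9):
    """One possible sketch of an AEPT: level 0 is the empirical distribution
    of the raw samples; each deeper level is the empirical distribution of the
    previous level's probabilities; stop when a value has probability ~1."""
    tree = []
    level = list(samples)
    for _ in range(max_depth):
        counts = Counter(round(v, 6) for v in level)
        dist = {v: c / len(level) for v, c in counts.items()}
        tree.append(dist)
        if max(dist.values()) > 1 - tol:
            break  # fingerprint complete: distribution collapsed to certainty
        level = sorted(dist.values())  # next level: the probabilities themselves
    return tree
```

For a balanced stream of 0s and 1s this collapses after two levels: level 0 is {0: 0.5, 1: 0.5}, and level 1 sees both probabilities equal to 0.5, giving {0.5: 1.0}.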
Let's say you have a machine on the factory floor and you are measuring its temperature every second, so we get thousands of observations throughout the day.
We decide that our window is 100 seconds == 100 observations.
Now we can calculate the AEPT over these windows. This AEPT is the fingerprint of the system.
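A minimal sketch of the windowing step, assuming the per-window fingerprint is simply the empirical distribution of (rounded) temperature readings; the function name is illustrative:

```python
from collections import Counter

def window_fingerprints(readings, window=100):
    """Slice a stream of temperature readings into fixed-size windows and,
    per window, record how often each (rounded) temperature occurred --
    the raw material for the machine's AEPT fingerprint."""
    fingerprints = []
    for start in range(0, len(readings) - window + 1, window):
        chunk = readings[start:start + window]
        counts = Counter(round(t) for t in chunk)
        fingerprints.append({t: c / window for t, c in counts.items()})
    return fingerprints
```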
Modeling a System
Let's say earlier we modeled the vibrations of a system based on its temperature and motor speed.
If we also feed the AEPT values into the model as parameters, we get an extended model of the system.
What's the Benefit?
The AEPT is a 'complete' thermal or motor-speed fingerprint of the system. Hence the system model, though it becomes more complex, also becomes nearly ~100% accurate.
Integration with Deep Learning
Without getting into the exact details of CNNs, RNNs, LSTMs, etc.: if we think of the AEPT as a pre-processing step for deep neural networks, and feed the AEPT values as parameters into the networks, we will have successfully extended deep neural networks with AEPTs.
Considering that all the values in AEPTs are in the range 0.0–1.0, the integration with DNNs is very smooth.
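Since AEPT values are already in 0.0–1.0, the "integration" can be as simple as appending them to the network's input vector, with no extra normalisation. A trivial sketch (hypothetical helper, not tied to any specific framework API):

```python
def extend_features(raw_features, aept_values):
    """Append AEPT probabilities (already in 0.0-1.0) to an input vector,
    producing the extended feature vector fed to the neural network."""
    assert all(0.0 <= v <= 1.0 for v in aept_values), "AEPT values must be probabilities"
    return list(raw_features) + list(aept_values)
```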
Some of the most naive analysis of systems was done using the concept of process limits, which were basically thresholds or boundaries of min and max values for some parameter.
Trying to make predictions based on such limits didn't work particularly well. But with AEPTs, a lot of that complexity becomes simple.
There is a machine which produces ball bearings with a radius of 20mm.
10% of the time it makes ones with a radius of 18mm.
Now we can simply say: if the probability of 18mm ball bearings goes above 14%, then halt the machine and realign the laser or change the laser driver motor (or whatever).
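The halting rule above can be sketched directly. The 18mm radius and 14% threshold come from the text; the function name is mine:

```python
def should_halt(radii_mm, off_spec_radius=18, threshold=0.14):
    """Halt the machine if the observed probability of off-spec (18mm)
    bearings exceeds the 14% threshold."""
    p_off_spec = sum(1 for r in radii_mm if r == off_spec_radius) / len(radii_mm)
    return p_off_spec > threshold
```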
Audio, video, speech, vision, and images are very natural use cases for extended modeling with AEPTs.
So are all kinds of physical systems.
In some cases we can also apply this to cyber-systems monitoring and security.
This is our website http://automatski.com