For a 0-1 signal with static probability p and transition probability T, the transition frequency is f = T/2 = p(1-p)
Static probability is the independent variable
that characterizes the signal
All other properties can be derived from the
static probability
E.g., the average length of a run of consecutive 1s in a 0-1 signal: the probability of observing i consecutive 1s terminated by a 0 is p^i(1-p), thus the average run length is Σ_i i·p^i·(1-p) = p/(1-p)
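As a sanity check, here is a small Python sketch (the function names and the Monte Carlo comparison are illustrative, not part of the slides) comparing the closed-form average run length p/(1-p) with a simulated 0-1 signal:

```python
import random

def avg_run_length_closed_form(p):
    # Sum over i of i * p^i * (1-p) for runs of 1s terminated by a 0  ->  p / (1-p)
    return p / (1.0 - p)

def avg_run_length_simulated(p, n_bits=1_000_000, seed=0):
    # Generate a random 0-1 signal with static probability p and average the
    # lengths of the runs of consecutive 1s terminated by each 0.
    rng = random.Random(seed)
    runs, current = [], 0
    for _ in range(n_bits):
        if rng.random() < p:
            current += 1
        else:
            runs.append(current)
            current = 0
    return sum(runs) / len(runs)

if __name__ == "__main__":
    p = 0.7
    print(avg_run_length_closed_form(p))   # 2.333...
    print(avg_run_length_simulated(p))     # close to 2.333 for a long signal
```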
Conditional Prob. and frequency
Current state depends on the preceding state
Consider tossing one of two coins: if the previous state is 0, toss coin0; if it is 1, toss coin1 to determine the current state.
Define p01 (p11) as the conditional probability that the current state is 1 given that the previous state is 0 (1).
Also define p00 (p10) as the conditional probability that the current state is 0 given that the previous state is 0 (1).
Let p1 be the static probability of logic 1; then p0 = 1 - p1.
For a time-homogeneous 0-1 sequence we have six variables: p01, p00, p11, p10, p1, p0.
Not all of them are independent; there are only two independent variables.
The other variables can be derived from the two independent ones.
Partition the variables into the sets {p01, p00}, {p11, p10}, {p1, p0}.
Choosing any two variables from two different sets, all the other variables can be found.
Transition probability: T = p0·p01 + p1·p10, the probability that consecutive samples differ; the transition frequency is f = T/2.
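A minimal Python sketch of these relations (function and variable names are mine): given one variable from {p1, p0} and one from {p01, p00}, it derives the rest and the transition probability T using the steady-state balance p0·p01 = p1·p10:

```python
def derive_signal_stats(p1, p01):
    """Derive the remaining statistics of a time-homogeneous 0-1 sequence
    from the static probability p1 and the conditional probability
    p01 = P(current = 1 | previous = 0).  Requires p1 > 0."""
    p0 = 1.0 - p1
    p00 = 1.0 - p01
    # Steady-state balance of the two-state Markov chain: p0 * p01 = p1 * p10
    p10 = p0 * p01 / p1
    p11 = 1.0 - p10
    # Transition probability: chance that two consecutive samples differ
    T = p0 * p01 + p1 * p10
    f = T / 2.0                      # transition frequency, as on the earlier slide
    return dict(p0=p0, p00=p00, p10=p10, p11=p11, T=T, f=f)

if __name__ == "__main__":
    # Uncorrelated signal: p01 = p1, so T = 2*p1*(1-p1) and f = p1*(1-p1)
    print(derive_signal_stats(p1=0.5, p01=0.5))   # T = 0.5, f = 0.25
    print(derive_signal_stats(p1=0.8, p01=0.4))
```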
Word-Level and Bit-Level Statistics
In DSP systems the statistical behaviour of the data stream (e.g., data rate, sample correlation) is usually known.
Word-level data-stream statistics can be used for architecture-level power estimation.
In general:
The static probability pi of bit i is the sum of the probabilities of all the words that have bit i = 1 in their binary representation, i.e.
pi = Σ_{w : bit i of w = 1} P(w)
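For illustration, a short Python sketch (the 3-bit word distribution is made up) that computes bit-level static probabilities from a word-level probability distribution in exactly this way:

```python
def bit_probabilities(word_probs, n_bits):
    """Static probability p_i of bit i = sum of the probabilities of all
    words whose binary representation has bit i equal to 1."""
    p = [0.0] * n_bits
    for word, prob in word_probs.items():
        for i in range(n_bits):
            if (word >> i) & 1:
                p[i] += prob
    return p

if __name__ == "__main__":
    # Hypothetical 3-bit word-level distribution (probabilities sum to 1)
    word_probs = {0b000: 0.10, 0b001: 0.25, 0b011: 0.30, 0b100: 0.20, 0b111: 0.15}
    print(bit_probabilities(word_probs, n_bits=3))
    # bit 0 is set in words 001, 011, 111 -> p_0 = 0.25 + 0.30 + 0.15 = 0.70
```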
Probabilistic Power Analysis
Boolean difference: ∂y/∂xi = y|xi=1 ⊕ y|xi=0; P(∂y/∂xi = 1) is the probability that a change on input xi propagates to the output y.
Output transition density: D(y) = Σi P(∂y/∂xi = 1)·D(xi), assuming independent inputs.
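The following Python sketch (names and the exhaustive enumeration are mine; it is only practical for small fan-ins) evaluates the Boolean-difference probability by brute force and propagates transition densities with the formula above:

```python
from itertools import product

def boolean_difference_prob(f, i, n, p):
    """P(df/dx_i = 1): probability that toggling input i toggles the output f,
    assuming independent inputs with static probabilities p[j]."""
    total = 0.0
    for bits in product((0, 1), repeat=n):
        if bits[i] == 1:
            continue                      # enumerate the other n-1 inputs once
        lo, hi = list(bits), list(bits)
        lo[i], hi[i] = 0, 1
        if f(lo) != f(hi):
            w = 1.0                       # probability of this side-input assignment
            for j in range(n):
                if j != i:
                    w *= p[j] if bits[j] else (1.0 - p[j])
            total += w
    return total

def output_transition_density(f, n, p, d):
    """Propagation formula: D(y) = sum_i P(df/dx_i = 1) * D(x_i)."""
    return sum(boolean_difference_prob(f, i, n, p) * d[i] for i in range(n))

if __name__ == "__main__":
    AND2 = lambda x: x[0] & x[1]
    p = [0.5, 0.5]    # static probabilities of inputs a, b
    d = [0.2, 0.3]    # transition densities of inputs a, b
    # dy/da = b, dy/db = a  ->  D(y) = 0.5*0.2 + 0.5*0.3 = 0.25
    print(output_transition_density(AND2, 2, p, d))
```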
Entropy-based power estimation: 1996 onwards
Basics of Entropy
Suppose a system has m possible events {E1, ..., Em},
where each event Ei occurs with probability pi
and p1 + p2 + ... + pm = 1.
The information content Ci of an event Ei is given by Ci = log2(1/pi).
As pi → 0, Ci → ∞; as pi → 1, Ci → 0.
The unit of Ci is referred to as the bit.
The entropy H of the system is H = Σ_{i=1..m} pi·log2(1/pi).
For an n-bit signal X with m = 2^n possible values, H(X) = Σ_{i=0..2^n-1} pi·log2(1/pi).
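A small Python sketch of these two definitions (function names are mine):

```python
import math

def information_content(p_i):
    """C_i = log2(1/p_i), in bits."""
    return math.log2(1.0 / p_i)

def entropy(probs):
    """H = sum_i p_i * log2(1/p_i); events with p_i = 0 contribute nothing."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

if __name__ == "__main__":
    print(information_content(0.25))            # 2 bits
    print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits
```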
For large n , computing H(X) is difficult
Assume each signal sj is independent of the others and has static probability pj
Each bit is an independent discrete event system
with probabilities {pj, (1-pj)}
Therefore H(X) = Σ_{j=1..n} [ pj·log2(1/pj) + (1-pj)·log2(1/(1-pj)) ]
Max{H(X)} = log2 m = n bits, attained when every value of X is equally likely, i.e., pj = 0.5 for every bit.
In general H(X) is low when the pj deviate from 0.5.
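A short Python sketch of the bit-independence approximation (function names are mine), showing the maximum of n bits at pj = 0.5 and the drop when pj moves away from 0.5:

```python
import math

def bit_entropy(p):
    """Entropy of a single bit with static probability p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return p * math.log2(1.0 / p) + (1.0 - p) * math.log2(1.0 / (1.0 - p))

def word_entropy_independent_bits(bit_probs):
    """H(X) assuming the bits are independent: sum of per-bit entropies."""
    return sum(bit_entropy(p) for p in bit_probs)

if __name__ == "__main__":
    print(word_entropy_independent_bits([0.5] * 8))   # maximum: 8.0 bits
    print(word_entropy_independent_bits([0.9] * 8))   # lower: about 3.75 bits
```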
Power Estimation with Entropy
Consider that the switching frequency of a bit with static probability p is f = p(1-p), which, like the bit entropy, peaks at p = 0.5 and vanishes at p = 0 and p = 1, so switching activity tracks signal entropy.
Therefore the average power can be estimated as proportional to the input and output entropies H(X) and H(Y).
The proportionality constant depends on
operating voltage
average node capacitance
H(X), H(Y) are obtained by monitoring the signals during a high-level simulation.
In general H(Y) ≤ H(X), since there are fewer o/ps than i/ps.
For all combinational gates, H(Y) ≤ H(X).
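A hedged sketch of how such an estimate could be wired up, assuming switching activity is modelled as proportional to the average of H(X) and H(Y); the averaging, the fitting constant k, and the numbers in the example are assumptions, not values from the slides:

```python
def entropy_power_estimate(h_in, h_out, c_avg, vdd, f_clk, k=1.0):
    """Entropy-based power model (sketch): switching activity is assumed
    proportional to the average of input and output entropies; k is an
    empirical fitting constant absorbing implementation details."""
    activity = k * (h_in + h_out) / 2.0
    return activity * c_avg * vdd ** 2 * f_clk

if __name__ == "__main__":
    # Illustrative numbers only: entropies from a high-level simulation
    print(entropy_power_estimate(h_in=14.2, h_out=11.8,
                                 c_avg=20e-15, vdd=1.0, f_clk=500e6))
```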
Drawbacks of Entropy Estimation method
Based on an empirically observed average phenomenon; specific classes of circuits don't obey the result.
Absolute accuracy of power estimation is poor
due to lack of implementation details.
Only applicable to large combinational circuits with a high degree of randomness.
Power estimation fails for circuits with high
regularity or signals with high correlation
Extension to Sequential Circuits
From high level simulation obtain i/p and o/p entropies of the
combinational circuits.
Apply entropy-based power estimation to the combinational circuits and add the power dissipation of the sequential elements.
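A hedged Python sketch of this procedure; the per-block power model, the per-register energy figure, and all numbers are placeholders rather than values from the slides:

```python
def sequential_power_estimate(comb_blocks, n_registers, e_reg_per_cycle, f_clk, k=1.0):
    """Sketch: sum an entropy-based estimate over the combinational blocks
    (each block supplies its i/p and o/p entropies, average node capacitance
    and supply voltage from high-level simulation), then add the switching
    power of the sequential elements (registers)."""
    p_comb = 0.0
    for blk in comb_blocks:
        activity = k * (blk["h_in"] + blk["h_out"]) / 2.0
        p_comb += activity * blk["c_avg"] * blk["vdd"] ** 2 * f_clk
    p_seq = n_registers * e_reg_per_cycle * f_clk
    return p_comb + p_seq

if __name__ == "__main__":
    blocks = [dict(h_in=14.2, h_out=11.8, c_avg=20e-15, vdd=1.0),
              dict(h_in=9.5,  h_out=8.0,  c_avg=12e-15, vdd=1.0)]
    print(sequential_power_estimate(blocks, n_registers=64,
                                    e_reg_per_cycle=5e-15, f_clk=500e6))
```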