Documentation ¶
Index ¶
- func ConditionalEntropy(data [][]int, log lnFunc) []float64
- func ConditionalEntropyBase2(data [][]int) []float64
- func ConditionalEntropyBaseE(data [][]int) []float64
- func ConditionalMutualInformation(xyz [][]int, ln lnFunc) []float64
- func ConditionalMutualInformationBase2(xyz [][]int) []float64
- func ConditionalMutualInformationBaseE(xyz [][]int) []float64
- func Entropy(data []int, ln lnFunc) []float64
- func EntropyBase2(data []int) []float64
- func EntropyBaseE(data []int) []float64
- func MutualInformation(data [][]int, log lnFunc) []float64
- func MutualInformationBase2(data [][]int) []float64
- func MutualInformationBaseE(data [][]int) []float64
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func ConditionalEntropy ¶
ConditionalEntropy calculates the conditional entropy of a probability distribution. It takes the log function as an additional parameter so that the base can be chosen:
H(X|Y) = -\sum_{x,y} p(x,y) lnFunc(p(x,y)/p(y))
func ConditionalEntropyBase2 ¶
ConditionalEntropyBase2 calculates the conditional entropy of a probability distribution in bits
H(X|Y) = -\sum_{x,y} p(x,y) log2(p(x,y)/p(y))
func ConditionalEntropyBaseE ¶
ConditionalEntropyBaseE calculates the conditional entropy of a probability distribution in nats
H(X|Y) = -\sum_{x,y} p(x,y) ln(p(x,y)/p(y))
func ConditionalMutualInformation ¶
ConditionalMutualInformation calculates the conditional mutual information with the given lnFunc function for each triple (x_t, y_t, z_t):
I(X_t,Y_t|Z_t) = lnFunc(p(x_t,y_t|z_t)) - lnFunc(p(x_t|z_t)p(y_t|z_t))
func ConditionalMutualInformationBase2 ¶
ConditionalMutualInformationBase2 calculates the conditional mutual information with base 2
I(X,Y|Z) = \sum_{x,y,z} p(x,y,z) (log2(p(x,y|z)) - log2(p(x|z)p(y|z)))
func ConditionalMutualInformationBaseE ¶
ConditionalMutualInformationBaseE calculates the conditional mutual information with base e
I(X,Y|Z) = \sum_{x,y,z} p(x,y,z) (ln(p(x,y|z)) - ln(p(x|z)p(y|z)))
func Entropy ¶
Entropy calculates the entropy of a probability distribution. It takes the log function as an additional parameter so that the base can be chosen:
H(X) = -\sum_x p(x) lnFunc(p(x))
func EntropyBase2 ¶
EntropyBase2 calculates the entropy of a probability distribution with base 2
H(X) = -\sum_x p(x) log2(p(x))
func EntropyBaseE ¶
EntropyBaseE calculates the entropy of a probability distribution with base e
H(X) = -\sum_x p(x) ln(p(x))
func MutualInformation ¶
MutualInformation calculates the mutual information for each state using the given lnFunc function:
I(X,Y) = \sum_{x,y} p(x,y) (lnFunc(p(x,y)) - lnFunc(p(x)p(y)))
func MutualInformationBase2 ¶
MutualInformationBase2 calculates the mutual information for each state with base 2
I(X,Y) = \sum_{x,y} p(x,y) (log2(p(x,y)) - log2(p(x)p(y)))
func MutualInformationBaseE ¶
MutualInformationBaseE calculates the mutual information for each state with base e
I(X,Y) = \sum_{x,y} p(x,y) (ln(p(x,y)) - ln(p(x)p(y)))
Types ¶
This section is empty.
Source Files ¶
- ConditionalEntropy.go
- ConditionalMutualInformation.go
- Entropy.go
- MutualInformation.go
- defs.go