state

package
v0.0.0-...-4977366
Warning: This package is not in the latest version of its module.
Published: Apr 3, 2019 License: MIT Imports: 2 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func ConditionalEntropy

func ConditionalEntropy(data [][]int, log lnFunc) []float64

ConditionalEntropy calculates the conditional entropy of a probability distribution. It takes the log function as an additional parameter so that the base can be chosen.

H(X|Y) = -\sum_{x,y} p(x,y) lnFunc(p(x,y)/p(y))

func ConditionalEntropyBase2

func ConditionalEntropyBase2(data [][]int) []float64

ConditionalEntropyBase2 calculates the conditional entropy of a probability distribution in bits.

H(X|Y) = -\sum_{x,y} p(x,y) log2(p(x,y)/p(y))

func ConditionalEntropyBaseE

func ConditionalEntropyBaseE(data [][]int) []float64

ConditionalEntropyBaseE calculates the conditional entropy of a probability distribution in nats.

H(X|Y) = -\sum_{x,y} p(x,y) ln(p(x,y)/p(y))

func ConditionalMutualInformation

func ConditionalMutualInformation(xyz [][]int, ln lnFunc) []float64

ConditionalMutualInformation calculates the conditional mutual information with the given lnFunc function for each state (x_t, y_t, z_t).

I(X_t,Y_t|Z_t) = lnFunc(p(x_t,y_t|z_t)) - lnFunc(p(x_t|z_t)p(y_t|z_t))

func ConditionalMutualInformationBase2

func ConditionalMutualInformationBase2(xyz [][]int) []float64

ConditionalMutualInformationBase2 calculates the conditional mutual information with base 2.

I(X,Y|Z) = \sum_{x,y,z} p(x,y,z) (log2(p(x,y|z)) - log2(p(x|z)p(y|z)))

func ConditionalMutualInformationBaseE

func ConditionalMutualInformationBaseE(xyz [][]int) []float64

ConditionalMutualInformationBaseE calculates the conditional mutual information with base e.

I(X,Y|Z) = \sum_{x,y,z} p(x,y,z) (ln(p(x,y|z)) - ln(p(x|z)p(y|z)))

func Entropy

func Entropy(data []int, ln lnFunc) []float64

Entropy calculates the entropy of a probability distribution. It takes the log function as an additional parameter so that the base can be chosen.

H(X) = -\sum_x p(x) lnFunc(p(x))

func EntropyBase2

func EntropyBase2(data []int) []float64

EntropyBase2 calculates the entropy of a probability distribution with base 2.

H(X) = -\sum_x p(x) log2(p(x))

func EntropyBaseE

func EntropyBaseE(data []int) []float64

EntropyBaseE calculates the entropy of a probability distribution with base e.

H(X) = -\sum_x p(x) ln(p(x))

func MutualInformation

func MutualInformation(data [][]int, log lnFunc) []float64

MutualInformation calculates the mutual information for each state with the given lnFunc function.

I(X,Y) = \sum_{x,y} p(x,y) (lnFunc(p(x,y)) - lnFunc(p(x)p(y)))

func MutualInformationBase2

func MutualInformationBase2(data [][]int) []float64

MutualInformationBase2 calculates the mutual information for each state with base 2.

I(X,Y) = \sum_{x,y} p(x,y) (log2(p(x,y)) - log2(p(x)p(y)))

func MutualInformationBaseE

func MutualInformationBaseE(data [][]int) []float64

MutualInformationBaseE calculates the mutual information for each state with base e.

I(X,Y) = \sum_{x,y} p(x,y) (ln(p(x,y)) - ln(p(x)p(y)))

Types

This section is empty.

Source Files

  • ConditionalEntropy.go
  • ConditionalMutualInformation.go
  • Entropy.go
  • MutualInformation.go
  • defs.go

