Calculates the mutual information I(X;Y) between two random variables, measuring the reduction in uncertainty about one variable gained from knowledge of the other. Mutual information is symmetric, always non-negative, and captures both linear and non-linear dependencies, making it more general than linear correlation measures (such as Pearson's r) for detecting complex relationships.
Use Cases: feature selection, detecting non-linear dependencies that correlation misses, quantifying shared information between signals.
Formula: I(X;Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X|Y)
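The formula above can be sketched for discrete samples in plain Python using only the standard library. This is an illustrative estimate via empirical frequencies, not the API's own implementation; the function names are ours:

```python
from collections import Counter
from math import log2

def entropy(counts, n):
    """Shannon entropy H in bits from a frequency table of n samples."""
    return -sum(c / n * log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """Estimate I(X;Y) = H(X) + H(Y) - H(X,Y) from paired discrete samples."""
    n = len(xs)
    h_x = entropy(Counter(xs), n)
    h_y = entropy(Counter(ys), n)
    h_xy = entropy(Counter(zip(xs, ys)), n)  # joint entropy H(X,Y)
    return h_x + h_y - h_xy

# Perfectly dependent variables: I(X;Y) = H(X) = 1 bit
print(mutual_information([0, 0, 1, 1], [1, 1, 0, 0]))  # 1.0

# Independent variables: I(X;Y) = 0 bits
print(mutual_information([0, 0, 1, 1], [0, 1, 0, 1]))  # 0.0
```

Note that plug-in estimates like this are biased upward for small samples; production estimators typically apply bias corrections.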
Credits: 5 credits per request (Pro tier); 10 credits per request (Enterprise tier)
API key for authentication. Get your key at https://api.fincept.in/auth/register