Search results

The Symbol Frequency Table Calculator
The Symbol Frequency Table Calculator takes a text message and produces a table of the frequencies of each symbol (i.e., character) that appears in it. The calculator can be configured to ignore spaces and/or case, allowing for a more accurate analysis of the message's symbol frequencies.
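A frequency table like this can be sketched in Python as follows; the function name and option flags are illustrative, not the calculator's actual interface.

```python
from collections import Counter

def symbol_frequencies(message, ignore_spaces=False, ignore_case=False):
    # Optionally normalize the message before counting.
    if ignore_case:
        message = message.lower()
    if ignore_spaces:
        message = message.replace(" ", "")
    counts = Counter(message)
    total = len(message)
    # Map each symbol to its count and relative frequency.
    return {sym: (n, n / total) for sym, n in counts.items()}
```

For example, `symbol_frequencies("Aa b", ignore_spaces=True, ignore_case=True)` counts two `a`s and one `b` over three remaining characters.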
Shannon Entropy
This online calculator computes Shannon entropy for a given event probability table and for a given message.
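Both inputs the calculator accepts reduce to the same formula, H = -Σ pᵢ log₂ pᵢ; for a message, the probabilities are the relative symbol frequencies. A minimal sketch (function names are illustrative):

```python
from collections import Counter
from math import log2

def entropy_from_probs(probs):
    # H = -sum(p * log2(p)), skipping zero-probability events.
    return -sum(p * log2(p) for p in probs if p > 0)

def entropy_of_message(message):
    # Use each symbol's relative frequency as its probability.
    counts = Counter(message)
    n = len(message)
    return entropy_from_probs(c / n for c in counts.values())
```

A fair coin, `entropy_from_probs([0.5, 0.5])`, yields 1 bit; four equiprobable outcomes yield 2 bits.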
Huffman coding
This online calculator generates Huffman coding based on a set of symbols and their probabilities. A brief description of Huffman coding is below the calculator.
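The standard construction merges the two least probable nodes until one tree remains; a heap-based sketch (this is the textbook algorithm, not necessarily the calculator's exact implementation):

```python
import heapq
from itertools import count

def huffman_codes(prob_table):
    # prob_table: dict mapping symbol -> probability.
    tick = count()  # tie-breaker so heap tuples never compare dicts
    heap = [(p, next(tick), {sym: ""}) for sym, p in prob_table.items()]
    heapq.heapify(heap)
    if len(heap) == 1:
        # Degenerate case: a single symbol gets the code "0".
        (_, _, codes), = heap
        return {sym: "0" for sym in codes}
    while len(heap) > 1:
        # Merge the two least probable subtrees.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tick), merged))
    return heap[0][2]
```

For probabilities {a: 0.5, b: 0.25, c: 0.25}, the most probable symbol gets a 1-bit code and the other two get 2-bit codes.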
Wordpress plugin
With the PlanetCalc plugin for WordPress, you can easily insert any PlanetCalc calculator into the pages of your WordPress-based site or blog.
Hartley Information Calculator
The Hartley Information Calculator determines the amount of information contained in a message with length n using Hartley's formula.
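Hartley's formula gives I = n · log s for a length-n message over an alphabet of s equiprobable symbols. Using log base 2 expresses the result in bits (Hartley's original base-10 form gives hartleys); a sketch:

```python
from math import log2

def hartley_information(alphabet_size, length):
    # Hartley's formula: I = n * log2(s) bits, assuming all s
    # symbols of the alphabet are equally likely.
    return length * log2(alphabet_size)
```

An 8-character binary message carries 8 bits; a 3-character message over a 4-symbol alphabet carries 6 bits.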
Text compression using Huffman coding
This online calculator compresses the entered text using Huffman coding. It also displays the generated Huffman codes for reference.
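Once a prefix-free code table is built (for example, by a Huffman coder), compression and decompression are straightforward; a sketch assuming such a table is given:

```python
def encode(message, codes):
    # Concatenate the code of each symbol in order.
    return "".join(codes[ch] for ch in message)

def decode(bits, codes):
    # Walk the bit string, matching prefix-free codes greedily.
    inverse = {code: sym for sym, code in codes.items()}
    out, cur = [], ""
    for bit in bits:
        cur += bit
        if cur in inverse:
            out.append(inverse[cur])
            cur = ""
    return "".join(out)
```

With the table {a: "0", b: "10", c: "11"}, the message "abca" encodes to the 6-bit string "010110" and decodes back exactly, because no code is a prefix of another.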
Shannon Coding Calculator
Generate Shannon coding for a set of symbols based on their probabilities.
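In Shannon coding, symbols are sorted by decreasing probability, each symbol gets a code of length ⌈log₂(1/p)⌉, and the code bits are the leading bits of the binary expansion of the cumulative probability of the preceding symbols. A sketch of that construction:

```python
from math import ceil, log2

def shannon_codes(prob_table):
    # Sort symbols by decreasing probability.
    symbols = sorted(prob_table, key=prob_table.get, reverse=True)
    codes, cumulative = {}, 0.0
    for sym in symbols:
        p = prob_table[sym]
        length = ceil(-log2(p))  # code length l = ceil(log2(1/p))
        # Take the first l bits of the binary expansion of the
        # cumulative probability of the preceding symbols.
        frac, bits = cumulative, ""
        for _ in range(length):
            frac *= 2
            bits += "1" if frac >= 1 else "0"
            frac -= int(frac)
        codes[sym] = bits
        cumulative += p
    return codes
```

For dyadic probabilities {a: 0.5, b: 0.25, c: 0.125, d: 0.125} this yields the codes 0, 10, 110, 111.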
Shannon-Fano coding calculator
This online calculator generates Shannon-Fano coding based on a set of symbols and their probabilities.
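Shannon-Fano coding works top-down: sort symbols by decreasing probability, split the list where the two halves' probability totals are as equal as possible, assign 0 to one half and 1 to the other, and recurse. A sketch of that scheme:

```python
def shannon_fano_codes(prob_table):
    # Sort symbols by decreasing probability.
    symbols = sorted(prob_table, key=prob_table.get, reverse=True)
    codes = {}

    def split(group, prefix):
        if len(group) == 1:
            codes[group[0]] = prefix or "0"
            return
        # Find the split point that balances the two halves' totals.
        total = sum(prob_table[s] for s in group)
        acc, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            acc += prob_table[group[i - 1]]
            diff = abs(2 * acc - total)
            if diff < best_diff:
                best_i, best_diff = i, diff
        split(group[:best_i], prefix + "0")
        split(group[best_i:], prefix + "1")

    split(symbols, "")
    return codes
```

On dyadic probabilities {a: 0.5, b: 0.25, c: 0.125, d: 0.125} the splits are exact and the result matches Huffman's: 0, 10, 110, 111.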
Conditional entropy
This online calculator calculates the entropy of random variable Y conditioned on random variable X, and of X conditioned on Y, given a joint distribution table (X, Y) ~ p.
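Conditional entropy follows from the joint table via H(Y|X) = -Σ p(x,y) log₂(p(x,y)/p(x)); a sketch assuming the table is a dict keyed by (x, y) pairs:

```python
from math import log2

def conditional_entropy(joint):
    # joint: dict mapping (x, y) -> p(x, y).
    # H(Y|X) = -sum p(x,y) * log2(p(x,y) / p(x)).
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return -sum(p * log2(p / px[x])
                for (x, _), p in joint.items() if p > 0)
```

H(X|Y) is obtained by the same function after swapping the key order: `conditional_entropy({(y, x): p for (x, y), p in joint.items()})`. For independent fair bits, H(Y|X) = 1 bit; when Y = X, it is 0.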
Specific Conditional Entropy
This online calculator calculates the entropy of random variable Y conditioned on a specific value of random variable X, and of X conditioned on a specific value of Y, given a joint distribution table (X, Y) ~ p.
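The specific conditional entropy H(Y|X = x) is just the entropy of Y's conditional distribution p(y|x), obtained by normalizing the row of the joint table where X = x; a sketch:

```python
from math import log2

def specific_conditional_entropy(joint, x_value):
    # joint: dict mapping (x, y) -> p(x, y).
    # H(Y | X = x): entropy of the normalized row where X = x.
    row = {y: p for (x, y), p in joint.items() if x == x_value}
    px = sum(row.values())  # marginal p(X = x)
    return -sum((p / px) * log2(p / px)
                for p in row.values() if p > 0)
```

In the table {(a,0): 0.25, (a,1): 0.25, (b,0): 0.5}, Y is a fair bit given X = a (1 bit of entropy) but deterministic given X = b (0 bits).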
Joint Entropy
This online calculator calculates the joint entropy of two discrete random variables given a joint distribution table (X, Y) ~ p.
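Joint entropy treats the pair (X, Y) as a single random variable, so it is plain Shannon entropy over the cells of the joint table; a sketch:

```python
from math import log2

def joint_entropy(joint):
    # joint: dict mapping (x, y) -> p(x, y).
    # H(X, Y) = -sum p(x, y) * log2 p(x, y) over all cells.
    return -sum(p * log2(p) for p in joint.values() if p > 0)
```

Four equiprobable cells give H(X, Y) = 2 bits, the maximum for a 2×2 table.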
Decision tree builder
This online calculator builds a decision tree from a training set using the Information Gain metric.
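The core of an Information Gain tree builder (as in ID3) is scoring each attribute: gain = H(labels) minus the weighted entropy of the labels after splitting on that attribute. A sketch of that scoring step (not the full tree construction):

```python
from collections import Counter
from math import log2

def entropy(labels):
    # Shannon entropy of a list of class labels.
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    # Gain = H(labels) - sum over attribute values of
    # (subset weight) * H(labels restricted to that subset).
    base = entropy(labels)
    subsets = {}
    for row, label in zip(rows, labels):
        subsets.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(sub) / len(labels) * entropy(sub)
                    for sub in subsets.values())
    return base - remainder
```

An attribute that perfectly separates the classes gains the full entropy of the labels; one whose values leave the class mix unchanged gains 0, so the builder picks the highest-gain attribute at each node.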