calculate entropy in swift

Entropy can be calculated in Swift by first building the probability distribution of a set of symbols from their relative frequencies, and then computing the Shannon entropy of that distribution.
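For reference, the Shannon entropy of a discrete distribution, measured in bits, is

H(X) = -Σ p(x) · log2(p(x))

where the sum runs over the distinct symbols x and p(x) is the probability (relative frequency) of symbol x.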

Here's an example function that takes in an array of symbols and outputs the entropy using the Shannon entropy formula:

main.swift
import Foundation

func entropy(of symbols: [String]) -> Double {
    // Guard against division by zero: treat an empty input as zero entropy.
    guard !symbols.isEmpty else { return 0 }
    let totalCount = Double(symbols.count)
    // Count how many times each distinct symbol occurs.
    let frequencies = symbols.reduce(into: [String: Int]()) { counts, symbol in
        counts[symbol, default: 0] += 1
    }
    // Turn the counts into probabilities.
    let probabilities = frequencies.values.map { Double($0) / totalCount }
    // Shannon entropy: the sum of -p * log2(p) over all probabilities.
    return probabilities.reduce(0.0) { result, p in
        result - p * log2(p)
    }
}

The function first counts the frequency of each symbol in the input array, then converts those counts into a probability distribution. Finally, it computes the Shannon entropy of the distribution by summing -p · log2(p) over each probability p.
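Nothing in the algorithm is specific to String, so the same function can also be written generically for any Hashable symbol type. Here's a minimal sketch of such a variant (an illustration of the same counting approach, not part of the original example):

import Foundation

// Hypothetical generic variant of entropy(of:) for any Hashable symbol type.
func entropy<Symbol: Hashable>(of symbols: [Symbol]) -> Double {
    guard !symbols.isEmpty else { return 0 }
    let total = Double(symbols.count)
    // Count occurrences of each distinct symbol.
    let counts = symbols.reduce(into: [Symbol: Int]()) { $0[$1, default: 0] += 1 }
    // Accumulate -p * log2(p) for each symbol's probability p.
    return counts.values.reduce(0.0) { sum, count in
        let p = Double(count) / total
        return sum - p * log2(p)
    }
}

With this variant, calls like entropy(of: [1, 0, 0, 1, 1]) work for integer symbols as well.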

Here's an example usage:

main.swift
let symbols = ["A", "B", "A", "C", "B", "B", "A"]
let entropyValue = entropy(of: symbols)
print("Entropy: \(entropyValue)") // prints "Entropy: 1.5219280948873621"

In this example, the input symbols are ["A", "B", "A", "C", "B", "B", "A"] and the calculated entropy is approximately 1.4488 bits.
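To check the value by hand: A and B each occur 3 times and C occurs once among the 7 symbols, so the probabilities are 3/7, 3/7, and 1/7, giving

H = -(3/7)·log2(3/7) - (3/7)·log2(3/7) - (1/7)·log2(1/7) ≈ 0.5239 + 0.5239 + 0.4011 ≈ 1.4488 bits.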
