- A communications channel in which the effects of random influences are negligible -> no random error
- The receiver gets the message from the source without any noise
- They don't exist in the real world (there's always some random error)

PROFIT!

But the source can't send the message as is, so...

ALL our messages!

- The channel's alphabet ->
`A = {0, 1}`

- Our message(s) ->
`S = {a1, a2, ..., an}`

- Our messages' probability distribution ->
`P = {p1, ..., pn}`, where `pi = P(ai)`

- Our code ->
`C = {c1, ..., cn}`, where `ci = encoding(ai, A)`
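As a minimal sketch of these definitions (the messages and probabilities here are made up for illustration, not taken from the slides):

```python
A = {'0', '1'}              # the channel's alphabet
S = ['a1', 'a2', 'a3']      # our messages
P = [0.5, 0.3, 0.2]         # probability distribution: P[i] = P(S[i])

# A valid distribution assigns one probability per message, summing to 1
assert len(S) == len(P)
assert abs(sum(P) - 1.0) < 1e-9
```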

# The Huffman method for a binary alphabet

# Huffman method: properties

**Unique decodability:** there's no possibility of confusion when decoding
- There's no error control
- The more probable the message, the shorter the encoded message

- Sort our messages by probability (from higher to lower)
- Group the two least probable messages:
  - sum their probabilities
  - give a '1' to the less probable message
  - give a '0' to the more probable message
- Go back to the first step

Note: you can skip the first step, but sorting the messages is easier
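The steps above can be sketched in Python. This is a minimal illustration, not a production implementation; the example distribution at the end is made up:

```python
def huffman(probs):
    """Binary Huffman coding, following the steps above: repeatedly
    merge the two least probable nodes, giving '1' to the less
    probable one and '0' to the more probable one."""
    # one node per message: (probability, {message: codeword-so-far})
    nodes = [(p, {sym: ""}) for sym, p in probs.items()]
    while len(nodes) > 1:
        # step 1: sort from higher to lower probability
        nodes.sort(key=lambda node: node[0], reverse=True)
        # step 2: take the two least probable nodes (the last two)
        p_more, codes_more = nodes.pop(-2)  # more probable of the pair
        p_less, codes_less = nodes.pop(-1)  # less probable of the pair
        # prepend '0' to the more probable branch, '1' to the less probable
        merged = {s: "0" + c for s, c in codes_more.items()}
        merged.update({s: "1" + c for s, c in codes_less.items()})
        nodes.append((p_more + p_less, merged))
    return nodes[0][1]

codes = huffman({'a': 0.5, 'b': 0.25, 'c': 0.25})
print(codes)  # the most probable message 'a' gets the shortest codeword
```

Note how the loop body is exactly the grouping step, and appending the merged node before re-sorting is the "go back to the first step" part.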

A -> 010, B -> 00, C -> 10, D -> 1110,

E -> 011, F -> 1111, G -> 110
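We can check the unique decodability property on this code table: no codeword is a prefix of another, so a bit stream can be decoded left to right without ambiguity (the message `"BAG"` below is just a sample input):

```python
# The code table from the slide
code = {'A': '010', 'B': '00', 'C': '10', 'D': '1110',
        'E': '011', 'F': '1111', 'G': '110'}

# Prefix-free: no codeword is a prefix of another one
words = list(code.values())
assert not any(a != b and b.startswith(a) for a in words for b in words)

def decode(bits, code):
    """Read bits left to right; emit a message as soon as a codeword matches."""
    reverse = {v: k for k, v in code.items()}
    out, buffer = [], ""
    for bit in bits:
        buffer += bit
        if buffer in reverse:
            out.append(reverse[buffer])
            buffer = ""
    return "".join(out)

encoded = "".join(code[s] for s in "BAG")  # '00' + '010' + '110'
print(decode(encoded, code))  # prints BAG
```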
