
Worksheet Frobenius Maps & a 2-Error Correcting Code

Today we will introduce the Frobenius maps on fields of characteristic \(p\) and apply them to help show that fields of all sizes \(p^n\) for prime \(p\) exist, and to modify the binary Hamming codes to produce a new class of \(2\)-error correcting codes.

1.

Use properties of binomial coefficients to show that if \(\char(F)=p\) then
\begin{equation*} (\alpha\pm \beta)^p=\alpha^p \pm \beta^p \end{equation*}
for any \(\alpha,\beta \in F\text{.}\)
Solution.
Since \(p\) is prime, \(\binom{p}{n}\) is a multiple of \(p\) for \(1\leq n \leq p-1\text{.}\) So when we expand \((\alpha\pm \beta)^p\) using the binomial theorem, every term except the first and last is zero in characteristic \(p\text{,}\) leaving \(\alpha^p+(\pm\beta)^p\text{.}\) Since \((-\beta)^p=-\beta^p\) when \(p\) is odd and \(-\beta=\beta\) when \(p=2\text{,}\) this equals \(\alpha^p\pm\beta^p\) in either case.
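As a quick numeric sanity check (not part of the proof), the short Python sketch below verifies both the divisibility claim and the resulting identity mod \(p\) for a sample prime; the variable names are ours.

```python
from math import comb

p = 7  # any prime will do

# every middle binomial coefficient C(p, n), 1 <= n <= p-1, is a multiple of p
assert all(comb(p, n) % p == 0 for n in range(1, p))

# hence the "freshman's dream" holds mod p: (a + b)^p = a^p + b^p
for a in range(p):
    for b in range(p):
        assert pow(a + b, p, p) == (pow(a, p, p) + pow(b, p, p)) % p
```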

2.

Explain why the previous problem shows that if \(\char(F)=p\) then
\begin{equation*} (\alpha\pm \beta)^{p^m}=\alpha^{p^m} \pm \beta^{p^m} \end{equation*}
for any \(\alpha,\beta \in F\text{.}\)
Solution.
This is the result of applying the previous problem \(m\) times.

3.

Explain why the previous problem shows that if \(\char(F)=p\) then the function \(f_m:F\to F\) defined by
\begin{equation*} f_m(\alpha)=\alpha^{p^m} \end{equation*}
is a field automorphism of \(F\text{,}\) i.e. it is a map from \(F\) to itself that preserves field addition, field multiplication, and has an inverse function. Such a map \(f_m\) is called a Frobenius mapping.
Solution.
By the previous problem, \(f_m\) preserves field addition. It also preserves field multiplication since \((\alpha\beta)^{p^m}=\alpha^{p^m}\beta^{p^m}\text{.}\) Finally, if \(|F|=p^n\) then \(f_m^{-1}=f_{n-m}\text{,}\) i.e. \(f_m^{-1}(\alpha)=\alpha^{p^{n-m}}\text{,}\) since
\begin{equation*} f_m(f_{n-m}(\alpha))=f_m(\alpha^{p^{n-m}})=(\alpha^{p^{n-m}})^{p^m}=\alpha^{p^n}=\alpha\text{,} \end{equation*}
where the last equality holds because every element of a field with \(p^n\) elements satisfies \(\alpha^{p^n}=\alpha\text{.}\)
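As a concrete illustration (not required by the problem), here is a sketch in plain Python of the Frobenius map \(f_1(\alpha)=\alpha^2\) on \(\GF(2^4)\text{,}\) with field elements stored as 4-bit integers and multiplication done modulo \(x^4+x+1\text{;}\) it checks that \(f_1\) is a bijection preserving addition and multiplication, and that \(f_3\) is its inverse (the case \(n=4\text{,}\) \(m=1\) of the solution). The helper names are ours.

```python
def gf_mul(a, b):
    """Multiply in GF(2^4) = F_2[x]/(x^4 + x + 1); elements are 4-bit ints."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b10000:          # reduce by x^4 + x + 1
            a ^= 0b10011
        b >>= 1
    return r

def frob(a, m=1):
    """f_m(a) = a^(2^m), computed by squaring m times."""
    for _ in range(m):
        a = gf_mul(a, a)
    return a

# f_1 is a bijection of the 16 field elements ...
assert sorted(frob(a) for a in range(16)) == list(range(16))
# ... it preserves addition (XOR) and multiplication ...
for a in range(16):
    for b in range(16):
        assert frob(a ^ b) == frob(a) ^ frob(b)
        assert frob(gf_mul(a, b)) == gf_mul(frob(a), frob(b))
# ... and f_3 inverts it: f_3(f_1(a)) = a^(2^4) = a
for a in range(16):
    assert frob(frob(a, 1), 3) == a
```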
Throughout the rest of the activity we will be considering our new construction of a code from the Hamming code with
\begin{equation*} H = \begin{bmatrix} \alpha_1 \amp \alpha_2 \amp \ldots \amp \alpha_{15}\\ f(\alpha_1) \amp f(\alpha_2) \amp \ldots \amp f(\alpha_{15}) \end{bmatrix} \end{equation*}
where we are treating the columns of the original parity check matrix (vectors in \(\F_2^4\)) as elements of the field \(\GF(2^4)\) (and rearranging them for simplicity).

4.

Show that, when we regard the columns of \(H\) in this way, choosing \(f(\alpha)=c\alpha\) for some constant \(c\) does not improve decoding capability, by considering the syndrome of a word with two errors.
\begin{equation*} \begin{pmatrix} s_1 \\ s_2\end{pmatrix}=\bs=H^i+H^j= \begin{pmatrix} \alpha_i \\ f(\alpha_i)\end{pmatrix}+ \begin{pmatrix} \alpha_j\\ f(\alpha_j)\end{pmatrix}\text{.} \end{equation*}
In other words, show that for this choice of \(f(\alpha)\) the system
\begin{equation*} \begin{cases} s_1 \amp = \alpha_i+\alpha_j \\ s_2\amp = f(\alpha_i)+f(\alpha_j) \end{cases} \end{equation*}
generally does not have a solution.
Solution.
In this case the right-hand sides of the equations are dependent, since \(f(\alpha_i)+f(\alpha_j)=c\alpha_i+c\alpha_j=c(\alpha_i+\alpha_j)=cs_1\text{.}\) So the system is consistent only when \(s_2=cs_1\text{,}\) and in that case the second equation just repeats the first; the added rows give us no new information.
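This dependence can also be confirmed by brute force over \(\GF(2^4)\text{.}\) The sketch below (plain Python; field elements stored as 4-bit integers and multiplied modulo \(x^4+x+1\text{;}\) helper names ours) checks that with \(f(\alpha)=c\alpha\) the second syndrome component is always \(c\) times the first.

```python
def gf_mul(a, b):
    # multiply in GF(2^4) = F_2[x]/(x^4 + x + 1); elements are 4-bit ints
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b10000:
            a ^= 0b10011
        b >>= 1
    return r

# for f(a) = c*a the second syndrome component is c times the first:
# f(a_i) + f(a_j) = c*(a_i + a_j), i.e. s_2 = c*s_1 for every error pair
for c in range(16):
    for ai in range(16):
        for aj in range(16):
            assert gf_mul(c, ai) ^ gf_mul(c, aj) == gf_mul(c, ai ^ aj)
```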

5.

Repeat this for the choice \(f(\alpha)=\alpha^2\text{.}\)
Solution.
In this case, \(f(\alpha_i)+f(\alpha_j)=\alpha_i^2+\alpha_j^2\text{.}\) Again this is determined by \(s_1\text{,}\) since \(\alpha_i^2+\alpha_j^2=(\alpha_i+\alpha_j)^2=s_1^2\text{.}\) So the system is consistent only when \(s_2=s_1^2\text{,}\) and the added rows give us no new information.
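The same dependence checks out by brute force: in characteristic \(2\text{,}\) squaring is additive, so the second syndrome component is always \(s_1^2\text{.}\) A sketch in plain Python (field elements as 4-bit integers multiplied modulo \(x^4+x+1\text{;}\) helper names ours):

```python
def gf_mul(a, b):
    # multiply in GF(2^4) = F_2[x]/(x^4 + x + 1); elements are 4-bit ints
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b10000:
            a ^= 0b10011
        b >>= 1
    return r

# squaring is additive in characteristic 2, so with f(a) = a^2 the second
# syndrome component carries nothing new: a_i^2 + a_j^2 = (a_i + a_j)^2 = s_1^2
for ai in range(16):
    for aj in range(16):
        assert gf_mul(ai, ai) ^ gf_mul(aj, aj) == gf_mul(ai ^ aj, ai ^ aj)
```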
Now we will show that the choice \(f(\alpha)=\alpha^3\) does produce an improvement, by working out a decoding algorithm for a received message with two or fewer errors.

6.

The easy case: what is the syndrome produced when a message with no errors is received, and how do we decode?
Solution.
If there are no errors, then \(\by\) is a codeword and so \(H\transpose{\by}=\bs=0\text{.}\) We output \(\by\) as the decoded codeword.

7.

One error: what is the syndrome produced when a message with one error is received? What nice relation is there between \(s_1\) and \(s_2\) in this case, and how does it let us decode?
Solution.
If there is one error in position \(i\text{,}\) then the syndrome is \(\bs=H^i=\begin{pmatrix} \alpha_i \\ \alpha_i^3\end{pmatrix}\text{.}\) So we have the relation \(s_1^3=s_2\text{.}\) We can decode by finding the unique column \(i\in\{1,2,\ldots,15\}\) such that \(\bs=H^i\) and flipping the bit in position \(i\text{.}\)
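To see that the relation \(s_2=s_1^3\) really does single out the one-error case, one can check by brute force that no pair of distinct error columns produces a syndrome satisfying it (indeed \(s_1^3+s_2=\alpha_i\alpha_j(\alpha_i+\alpha_j)\neq 0\)). A sketch in plain Python (field elements as 4-bit integers multiplied modulo \(x^4+x+1\text{;}\) helper names ours):

```python
def gf_mul(a, b):
    # multiply in GF(2^4) = F_2[x]/(x^4 + x + 1); elements are 4-bit ints
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b10000:
            a ^= 0b10011
        b >>= 1
    return r

def gf_cube(a):
    return gf_mul(gf_mul(a, a), a)

# one error in the column holding field element a gives syndrome (a, a^3),
# which satisfies s_2 = s_1^3; two distinct (nonzero) errors never do,
# since s_1^3 + s_2 = a_i * a_j * (a_i + a_j) != 0 -- so the test is decisive
for ai in range(1, 16):
    for aj in range(1, 16):
        if ai != aj:
            s1 = ai ^ aj
            s2 = gf_cube(ai) ^ gf_cube(aj)
            assert s2 != gf_cube(s1)
```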

8.

The hard case. If we receive a message with two errors, we need to solve the system
\begin{equation*} \begin{cases} s_1 \amp = \alpha_i+\alpha_j \\ s_2\amp = \alpha_i^3+\alpha_j^3 \text{.}\end{cases} \end{equation*}
You know \(s_1\neq 0\) (why?). So this is equivalent to solving
\begin{equation*} \frac{s_2}{s_1} = \frac{\alpha_i^3+\alpha_j^3}{\alpha_i+\alpha_j} = \alpha_i^2+\alpha_i\alpha_j+\alpha_j^2 \text{.} \end{equation*}
Show that all of this means that \(\alpha_i\) and \(\alpha_j\) are the roots of the quadratic equation
\begin{equation*} x^2+s_1x+\left( \frac{s_2}{s_1}+s_1^2 \right)=0\text{.} \end{equation*}
Solution.
We have
\begin{align*} \frac{s_2}{s_1} \amp= \frac{\alpha_i^3+\alpha_j^3}{\alpha_i+\alpha_j} \\ \amp = \alpha_i^2+\alpha_i\alpha_j+\alpha_j^2 \\ \amp = (\alpha_i+\alpha_j)^2+\alpha_i\alpha_j\\ \amp = s_1^2+\alpha_i\alpha_j\text{.} \end{align*}
So \(\alpha_i\alpha_j=\frac{s_2}{s_1}+s_1^2\text{.}\) Thus
\begin{equation*} (x+\alpha_i)(x+\alpha_j)=x^2+(\alpha_i+\alpha_j)x+(\alpha_i \alpha_j) = x^2+s_1x+\left( \frac{s_2}{s_1}+s_1^2 \right)\text{.} \end{equation*}
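At this size the quadratic can be solved by simply trying all field elements. Here is a sketch in plain Python (field elements as 4-bit integers multiplied modulo \(x^4+x+1\text{;}\) the function and variable names are ours, not from the worksheet) that forms a two-error syndrome and recovers the two error values as the roots.

```python
def gf_mul(a, b):
    # multiply in GF(2^4) = F_2[x]/(x^4 + x + 1); elements are 4-bit ints
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b10000:
            a ^= 0b10011
        b >>= 1
    return r

def gf_pow(a, e):
    r = 1
    for _ in range(e):
        r = gf_mul(r, a)
    return r

def error_values(s1, s2):
    """Roots of x^2 + s1*x + (s2/s1 + s1^2): the two field elements in error."""
    c = gf_mul(s2, gf_pow(s1, 14)) ^ gf_mul(s1, s1)  # a^14 = a^(-1) in GF(16)*
    return sorted(x for x in range(16)
                  if gf_mul(x, x) ^ gf_mul(s1, x) ^ c == 0)

# round-trip check: form the two-error syndrome, then recover the two elements
ai, aj = 3, 13
s1, s2 = ai ^ aj, gf_pow(ai, 3) ^ gf_pow(aj, 3)
assert error_values(s1, s2) == [3, 13]
```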

9.

Using the ordering \(1,\alpha,\alpha^2,\ldots\) of the nonzero elements of \(\GF(2^4)\text{,}\) where \(\alpha^4+\alpha+1=0\text{,}\) write our new \(H\) first as a \(2\times 15\) matrix over \(\GF(2^4)\) and then as an \(8\times 15\) binary matrix. See separate field table.
Solution.
This is your weekly practice assignment.

10.

Find the locations of any errors that occurred if the syndrome is \(\bs_1=\transpose{(0111\, 0110)}\) or \(\bs_2=\transpose{(0101\, 1111)}\text{.}\)
Solution.
This is your weekly practice assignment.

11.

What do our results so far say about the parameters of the code with parity check matrix \(H\text{?}\)
Solution.
It has length \(15\text{,}\) dimension at least \(15-8=7\) (the binary parity check matrix has \(8\) rows), and distance at least \(2(2)+1=5\text{.}\)
Table 63. Representation of \(\GF(2^4)\) as \(\F_2[\alpha]/(\alpha^4+\alpha+1)\)
Power of \(\alpha\) Field element Coordinates Zech’s log correspondence \(1+\alpha^i\)
\(\alpha^{-\infty}\) \(0\) \(0000\) \(1+\alpha^{0}\)
\(\alpha^{0}\) \(1 \) \(1000\) \(1+\alpha^{-\infty}\)
\(\alpha^{1}\) \(\alpha \) \(0100\) \(1+\alpha^{4}\)
\(\alpha^{2}\) \(\alpha^2 \) \(0010\) \(1+\alpha^{8}\)
\(\alpha^{3}\) \(\alpha^3 \) \(0001\) \(1+\alpha^{14}\)
\(\alpha^{4}\) \(1+\alpha \) \(1100\) \(1+\alpha^{1}\)
\(\alpha^{5}\) \(\alpha+\alpha^2 \) \(0110\) \(1+\alpha^{10}\)
\(\alpha^{6}\) \(\alpha^2+\alpha^3 \) \(0011\) \(1+\alpha^{13}\)
\(\alpha^{7}\) \(1+\alpha+\alpha^3 \) \(1101\) \(1+\alpha^{9}\)
\(\alpha^{8}\) \(1+\alpha^2\) \(1010\) \(1+\alpha^{2}\)
\(\alpha^{9}\) \(\alpha+\alpha^3 \) \(0101\) \(1+\alpha^{7}\)
\(\alpha^{10}\) \(1+\alpha+\alpha^2 \) \(1110\) \(1+\alpha^{5}\)
\(\alpha^{11}\) \(\alpha+\alpha^2+\alpha^3 \) \(0111\) \(1+\alpha^{12}\)
\(\alpha^{12}\) \(1+\alpha+\alpha^2+\alpha^3 \) \(1111\) \(1+\alpha^{11}\)
\(\alpha^{13}\) \(1+\alpha^2+\alpha^3 \) \(1011\) \(1+\alpha^{6}\)
\(\alpha^{14}\) \(1+\alpha^3 \) \(1001\) \(1+\alpha^{3}\)
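As a check on the table, the sketch below regenerates the powers of \(\alpha\) in plain Python, storing a field element as a 4-bit integer whose bit \(k\) holds the coefficient of \(\alpha^k\) (so the table's coordinate string \(1100\) becomes the integer \(0b0011\)); the variable names are ours.

```python
def gf_mul(a, b):
    # multiply in GF(2^4) = F_2[alpha]/(alpha^4 + alpha + 1); 4-bit ints
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b10000:
            a ^= 0b10011
        b >>= 1
    return r

ALPHA = 0b0010                      # the element alpha itself
pows = [1]                          # pows[i] = alpha^i
for _ in range(14):
    pows.append(gf_mul(pows[-1], ALPHA))

assert len(set(pows)) == 15         # alpha generates all of GF(16)*
assert pows[4] == 0b0011            # table row: alpha^4 = 1 + alpha
assert pows[7] == 0b1011            # table row: alpha^7 = 1 + alpha + alpha^3
assert pows[8] == 1 ^ pows[2]       # Zech column: 1 + alpha^2 = alpha^8
```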