Today we will introduce the Frobenius maps over \(\F_p\) and apply them to help show that fields of all sizes \(p^n\) for prime \(p\) exist and to modify the binary Hamming codes to produce a new class of \(2\)-error correcting codes.
Since \(p\) is prime, it divides \(p!\) but not \(n!\,(p-n)!\) for \(1\leq n \leq p-1\text{,}\) so \(\binom{p}{n}\) is always a multiple of \(p\) in that range. So when we expand \((\alpha\pm \beta)^p\) using the binomial theorem, all of the terms except for the first and last have coefficients that vanish in characteristic \(p\text{,}\) giving us the desired result.
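This "freshman's dream" identity is easy to spot-check numerically. The sketch below, with the prime \(p=7\) chosen arbitrarily for illustration, verifies both the divisibility of the middle binomial coefficients and the identity \((a+b)^p=a^p+b^p\) in \(\F_p\text{:}\)

```python
from math import comb

p = 7  # any prime works; 7 is an arbitrary choice for illustration

# The middle binomial coefficients C(p, n), 1 <= n <= p-1, are multiples of p:
assert all(comb(p, n) % p == 0 for n in range(1, p))

# Hence the "freshman's dream" holds in F_p: (a + b)^p = a^p + b^p (mod p).
for a in range(p):
    for b in range(p):
        assert pow(a + b, p, p) == (pow(a, p, p) + pow(b, p, p)) % p
```

Primality is essential here: for a composite exponent such as \(4\) the first check already fails, since \(\binom{4}{2}=6\) is not a multiple of \(4\text{.}\)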
is a field automorphism of \(F\text{,}\) i.e., it is a map from \(F\) to itself that preserves field addition and field multiplication and has an inverse function. Such a map \(f_m\) is called a Frobenius mapping.
By the previous problem, \(f_m\) preserves field addition. It also preserves field multiplication since \((\alpha\beta)^{p^m}=\alpha^{p^m}\beta^{p^m}\text{.}\) Finally, if \(|F|=p^n\) then \(f_{m}^{-1}=f_{n-m}\text{,}\) i.e., \(f_m^{-1}(\alpha)=f_{n-m}(\alpha)=\alpha^{p^{n-m}}\text{,}\) since \(f_m(f_{n-m}(\alpha))=\left(\alpha^{p^{n-m}}\right)^{p^m}=\alpha^{p^n}=\alpha\text{,}\) as every element of a field with \(p^n\) elements satisfies \(\alpha^{p^n}=\alpha\text{.}\)
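As a concrete sanity check, the sketch below works in \(\GF(2^4)\) built from the polynomial \(x^4+x+1\) (one convenient irreducible choice, matching the field used later in this section), representing elements as 4-bit integers. It verifies that \(f_1\) preserves addition and is inverted by \(f_{4-1}=f_3\text{:}\)

```python
MOD = 0b10011  # x^4 + x + 1, an irreducible choice defining GF(2^4)

def gf_mul(a, b):
    """Multiply two elements of GF(2^4), encoded as 4-bit integers."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0b10000:
            a ^= MOD
    return r

def gf_pow(a, n):
    r = 1
    for _ in range(n):
        r = gf_mul(r, a)
    return r

def frob(m, x):
    """The Frobenius map f_m(x) = x^(2^m) in GF(2^4)."""
    return gf_pow(x, 2 ** m)

# f_1 is inverted by f_3, since x^(2^4) = x for every x in GF(2^4):
assert all(frob(3, frob(1, x)) == x for x in range(16))
assert all(frob(1, frob(3, x)) == x for x in range(16))

# f_1 also preserves addition (XOR) in GF(2^4):
assert all(frob(1, x ^ y) == frob(1, x) ^ frob(1, y)
           for x in range(16) for y in range(16))
```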
where we are treating the columns of the original parity check matrix (vectors in \(\F_2^4\)) as elements of the field \(\GF(2^4)\) (and rearranging them for simplicity).
Show that when we regard the columns of \(H\) in this way, choosing \(f(\alpha)=c\alpha\) for some constant \(c\) does not improve decoding capability, by considering the syndrome of a word with two errors.
In this case the right-hand sides of the equations are dependent, since \(f(\alpha_i)+f(\alpha_j)=c\alpha_i+c\alpha_j=c(\alpha_i+\alpha_j)=cs_1\text{.}\) So \(s_2=cs_1\) always holds, and this means we do not get new information from the added rows.
In this case, \(f(\alpha_i)+f(\alpha_j)=\alpha_i^2+\alpha_j^2\text{.}\) Again, this is dependent on \(s_1\) since \(\alpha_i^2+\alpha_j^2=(\alpha_i+\alpha_j)^2=s_1^2\text{.}\) So \(s_2=s_1^2\) always holds, and again we do not get new information from the added rows.
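Both dependences, and the fact that \(f(\alpha)=\alpha^3\) escapes them, can be confirmed by brute force over all \(\binom{15}{2}=105\) two-error patterns. The sketch below represents \(\GF(2^4)\) by 4-bit integers modulo \(x^4+x+1\) (the representation is a choice made for the sketch; the conclusions do not depend on it):

```python
MOD = 0b10011  # x^4 + x + 1; GF(2^4) elements are 4-bit integers

def gf_mul(a, b):
    """Multiply in GF(2^4): carry-less product reduced by x^4 + x + 1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0b10000:
            a ^= MOD
    return r

def cube(x):
    return gf_mul(x, gf_mul(x, x))

nonzero = range(1, 16)
pairs = [(i, j) for i in nonzero for j in nonzero if i < j]

# Two errors in the columns labeled a_i, a_j give s1 = a_i + a_j and
# s2 = f(a_i) + f(a_j); addition in GF(2^4) is XOR.

# f(x) = c*x: s2 = c*s1 for every pair, so row two adds nothing.
for c in nonzero:
    assert all(gf_mul(c, ai) ^ gf_mul(c, aj) == gf_mul(c, ai ^ aj)
               for ai, aj in pairs)

# f(x) = x^2: s2 = s1^2 for every pair, so again nothing new.
assert all(gf_mul(ai, ai) ^ gf_mul(aj, aj) == gf_mul(ai ^ aj, ai ^ aj)
           for ai, aj in pairs)

# f(x) = x^3: the 105 two-error syndromes (s1, s2) are pairwise distinct,
# so the syndrome pins down which two columns were in error.
syndromes = {(ai ^ aj, cube(ai) ^ cube(aj)) for ai, aj in pairs}
assert len(syndromes) == len(pairs)
```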
Now we will show that the choice \(f(\alpha)=\alpha^3\) does produce an improvement by working out a decoding algorithm for a received message with two or fewer errors.
One error: what is the syndrome produced when a message with one error is received? What nice relation is there between \(s_1\) and \(s_2\) in this case, and how does it let us decode?
If there is one error in position \(i\text{,}\) then the syndrome is \(\bs=H^i=\begin{pmatrix} \alpha_i \\ \alpha_i^3\end{pmatrix}\text{.}\) So we have the relation \(s_1^3=s_2\text{.}\) We can decode by finding the unique column \(i\in\{1,2,\ldots,15\}\) such that \(\bs=H^i\) and flipping the bit in position \(i\text{.}\)
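A quick executable version of this one-error rule, again representing \(\GF(2^4)\) by 4-bit integers modulo \(x^4+x+1\text{.}\) For the sketch, column \(i\) is simply labeled by the field element with integer code \(i\) (a labeling chosen for convenience here, not the ordering by powers of \(\alpha\)):

```python
MOD = 0b10011  # x^4 + x + 1; GF(2^4) elements are 4-bit integers

def gf_mul(a, b):
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0b10000:
            a ^= MOD
    return r

def cube(x):
    return gf_mul(x, gf_mul(x, x))

# Label column i (1 <= i <= 15) by the field element with integer code i;
# column i of H is then the pair (a_i, a_i^3).
cols = {i: (i, cube(i)) for i in range(1, 16)}

def decode_one_error(s1, s2):
    """Return the error position for a syndrome caused by a single error."""
    assert s2 == cube(s1), "a single error always satisfies s2 = s1^3"
    # find the unique column equal to the syndrome; flip the bit there
    return next(i for i, col in cols.items() if col == (s1, s2))

# Check: an error in position i yields syndrome H^i = (a_i, a_i^3),
# and the rule recovers i from it.
for i in range(1, 16):
    s1, s2 = cols[i]
    assert decode_one_error(s1, s2) == i
```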
Using the ordering \(1,\alpha,\alpha^2,\ldots\) of the nonzero elements of \(\GF(2^4)\text{,}\) where \(\alpha^4+\alpha+1=0\text{,}\) write our new \(H\) first as a \(2\times 15\) matrix over \(\GF(2^4)\) and then as an \(8\times 15\) binary matrix. See separate field table.
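A sketch of this computation: the block below builds the \(2\times 15\) matrix over \(\GF(2^4)\) with columns ordered \(1,\alpha,\ldots,\alpha^{14}\text{,}\) then expands each field entry into its four bits to obtain the binary form (elements are encoded as 4-bit integers with \(\alpha^4=\alpha+1\text{,}\) matching the field table):

```python
MOD = 0b10011  # encodes a^4 + a + 1 = 0, as in the problem statement

def gf_mul(x, y):
    r = 0
    while y:
        if y & 1:
            r ^= x
        y >>= 1
        x <<= 1
        if x & 0b10000:
            x ^= MOD
    return r

# Nonzero elements ordered as 1, a, a^2, ..., a^14 (a is encoded as 0b0010):
powers = [1]
for _ in range(14):
    powers.append(gf_mul(powers[-1], 0b0010))

def cube(x):
    return gf_mul(x, gf_mul(x, x))

H_field = [powers, [cube(x) for x in powers]]  # 2 x 15 over GF(2^4)

def bits(x):
    """Expand a field element into its 4-bit column (high bit first)."""
    return [(x >> k) & 1 for k in (3, 2, 1, 0)]

# Each GF(2^4) entry expands to 4 binary rows, so the two field rows
# become an 8 x 15 binary matrix:
H_bin = [[bits(entry)[k] for entry in row] for row in H_field for k in range(4)]

for row in H_bin:
    print(row)
```

Since \(\alpha\) is primitive for this modulus, the fifteen column labels really are all the nonzero field elements, each appearing exactly once.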