1. Doubly-Extended GRS Codes.
Let \(C\) be a linear \([n,k=n-r,d]\) code over \(F\) defined by a parity check matrix
\begin{equation*}
H=\begin{bmatrix}
1 \amp 1 \amp \ldots \amp 1 \amp 0 \\
\alpha_1 \amp \alpha_2 \amp \ldots \amp \alpha_{n-1} \amp 0 \\
\vdots \amp \vdots \amp \vdots \amp \vdots \amp \vdots \\
\alpha_1^{r-2} \amp \alpha_2^{r-2} \amp \ldots \amp \alpha_{n-1}^{r-2} \amp 0 \\
\alpha_1^{r-1} \amp \alpha_2^{r-1} \amp \ldots \amp \alpha_{n-1}^{r-1} \amp 1
\end{bmatrix}\begin{bmatrix}
v_1 \amp 0 \amp \ldots \amp 0 \\
0 \amp v_2 \amp \ldots \amp 0 \\
\vdots \amp \vdots \amp \ddots \amp \vdots \\
0 \amp 0 \amp\ldots \amp v_{n}
\end{bmatrix}
\end{equation*}
where \(\alpha_1,\alpha_2,\dots,\alpha_{n-1}\) are distinct elements of \(F\) and \(v_1,v_2,\dots,v_n\) are nonzero elements of \(F\text{.}\) The code \(C\) is called a doubly-extended GRS code, and the last column in \(H\) is said to correspond to the code locator \(\infty\text{,}\) i.e., the “infinity” element or “point at infinity”.
(a)
Show that \(C\) is MDS.
Solution.
To show that \(C\) is an MDS code, we must show that its minimum distance is \(d = n - k + 1 = r + 1\text{.}\) By the Singleton bound, we know \(d \le r + 1\text{.}\) It therefore suffices to show that any \(r\) columns of the parity check matrix \(H\) are linearly independent.
Let us factor \(H = H' V\text{,}\) where \(V\) is the \(n \times n\) diagonal matrix with nonzero diagonal entries \(v_1, \dots, v_n\text{,}\) and \(H'\) is the \(r \times n\) matrix:
\begin{equation*}
H' = \begin{bmatrix}
1 \amp 1 \amp \ldots \amp 1 \amp 0 \\
\alpha_1 \amp \alpha_2 \amp \ldots \amp \alpha_{n-1} \amp 0 \\
\vdots \amp \vdots \amp \vdots \amp \vdots \amp \vdots \\
\alpha_1^{r-2} \amp \alpha_2^{r-2} \amp \ldots \amp \alpha_{n-1}^{r-2} \amp 0 \\
\alpha_1^{r-1} \amp \alpha_2^{r-1} \amp \ldots \amp \alpha_{n-1}^{r-1} \amp 1
\end{bmatrix}
\end{equation*}
Because right-multiplying by a non-singular diagonal matrix \(V\) merely scales the columns by nonzero constants, it preserves linear independence. Thus, we only need to show that any \(r\) columns of \(H'\) are linearly independent.
Case 1: We select \(r\) columns from the first \(n-1\) columns. These columns form a square \(r \times r\) Vandermonde matrix on the \(r\) distinct elements \(\alpha_{i_1}, \dots, \alpha_{i_r}\text{.}\) Because the \(\alpha_i\) are distinct, its determinant is nonzero, so the columns are linearly independent.
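This Vandermonde fact is easy to check numerically. The sketch below uses hypothetical values over the prime field \(GF(11)\) (the elements and the `det_mod_p` helper are ours, not from the text): it computes the determinant by Gaussian elimination and confirms it matches the product formula \(\prod_{i \lt j}(\alpha_j - \alpha_i)\text{,}\) which is nonzero exactly when the \(\alpha\)'s are distinct.

```python
# Hypothetical sanity check over GF(11): the determinant of a square
# Vandermonde matrix on distinct elements is nonzero.
p = 11
alphas = [1, 3, 4, 5]  # distinct elements of GF(11), chosen arbitrarily

def det_mod_p(M, p):
    """Determinant over GF(p) by Gaussian elimination."""
    M = [row[:] for row in M]
    size, det = len(M), 1
    for c in range(size):
        pivot = next((r for r in range(c, size) if M[r][c] % p != 0), None)
        if pivot is None:
            return 0  # singular
        if pivot != c:
            M[c], M[pivot] = M[pivot], M[c]
            det = -det % p  # a row swap flips the sign
        det = det * M[c][c] % p
        inv = pow(M[c][c], p - 2, p)  # Fermat inverse of the pivot
        for r in range(c + 1, size):
            f = M[r][c] * inv % p
            M[r] = [(M[r][j] - f * M[c][j]) % p for j in range(size)]
    return det % p

# V[i][j] = alphas[j]**i, the Vandermonde matrix of Case 1
V = [[pow(a, i, p) for a in alphas] for i in range(len(alphas))]

# Product formula: det V = prod_{i<j} (alphas[j] - alphas[i])
prod = 1
for i in range(len(alphas)):
    for j in range(i + 1, len(alphas)):
        prod = prod * (alphas[j] - alphas[i]) % p

print(det_mod_p(V, p) == prod != 0)  # True
```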
Case 2: We select the last column (column \(n\)) and \(r-1\) columns from the first \(n-1\) columns. Let these correspond to locators \(\alpha_{i_1}, \dots, \alpha_{i_{r-1}}\text{.}\) The resulting \(r \times r\) submatrix \(M\) is:
\begin{equation*}
M = \begin{bmatrix}
1 \amp \ldots \amp 1 \amp 0 \\
\alpha_{i_1} \amp \ldots \amp \alpha_{i_{r-1}} \amp 0 \\
\vdots \amp \ddots \amp \vdots \amp \vdots \\
\alpha_{i_1}^{r-2} \amp \ldots \amp \alpha_{i_{r-1}}^{r-2} \amp 0 \\
\alpha_{i_1}^{r-1} \amp \ldots \amp \alpha_{i_{r-1}}^{r-1} \amp 1
\end{bmatrix}
\end{equation*}
Expanding the determinant along the last column gives \(\det(M) = 1 \cdot \det(V_{r-1})\text{,}\) where \(V_{r-1}\) is the \((r-1) \times (r-1)\) Vandermonde matrix on the distinct elements \(\alpha_{i_1}, \dots, \alpha_{i_{r-1}}\text{.}\) Since the elements are distinct, \(\det(M) \neq 0\text{,}\) so these columns are also linearly independent.
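The cofactor expansion can be verified on a tiny instance. The example below is a hypothetical \(3 \times 3\) case over \(GF(11)\) with \(r = 3\) (the locator values are arbitrary choices): appending the column \((0, 0, 1)^\top\) leaves the determinant equal to that of the \(2 \times 2\) Vandermonde block.

```python
# Hypothetical check over GF(11) of the Case 2 expansion: for
# M = [[1,1,0],[a1,a2,0],[a1^2,a2^2,1]], expanding along the last
# column gives det(M) = det of the 2x2 Vandermonde block = a2 - a1.
p = 11
a1, a2 = 3, 5  # distinct locators, chosen arbitrarily

M = [[1, 1, 0],
     [a1, a2, 0],
     [a1 * a1 % p, a2 * a2 % p, 1]]

# determinant of the top-left 2x2 Vandermonde block
vand2 = (M[0][0] * M[1][1] - M[0][1] * M[1][0]) % p

# full 3x3 determinant by cofactor expansion along the first row
full = (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
        - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
        + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0])) % p

print(full == vand2 == (a2 - a1) % p)  # True
```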
Because any \(r\) columns of \(H\) are linearly independent, the code \(C\) has minimum distance at least \(r+1 = n-k+1\text{.}\) Thus, \(C\) is MDS.
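As a brute-force confirmation of part (a), the sketch below builds \(H\) for a small hypothetical doubly-extended GRS code over \(GF(11)\) with \(n = 6\) and \(r = 3\) (the locators and multipliers are arbitrary choices, not from the text) and checks that every set of \(r\) columns has nonzero determinant.

```python
from itertools import combinations

p = 11
alphas = [1, 2, 3, 5, 8]   # n-1 = 5 distinct locators (arbitrary choices)
v = [1, 4, 2, 9, 3, 7]     # nonzero column multipliers v_1..v_n
n, r = 6, 3

# H: rows are powers 0..r-1 of the locators, last column is v_n * e_r,
# and column l is scaled by v_l
H = [[pow(a, i, p) * v[l] % p for l, a in enumerate(alphas)] +
     [v[n - 1] if i == r - 1 else 0]
     for i in range(r)]

def det3(M):
    """Determinant of a 3 x 3 matrix over GF(p), by cofactor expansion."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return (a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)) % p

# every 3-subset of the 6 columns must give an invertible 3 x 3 submatrix
ok = all(det3([[H[i][j] for j in cols] for i in range(r)]) != 0
         for cols in combinations(range(n), r))
print(ok)  # True: any r columns are independent, so d >= r + 1 and C is MDS
```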
(b)
Assuming that \(k\lt n\text{,}\) show that the dual code \(C^{\perp}\) is an \([n,n-k]\) doubly-extended GRS code that can be defined through the same code locators as \(C\text{.}\)
Solution.
The dual code \(C^\perp\) has dimension \(r = n-k\text{,}\) and the parity check matrix \(H\) of \(C\) is a generator matrix for \(C^\perp\text{.}\) We want to show that \(C^\perp\) is a doubly-extended GRS code, i.e., that it has a parity check matrix \(H^\perp\) of the same form as \(H\text{,}\) but of dimension \(k \times n\text{,}\) using the identical locators \(\alpha_1, \dots, \alpha_{n-1}, \infty\) and some set of nonzero column multipliers \(w_1, \dots, w_n\text{.}\)
Let \(H^\perp\) be defined as:
\begin{equation*}
H^\perp = \begin{bmatrix}
1 \amp 1 \amp \ldots \amp 1 \amp 0 \\
\alpha_1 \amp \alpha_2 \amp \ldots \amp \alpha_{n-1} \amp 0 \\
\vdots \amp \vdots \amp \vdots \amp \vdots \amp \vdots \\
\alpha_1^{k-2} \amp \alpha_2^{k-2} \amp \ldots \amp \alpha_{n-1}^{k-2} \amp 0 \\
\alpha_1^{k-1} \amp \alpha_2^{k-1} \amp \ldots \amp \alpha_{n-1}^{k-1} \amp 1
\end{bmatrix} \begin{bmatrix}
w_1 \amp 0 \amp \ldots \amp 0 \\
0 \amp w_2 \amp \ldots \amp 0 \\
\vdots \amp \vdots \amp \ddots \amp \vdots \\
0 \amp 0 \amp \ldots \amp w_n
\end{bmatrix}
\end{equation*}
For \(H^\perp\) to be a valid parity check matrix of \(C^\perp\text{,}\) we require \(H^\perp H^\top = 0\text{.}\) The \((i, j)\)-th entry of \(H^\perp H^\top\) for \(0 \leq i \leq k-1\) and \(0 \leq j \leq r-1\) is given by the dot product of the \(i\)-th row of \(H^\perp\) and the \(j\)-th row of \(H\text{.}\)
By the result we proved for generalized Reed-Solomon codes on \(n-1\) points, there exist nonzero multipliers \(u_1, \dots, u_{n-1}\) such that \(\sum_{l=1}^{n-1} u_lv_l \alpha_l^m = 0\) for all \(0 \leq m \leq n-3\text{.}\) Let us select \(w_l = u_l\) for \(1 \leq l \leq n-1\text{.}\)
If \((i, j) \neq (k-1, r-1)\text{,}\) then the exponent of \(\alpha\) in the dot product satisfies \(i + j \leq (k - 1) + (r - 1) - 1 = n - 3\text{,}\) and the contribution from the last column is zero, since the last entries of the two rows cannot both be nonzero. The dot product evaluates to:
\begin{equation*}
\sum_{l=1}^{n-1} (w_l v_l) \alpha_l^{i+j} + (\text{last column product}) = \sum_{l=1}^{n-1} u_lv_l \alpha_l^{i+j} + 0 = 0.
\end{equation*}
For the single remaining case where \(i = k-1\) and \(j = r-1\text{,}\) the dot product is:
\begin{equation*}
\sum_{l=1}^{n-1} (w_l v_l) \alpha_l^{(k-1)+(r-1)} + w_n v_n (1)(1) = \sum_{l=1}^{n-1} u_lv_l \alpha_l^{n-2} + w_n v_n.
\end{equation*}
Setting this equal to zero forces \(w_n = -v_n^{-1} \sum_{l=1}^{n-1} u_lv_l \alpha_l^{n-2}\text{.}\) It remains to show that the sum \(\sum_{l=1}^{n-1} u_lv_l \alpha_l^{n-2}\) is nonzero, so that the final multiplier \(w_n\) is nonzero as required.
Suppose to the contrary that this sum is zero. Together with the relations \(\sum_{l=1}^{n-1} u_lv_l \alpha_l^m = 0\) for \(0 \leq m \leq n-3\text{,}\) this would give
\begin{equation*}
\begin{bmatrix}
1 \amp 1 \amp \ldots \amp 1 \\
\alpha_1 \amp \alpha_2 \amp \ldots \amp \alpha_{n-1} \\
\vdots \amp \vdots \amp \ddots \amp \vdots \\
\alpha_1^{n-2} \amp \alpha_2^{n-2} \amp \ldots \amp \alpha_{n-1}^{n-2} \\
\end{bmatrix} \begin{bmatrix}
v_1 \amp 0 \amp \ldots \amp 0 \\
0 \amp v_2 \amp \ldots \amp 0 \\
\vdots \amp \vdots \amp \ddots \amp \vdots \\
0 \amp 0 \amp \ldots \amp v_{n-1}
\end{bmatrix}\begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_{n-1}
\end{bmatrix}=\begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}
\end{equation*}
But the two matrices on the left are a Vandermonde matrix on distinct elements and a diagonal matrix with nonzero diagonal entries, so both are invertible. Therefore the vector \(\vec{u}\) is the zero vector, which contradicts the fact that each \(u_l\) is nonzero. So \(w_n\) is also nonzero.
Since all \(w_l\) are nonzero, \(H^\perp\) is a valid parity check matrix of the doubly-extended GRS form, with the same code locators as \(C\text{.}\) Therefore, \(C^\perp\) is an \([n, n-k]\) doubly-extended GRS code.
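The construction of part (b) can be checked end to end on a small hypothetical example. The sketch below works over \(GF(11)\) with \(n = 6\text{,}\) \(r = k = 3\) (locators and multipliers are arbitrary choices, not from the text). It realizes the multipliers \(u_l\) via the standard Lagrange-interpolation identity \(\sum_l \alpha_l^m / \prod_{j \neq l}(\alpha_l - \alpha_j) = 0\) for \(0 \leq m \leq n-3\) (one concrete way to produce them; the text only asserts their existence), sets \(w_n\) by the formula derived in the solution, and confirms \(H^\perp H^\top = 0\text{.}\)

```python
p = 11
alphas = [1, 2, 3, 5, 8]   # n-1 = 5 distinct locators (arbitrary choices)
v = [1, 4, 2, 9, 3, 7]     # nonzero multipliers for C
n, r = 6, 3
k = n - r

def ext_grs_matrix(rows, mults):
    """rows x n matrix: powers 0..rows-1 of the locators, last column
    is e_rows, with every column l scaled by mults[l]."""
    base = [[pow(a, i, p) for a in alphas] + [1 if i == rows - 1 else 0]
            for i in range(rows)]
    return [[base[i][l] * mults[l] % p for l in range(n)] for i in range(rows)]

# Lagrange-style multipliers: u_l v_l = 1 / prod_{j != l}(a_l - a_j), so that
# sum_l u_l v_l a_l^m = 0 for 0 <= m <= n - 3.
u = []
for l, a in enumerate(alphas):
    d = 1
    for j, b in enumerate(alphas):
        if j != l:
            d = d * (a - b) % p
    u.append(pow(d * v[l] % p, p - 2, p))  # Fermat inverse in GF(11)

# w_l = u_l for l <= n-1, and w_n = -v_n^{-1} * sum u_l v_l a_l^{n-2}
s = sum(u[l] * v[l] * pow(alphas[l], n - 2, p) for l in range(n - 1)) % p
w = u + [(-pow(v[n - 1], p - 2, p) * s) % p]

H = ext_grs_matrix(r, v)        # parity check matrix of C
H_perp = ext_grs_matrix(k, w)   # candidate parity check matrix of C-perp

product = [[sum(H_perp[i][l] * H[j][l] for l in range(n)) % p
            for j in range(r)] for i in range(k)]
print(all(x == 0 for row in product for x in row),  # True: H_perp H^T = 0
      all(wl % p != 0 for wl in w))                 # True: all w_l nonzero
```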
