Serlo: EN: Vector space structure on matrices
{{#invoke:Mathe für Nicht-Freaks/Seite|oben}}
Derivation
Let <math>K</math> be a field and let <math>V</math> and <math>W</math> be an <math>n</math>-dimensional and an <math>m</math>-dimensional <math>K</math>-vector space, respectively. We have already seen that, after choosing ordered bases, we can represent linear maps from <math>V</math> to <math>W</math> as matrices. So let <math>B = (b_1, \dots, b_n)</math> be an ordered basis of <math>V</math> and <math>C = (c_1, \dots, c_m)</math> be an ordered basis of <math>W</math>.
The space <math>\operatorname{Hom}(V,W)</math> of linear maps from <math>V</math> to <math>W</math> is also a <math>K</math>-vector space. The representing matrix of a linear map <math>f\colon V\to W</math> with respect to the bases <math>B</math> and <math>C</math> is an <math>m\times n</math>-matrix <math>M^B_C(f)</math>. We now try to transfer the vector space structure of <math>\operatorname{Hom}(V,W)</math> to the space <math>K^{m\times n}</math> of <math>m\times n</math>-matrices over <math>K</math>.
So we ask the question: Can we find an addition <math>\oplus</math> and a scalar multiplication <math>\odot</math> on <math>K^{m\times n}</math> such that <math>M^B_C(f+g) = M^B_C(f) \oplus M^B_C(g)</math> and <math>M^B_C(\lambda \cdot f) = \lambda \odot M^B_C(f)</math> for all linear maps <math>f,g\colon V\to W</math> and all <math>\lambda \in K</math>?
Is there perhaps even a vector space structure on <math>K^{m\times n}</math> such that for all finite-dimensional vector spaces <math>V</math> and <math>W</math> and all ordered bases <math>B</math> of <math>V</math> and <math>C</math> of <math>W</math>, the mapping <math>M^B_C\colon \operatorname{Hom}(V,W) \to K^{m\times n}</math> is linear?
It is best to think about these questions yourself. There is an exercise for matrix addition and one for scalar multiplication that can help you with this.
A first step towards answering this question is the following theorem: Mathe für Nicht-Freaks: Vorlage:Satz
Vorlage:Anker We would now like to determine the vector space structure of <math>K^{m\times n}</math> explicitly. Let <math>B = (b_1,\dots,b_n)</math> be a basis of <math>V</math>, and <math>C = (c_1,\dots,c_m)</math> a basis of <math>W</math>. We define the addition <math>\oplus</math> induced by <math>M^B_C</math> on the space of matrices as in the last theorem: <math>A \oplus \tilde{A} := M^B_C\big((M^B_C)^{-1}(A) + (M^B_C)^{-1}(\tilde{A})\big)</math>. Now let <math>A = (a_{ij}),\ \tilde{A} = (\tilde{a}_{ij}) \in K^{m\times n}</math> be arbitrary and let <math>f,g\colon V\to W</math> be the linear maps associated with <math>A</math> and with <math>\tilde{A}</math>, that is, <math>M^B_C(f) = A</math> and <math>M^B_C(g) = \tilde{A}</math>. Then
:<math>A \oplus \tilde{A} = M^B_C(f) \oplus M^B_C(g) = M^B_C(f+g).</math>
We now calculate this matrix <math>M^B_C(f+g)</math>: in its <math>j</math>-th column, the coordinates of <math>(f+g)(b_j)</math> with respect to <math>C</math> must appear. However, by definition of <math>f+g</math> and of <math>M^B_C</math>, we have
:<math>(f+g)(b_j) = f(b_j) + g(b_j) = \sum_{i=1}^m a_{ij}\, c_i + \sum_{i=1}^m \tilde{a}_{ij}\, c_i = \sum_{i=1}^m \left(a_{ij} + \tilde{a}_{ij}\right) c_i.</math>
Since the representation of <math>(f+g)(b_j)</math> with respect to <math>C</math> is unique, it follows that <math>\left(A \oplus \tilde{A}\right)_{ij} = a_{ij} + \tilde{a}_{ij}</math>. That is, the addition induced from <math>\operatorname{Hom}(V,W)</math> by <math>M^B_C</math> on <math>K^{m\times n}</math> is the component-wise addition.
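For example, over <math>\mathbb{R}</math> this component-wise rule gives
:<math>\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \oplus \begin{pmatrix} 5 & 0 \\ -1 & 2 \end{pmatrix} = \begin{pmatrix} 1+5 & 2+0 \\ 3+(-1) & 4+2 \end{pmatrix} = \begin{pmatrix} 6 & 2 \\ 2 & 6 \end{pmatrix}.</math>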
Let us now examine the scalar multiplication <math>\odot</math> induced by <math>M^B_C</math>. Let again <math>A = (a_{ij}) \in K^{m\times n}</math> with associated linear map <math>f</math>, let <math>\lambda \in K</math>, and consider <math>\lambda \odot A</math>. We have that
:<math>\lambda \odot A = \lambda \odot M^B_C(f) = M^B_C(\lambda \cdot f).</math>
Furthermore we have
:<math>(\lambda \cdot f)(b_j) = \lambda \cdot f(b_j) = \lambda \cdot \sum_{i=1}^m a_{ij}\, c_i.</math>
Since the scalar multiplication in <math>W</math> distributes over this sum, we obtain
:<math>(\lambda \cdot f)(b_j) = \sum_{i=1}^m (\lambda \cdot a_{ij})\, c_i.</math>
Thus, from the uniqueness of the representation with respect to <math>C</math> it follows that <math>(\lambda \odot A)_{ij} = \lambda \cdot a_{ij}</math>. We see: the scalar multiplication induced from <math>\operatorname{Hom}(V,W)</math> by <math>M^B_C</math> on <math>K^{m\times n}</math> is the component-wise scalar multiplication.
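Again as a concrete illustration, over <math>\mathbb{R}</math> with <math>\lambda = 3</math> we get
:<math>3 \odot \begin{pmatrix} 1 & 2 \\ 0 & -4 \end{pmatrix} = \begin{pmatrix} 3\cdot 1 & 3\cdot 2 \\ 3\cdot 0 & 3\cdot(-4) \end{pmatrix} = \begin{pmatrix} 3 & 6 \\ 0 & -12 \end{pmatrix}.</math>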
We also see here that the induced vector space structure is independent of our choice of the bases <math>B</math> and <math>C</math>.
Definition
We have just seen: To define a meaningful vector space structure on the matrices, we need to perform the operations component-wise. So we define addition and scalar multiplication as follows:
Mathe für Nicht-Freaks: Vorlage:Definition
Written out explicitly in terms of matrices, this definition looks as follows:
:<math>\begin{pmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{pmatrix} + \begin{pmatrix} b_{11} & \cdots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{m1} & \cdots & b_{mn} \end{pmatrix} = \begin{pmatrix} a_{11}+b_{11} & \cdots & a_{1n}+b_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1}+b_{m1} & \cdots & a_{mn}+b_{mn} \end{pmatrix}</math>
Mathe für Nicht-Freaks: Vorlage:Definition
Written out explicitly in terms of matrices, this definition looks as follows:
:<math>\lambda \cdot \begin{pmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{pmatrix} = \begin{pmatrix} \lambda\cdot a_{11} & \cdots & \lambda\cdot a_{1n} \\ \vdots & \ddots & \vdots \\ \lambda\cdot a_{m1} & \cdots & \lambda\cdot a_{mn} \end{pmatrix}</math>
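Both operations together let us form linear combinations of matrices component-wise. For example, over <math>\mathbb{R}</math> we have
:<math>2 \cdot \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix} + (-1) \cdot \begin{pmatrix} 3 & 1 \\ 0 & 2 \end{pmatrix} = \begin{pmatrix} 2 & 0 \\ 4 & 2 \end{pmatrix} + \begin{pmatrix} -3 & -1 \\ 0 & -2 \end{pmatrix} = \begin{pmatrix} -1 & -1 \\ 4 & 0 \end{pmatrix}.</math>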
Mathe für Nicht-Freaks: Vorlage:Beispiel Mathe für Nicht-Freaks: Vorlage:Beispiel
Mathe für Nicht-Freaks: Vorlage:Satz
If we consider matrices just as tables of numbers (without viewing them as representing matrices of linear maps), we see the following: matrices are nothing more than a special way of writing down elements of <math>K^{m\cdot n}</math>, since a matrix in <math>K^{m\times n}</math> has exactly <math>m\cdot n</math> entries. Just as in <math>K^{m\cdot n}</math>, the vector space structure on matrices is defined component-wise. So we alternatively obtain the following significantly shorter proof:
Mathe für Nicht-Freaks: Vorlage:Alternativer Beweis
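To illustrate the identification of <math>K^{m\times n}</math> with <math>K^{m\cdot n}</math> used here: reading off the entries row by row (any other fixed ordering of the <math>m\cdot n</math> entries would work just as well), a <math>2\times 3</math>-matrix corresponds to a vector with <math>6</math> entries,
:<math>\begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \end{pmatrix} \in K^{2\times 3} \quad\longleftrightarrow\quad (a_{11}, a_{12}, a_{13}, a_{21}, a_{22}, a_{23}) \in K^{6}.</math>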
Dimension of <math>K^{m\times n}</math>
By the above identification of <math>K^{m\times n}</math> with <math>K^{m\cdot n}</math> we obtain a canonical basis of <math>K^{m\times n}</math>: for <math>1\le i\le m</math> and <math>1\le j\le n</math>, let <math>E_{ij}</math> be the matrix whose entry in row <math>i</math> and column <math>j</math> equals <math>1</math> and all of whose other entries equal <math>0</math>.
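For example, in <math>K^{2\times 2}</math> every matrix can be written as a linear combination of these basis matrices:
:<math>\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} = a_{11}\cdot\begin{pmatrix}1&0\\0&0\end{pmatrix} + a_{12}\cdot\begin{pmatrix}0&1\\0&0\end{pmatrix} + a_{21}\cdot\begin{pmatrix}0&0\\1&0\end{pmatrix} + a_{22}\cdot\begin{pmatrix}0&0\\0&1\end{pmatrix} = a_{11} E_{11} + a_{12} E_{12} + a_{21} E_{21} + a_{22} E_{22}.</math>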
Mathe für Nicht-Freaks: Vorlage:Beispiel
Thus, <math>K^{m\times n}</math> is an <math>(m\cdot n)</math>-dimensional <math>K</math>-vector space. We constructed the vector space structure on <math>K^{m\times n}</math> such that for <math>n</math>- and <math>m</math>-dimensional vector spaces <math>V</math> and <math>W</math> with bases <math>B</math> and <math>C</math>, respectively, the map
:<math>M^B_C\colon \operatorname{Hom}(V,W) \to K^{m\times n},\quad f \mapsto M^B_C(f)</math>
is a linear isomorphism. So <math>\operatorname{Hom}(V,W)</math> is also an <math>(m\cdot n)</math>-dimensional <math>K</math>-vector space. This result can also be found in the article on the vector space of linear maps.
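For example, for <math>V = \mathbb{R}^3</math> and <math>W = \mathbb{R}^2</math> we obtain <math>\dim_{\mathbb{R}} \operatorname{Hom}(\mathbb{R}^3,\mathbb{R}^2) = \dim_{\mathbb{R}} \mathbb{R}^{2\times 3} = 2\cdot 3 = 6</math>, and the six matrices <math>E_{11}, E_{12}, E_{13}, E_{21}, E_{22}, E_{23}</math> form a basis of <math>\mathbb{R}^{2\times 3}</math>.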
{{#invoke:Mathe für Nicht-Freaks/Seite|unten}}