
I'm not too familiar with random matrix theory, so I cannot find a suitable reference for this question.

Consider a set of matrices $\{A_i\}_{i=1}^k \subseteq M_{d\times d}$ over the complex field and some random variables $\{c_i\}_{i=1}^k$, each with a (complex) Gaussian distribution.

The commutant of a set of matrices $\{X_i\}$ is defined as the set

$$ \operatorname{comm} (\{X_i\}) := \left\{ C \in M_{d \times d} \mid \left[ C,\, X_i \right] = 0, \forall i \right\} $$

Is there some reasonable definition of probability over matrices such that the following statement holds: "Consider the random matrix $A = \sum_{i=1}^k c_i A_i$; then $\operatorname{comm}(A) = \operatorname{comm}(\{A_i\})$ with high probability"? Intuition and some examples tell me that this is the case, but I have not performed a large numerical study.

Is the above statement true in some reasonable setting of random matrices? Can you please provide a proof or a reference?

I would also expect an argument related to the zeroes of a polynomial to support this statement, but I have not been able to figure it out.
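For what it's worth, a minimal numerical experiment along these lines could compare the dimension of $\operatorname{comm}(\{A_i\})$ with that of $\operatorname{comm}(A)$, using the column-stacking identity $\operatorname{vec}(XC - CX) = (I \otimes X - X^\top \otimes I)\operatorname{vec}(C)$; the matrix size, family size, and seed below are arbitrary choices for illustration:

```python
import numpy as np

def commutant_dim(mats, d, tol=1e-8):
    # Dimension of {C : XC - CX = 0 for all X in mats}, computed as the
    # nullity of the stacked matrix (I ⊗ X - X^T ⊗ I), one block per X,
    # using the column-stacking identity vec(XC - CX) = (I⊗X - X^T⊗I) vec(C).
    M = np.vstack([np.kron(np.eye(d), X) - np.kron(X.T, np.eye(d)) for X in mats])
    return d * d - np.linalg.matrix_rank(M, tol=tol)

rng = np.random.default_rng(0)
d, k = 4, 3

# A (here randomly chosen) family {A_i} and a complex-Gaussian combination A.
A_list = [rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d)) for _ in range(k)]
c = rng.normal(size=k) + 1j * rng.normal(size=k)
A = sum(ci * Ai for ci, Ai in zip(c, A_list))

# comm({A_i}) ⊆ comm(A) always holds, so equality holds iff the dimensions agree.
print(commutant_dim(A_list, d), commutant_dim([A], d))
```

Since a generic family generates the full matrix algebra (commutant dimension $1$) while a generic single matrix has distinct eigenvalues (commutant dimension $d$), this comparison already probes the statement.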

asked Oct 24 at 17:23 by Another User
  • Is $\operatorname{comm} (A) := \operatorname{comm} (\{A\})$? – Commented Oct 25 at 13:19
  • Yes, it is the singleton set. – Commented Oct 28 at 14:10

1 Answer


This is false. A generic matrix $A$ has the property that $\text{comm}(A)$ is exactly the subalgebra $\text{span}(1, A, A^2, \dots)$ generated by $A$; a sufficient condition is that $A$ has distinct eigenvalues, in which case $\text{comm}(A)$ can also be characterized as the set of matrices with the same eigenvectors as $A$.

So you can see there will be an issue if the $A_i$ don't commute. Explicitly, we can take, for example,

$$A_1 = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}, A_2 = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}.$$

$\text{comm}(A_1)$ is the subalgebra of matrices of the form $\begin{bmatrix} a & 0 \\ 0 & b \end{bmatrix}$ while $\text{comm}(A_2)$ is the subalgebra of matrices of the form $\begin{bmatrix} a & b \\ b & a \end{bmatrix}$. Their intersection $\text{comm}(\{A_1, A_2\})$ is then the subalgebra $\begin{bmatrix} a & 0 \\ 0 & a \end{bmatrix}$ generated by the identity. But $\text{comm}(c_1 A_1 + c_2 A_2)$ will always be strictly larger than this (unless $c_1 = c_2 = 0$) since it will contain the matrix $c_1 A_1 + c_2 A_2$ itself, which is never a scalar multiple of the identity (unless it's zero).
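This counterexample is easy to confirm numerically; a quick sketch (the commutant dimension is computed as a kernel dimension via $\operatorname{vec}(XC - CX) = (I \otimes X - X^\top \otimes I)\operatorname{vec}(C)$, and the seed is an arbitrary choice):

```python
import numpy as np

def commutant_dim(mats, tol=1e-9):
    # Dimension of {C : [X, C] = 0 for all X in mats}: stack one block
    # (I ⊗ X - X^T ⊗ I) per X and take the nullity of the result.
    d = mats[0].shape[0]
    M = np.vstack([np.kron(np.eye(d), X) - np.kron(X.T, np.eye(d)) for X in mats])
    return d * d - np.linalg.matrix_rank(M, tol=tol)

A1 = np.array([[1, 0], [0, -1]], dtype=complex)
A2 = np.array([[0, 1], [1, 0]], dtype=complex)

rng = np.random.default_rng(42)
c = rng.normal(size=2) + 1j * rng.normal(size=2)  # complex Gaussian coefficients
A = c[0] * A1 + c[1] * A2

print(commutant_dim([A1, A2]))  # 1: only scalar multiples of the identity
print(commutant_dim([A]))       # 2: span of I and A, since A ≠ 0 is not scalar
```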

answered Oct 24 at 18:05 by Qiaochu Yuan
  • That is why I was wondering about some sense of average/probability. The reason I gave that statement relates more to 'sampling': say you first sample a certain number of matrices $A$ with different parameters $\{c_i\}$, and then compute the commutant of each. If I now 'take the average' of the various commutants (if that makes any sense), I would expect the 'average' commutant to be the same as the one corresponding to the single family $\{A_i\}$. – Commented Oct 28 at 14:14
  • @Another: I don't know what you mean by taking the "average" of the commutants. – Commented Oct 29 at 0:23
  • Yeah, me neither. Intersection could work, but it is too strong an operation. I guess the question is not well posed. – Commented Oct 29 at 9:03
