Mathematicians love finding connections between different fields, but particularly between fields and linear algebra, because we know a lot about linear algebra.

I don’t know of too many instances where thinking of a matrix as a graph led to a surprising proof (but I am curious if you do), but studying graphs via linear algebra is a basic cornerstone of graph theory. Textbooks about this can be found by searching “algebraic graph theory”.

I don’t know anything about any of it. :) I do lots of meta-research. In this case, there’s a lot of work that went and is going into acceleration of matrix operations and (esp recently) graph operations. There’s even graph databases these days. So, anything connecting a concept getting a lot of attention to another concept in a lot of use might be worth posting in case it gives people ideas.

It’s usually the other way around. There is a field called spectral graph theory, which is almost entirely focused on the graph Laplacian (a matrix built from the adjacency matrix and the degree matrix) and uses its eigendecomposition to construct proofs.
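To make that concrete, here is a minimal NumPy sketch (the example graph is my own, purely illustrative choice): it builds the combinatorial Laplacian L = D − A for a small path graph and looks at its eigenvalues. A well-known fact it demonstrates: for a connected graph, the smallest Laplacian eigenvalue is 0.

```python
import numpy as np

# Adjacency matrix of a path graph on 4 vertices: 0-1-2-3
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))  # degree matrix
L = D - A                   # combinatorial graph Laplacian

# L is symmetric, so eigh gives real eigenvalues in ascending order.
eigvals, eigvecs = np.linalg.eigh(L)
print(np.round(eigvals, 6))
# For a connected graph the smallest eigenvalue is 0, with the
# constant vector as its eigenvector; the number of zero eigenvalues
# equals the number of connected components.
```

The eigenvalues here carry real structural information: the second-smallest one (the "algebraic connectivity") is larger the better-connected the graph is.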

One example would be spectral clustering, where the eigenvectors of the Laplacian are used to embed the nodes in a low-dimensional (say, two- or three-dimensional) space. There is also the random walk Laplacian, built from the transition probabilities between the vertices of a graph. Those are just some examples where matrices are used to analyze graphs, and there are many more possible applications.
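A toy version of both ideas, as a hedged sketch (the two-triangle graph is an assumption of mine, chosen because the right partition is obvious by eye): the sign pattern of the second Laplacian eigenvector (the Fiedler vector) recovers the two natural clusters, and the random-walk transition matrix P = D⁻¹A has rows summing to 1.

```python
import numpy as np

# Two triangles {0,1,2} and {3,4,5} joined by the single edge 2-3.
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

deg = A.sum(axis=1)
L = np.diag(deg) - A

# Fiedler vector: eigenvector of the second-smallest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]
clusters = (fiedler > 0).astype(int)  # sign gives a 2-way partition
print(clusters)  # vertices 0-2 land in one cluster, 3-5 in the other

# Random-walk transition matrix: P = D^{-1} A, each row sums to 1.
P = A / deg[:, None]
print(P.sum(axis=1))
```

Full spectral clustering uses several eigenvectors as coordinates and runs k-means on them, but the one-eigenvector sign trick above is the core idea.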

Physics does something similar. If you can make something a harmonic oscillator (mass bouncing on a spring), you’ve hit the jackpot because it’s an incredibly simple and well-studied model system.

If you can find a mapping between your problem’s field and another field, you can use the other field’s proofs to help solve yours.
