Matrix multiplication and adjacency matrix

Proof of the associative law of matrix multiplication:
\[\begin{aligned}((\mathbf{AB})\mathbf{C})[i,j] &=\sum_{l=1}^{c}\left(\sum_{k=1}^{b}\mathbf{A}[i,k]\mathbf{B}[k,l]\right)\mathbf{C}[l,j] \\ &=\sum_{k=1}^{b}\sum_{l=1}^{c}\mathbf{A}[i,k]\mathbf{B}[k,l]\mathbf{C}[l,j] \\ &=\sum_{k=1}^{b}\mathbf{A}[i,k]\left(\sum_{l=1}^{c}\mathbf{B}[k,l]\mathbf{C}[l,j]\right) \\ &=(\mathbf{A}(\mathbf{B}\mathbf{C}))[i,j]\end{aligned}\]

Matrix multiplication admits fast exponentiation precisely because it is associative.
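Because the product is associative, \(G^t\) can be computed by repeated squaring in \(O(n^3\log t)\). A minimal sketch in C++ (the names `Mat`, `mul`, `mat_pow` and the modulus are my own choices, not from the text):

```cpp
#include <cstdint>
#include <vector>

// Square matrices with entries taken modulo MOD.
using Mat = std::vector<std::vector<std::uint64_t>>;
const std::uint64_t MOD = 1'000'000'007ULL;

// Ordinary matrix product, O(n^3).
Mat mul(const Mat& A, const Mat& B) {
    std::size_t n = A.size();
    Mat C(n, std::vector<std::uint64_t>(n, 0));
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t k = 0; k < n; ++k)
            for (std::size_t j = 0; j < n; ++j)
                C[i][j] = (C[i][j] + A[i][k] * B[k][j]) % MOD;
    return C;
}

// Since (AB)C = A(BC), we may square-and-multiply: G^t in O(n^3 log t).
Mat mat_pow(Mat G, std::uint64_t t) {
    std::size_t n = G.size();
    Mat R(n, std::vector<std::uint64_t>(n, 0));
    for (std::size_t i = 0; i < n; ++i) R[i][i] = 1;  // identity matrix
    while (t > 0) {
        if (t & 1) R = mul(R, G);
        G = mul(G, G);
        t >>= 1;
    }
    return R;
}
```

For example, raising the \(2\times 2\) Fibonacci matrix \(\begin{pmatrix}1&1\\1&0\end{pmatrix}\) to the \(t\)-th power yields Fibonacci numbers in its entries.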

Example 1: [TJOI2017] Coke

I believe many people can come up with a \(\Theta(t\times m)\) solution. (I didn't come up with it myself, simply because I'm bad at this.)

First simplify the problem: ignore the operations of staying in place and self-destructing. Then the question is just the number of different paths of \(t\) steps from the starting point.

So how do we solve it?


Set \(Dp\) aside for now.

Let the graph's adjacency matrix be \(G\), and consider what \(G^2\) is. (The power here means the matrix power.)

Consider the contribution of one row and one column separately: take the entries \(G_{a,i}\) and \(G_{i,b}\), and let \(G'\) be the product matrix. Then:

\[G'_{a,b} = \sum_{i=1}^m{G_{a,i}\times G_{i,b}}\]

It is easy to see that a term of this sum is \(1\) if and only if both \(G_{a,i}\) and \(G_{i,b}\) are nonzero, i.e. point \(i\) is adjacent to both \(a\) and \(b\); otherwise the term is zero.

Adding up all these cases gives the number of paths of length \(2\) from \(a\) to \(b\). (That is, the number of ways to go from \(a\) to \(b\) in exactly \(2\) steps; the length is \(2\) because each such path passes through one intermediate point.)

From this we can conclude that the matrix \(G^2\) records, for every pair of points, the number of paths of length \(2\) between them.

Does \(G^3\), then, give the number of paths of length \(3\) between any two points?

Let \(G'=G^2\) and \(G''=G^3\). Then there is:

\[G''=G'\times G\]

\[G''_{a,b}=\sum_{i=1}^n\sum_{j=1}^n{G_{a,i}\times G_{i,j}\times G_{j,b}}\]

The analysis method is the same as above, so we summarize the conclusions as follows:

If \(G\) is the adjacency matrix of a graph, then \(G^i\) gives the number of paths of length \(i\) between any two points.
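As a quick sanity check of this conclusion, here is a toy computation (the names `M3`, `mul3` are illustrative, not from the original text) on a directed \(3\)-cycle \(1\to 2\to 3\to 1\), where every walk of length \(3\) returns to its start:

```cpp
#include <array>

// 3x3 integer matrices, enough for a 3-node graph.
using M3 = std::array<std::array<int, 3>, 3>;

// Plain matrix product; entries count paths, so no modulus is needed here.
M3 mul3(const M3& A, const M3& B) {
    M3 C{};
    for (int i = 0; i < 3; ++i)
        for (int k = 0; k < 3; ++k)
            for (int j = 0; j < 3; ++j)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}
```

On the cycle, \(G^2\) has exactly one length-\(2\) path from each node to the node two steps ahead, and \(G^3\) is the identity matrix: one length-\(3\) path from every node back to itself.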

This solves the simplified version of the example.

So how do we handle self-destruction and staying in place?

It's simple: treat staying in place as a self-loop, and add an extra virtual point to represent self-destruction. Note that the virtual point needs no edge back into the original graph, because after self-destructing we can no longer move.
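One possible way to encode this construction (the function name, the 0-indexed virtual node, and the self-loop kept on the virtual node so that walks ending in self-destruction are still counted at time \(t\) are my own assumptions, not the original author's code):

```cpp
#include <utility>
#include <vector>

// Augmented adjacency matrix for the simplified "Coke" setup.
// Nodes 1..m are positions; node 0 is the virtual "exploded" state.
std::vector<std::vector<int>> build_matrix(
        int m, const std::vector<std::pair<int, int>>& edges) {
    std::vector<std::vector<int>> G(m + 1, std::vector<int>(m + 1, 0));
    for (auto [u, v] : edges) {     // walking along an edge (both directions)
        G[u][v] = 1;
        G[v][u] = 1;
    }
    for (int v = 1; v <= m; ++v) {
        G[v][v] = 1;                // staying in place: a self-loop
        G[v][0] = 1;                // self-destructing: go to the virtual node
    }
    G[0][0] = 1;                    // once exploded, stay exploded;
                                    // no edge leads back into the graph
    return G;
}
```

With this matrix, \(G^t\) counts all operation sequences of length \(t\), including those that end early in an explosion.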

And that solves Example 1.

So is matrix multiplication only good for this one thing?

Example 2: [USACO07NOV] Cow Relays

Problem statement: find the shortest path from \(s\) to \(t\) that passes through exactly \(k\) edges.

At first glance this looks familiar; it seems to be the previous problem with the details changed.

But thinking more carefully, you will find that the shortest-path requirement, seemingly trivial, cannot be handled by plain addition and multiplication.

In fact this makes sense: we know shortest paths are built on the \(Dp\)-like relaxation operation, that is, there is a core operation: \(\min\)!

So can we capture this operation with matrix multiplication?

Consider the process of \(Floyd\): the core code is \(f_{i,j}=\min(f_{i,j},f_{i,k}+f_{k,j})\).

This gives us some inspiration: the process of \(Floyd\) closely resembles matrix multiplication. (\(Floyd\) is essentially a three-dimensional \(Dp\) with one dimension rolled away.)

So let us boldly define a new matrix product:

Let the result of multiplying matrix \(A\) by matrix \(B\) be matrix \(C\).

Then define:

\[C_{a,b}=\min_{i=1}^m\{A_{a,i}+B_{i,b}\}\]

It is easy to check that this matrix product is also associative. (One can verify that \(\min\) and \(+\) have the required properties of the usual binary operations, namely \(+\) distributes over \(\min\), so the proof is the same as for ordinary matrix multiplication.)

So we can directly apply the conclusion from Example 1 and solve the problem with fast exponentiation of this new product.
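A sketch of the \((\min,+)\) product and its fast power (identifier names and the `INF` convention for "no edge" are my own choices):

```cpp
#include <algorithm>
#include <climits>
#include <vector>

// (min, +) matrix "multiplication": entries are path lengths.
// INF/4 leaves headroom so INF + INF does not overflow.
const long long INF = LLONG_MAX / 4;
using MinMat = std::vector<std::vector<long long>>;

MinMat min_plus(const MinMat& A, const MinMat& B) {
    std::size_t n = A.size();
    MinMat C(n, std::vector<long long>(n, INF));
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t k = 0; k < n; ++k)
            for (std::size_t j = 0; j < n; ++j)
                C[i][j] = std::min(C[i][j], A[i][k] + B[k][j]);
    return C;
}

// D^k under (min, +) gives shortest paths using exactly k edges;
// associativity again permits square-and-multiply, assuming k >= 1.
MinMat min_plus_pow(MinMat D, long long k) {
    MinMat R = D;  // R = D^1
    --k;
    while (k > 0) {
        if (k & 1) R = min_plus(R, D);
        D = min_plus(D, D);
        k >>= 1;
    }
    return R;
}
```

For instance, on a path \(0\xrightarrow{2}1\xrightarrow{3}2\), the square of the length matrix gives \(5\) as the shortest \(2\)-edge path from \(0\) to \(2\).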

Example 3: the minimize-the-maximum-edge problem

I could not find the original problem; the national training team paper did not give its source, and I could not track it down either.

The minimize-the-maximum-edge problem: given a directed graph, among all paths between two points that pass through exactly \(k\) edges, find one whose largest edge is smallest.

The same familiar feeling, the same kind of problem.

First consider the problem without the exactly-\(k\)-edges constraint; then the core code of \(Floyd\) becomes:
\[f_{i,j}=\min(f_{i,j},\max(f_{i,k},f_{k,j}))\]

Can we redefine the matrix product in the same way as above? The answer is yes.

Let the result of multiplying matrix \(A\) by matrix \(B\) be matrix \(C\).

Then define:

\[C_{a,b}=\min_{i=1}^m\{\max(A_{a,i},B_{i,b})\}\]

Just apply the above conclusion directly.
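Under this minimize-the-largest-edge reading, edges along a path combine with \(\max\), and the best intermediate point is chosen with \(\min\). A sketch of the product (names are mine; the power routine is the same square-and-multiply as before):

```cpp
#include <algorithm>
#include <climits>
#include <vector>

// (min, max) matrix product: entry = smallest achievable bottleneck.
// INF/2 marks "no edge" and survives taking max without overflow.
const int INF = INT_MAX / 2;
using MM = std::vector<std::vector<int>>;

MM minimax(const MM& A, const MM& B) {
    std::size_t n = A.size();
    MM C(n, std::vector<int>(n, INF));
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t k = 0; k < n; ++k)
            for (std::size_t j = 0; j < n; ++j)
                C[i][j] = std::min(C[i][j], std::max(A[i][k], B[k][j]));
    return C;
}
```

On a small graph with edges \(0\xrightarrow{5}1\), \(1\xrightarrow{2}2\), and \(0\xrightarrow{10}2\), the square of the weight matrix gives bottleneck \(5\) for the best \(2\)-edge path from \(0\) to \(2\).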

Reference: "Application of Matrix Multiplication in Informatics", Yu Huacheng, 2008 national training team paper.
