Eigenvalues and eigenvectors of a square matrix

The following function computes the eigenvalues and eigenvectors of a square matrix, but I don’t understand what W is:

[V, LAMBDA, W] = eig(A), where A is a square matrix.

V contains the right eigenvectors of the square matrix A.

And I think W contains the left eigenvectors of A.

In Octave I get A * V = V * LAMBDA, but neither A * W = W * LAMBDA nor W * A = W * LAMBDA holds (where * is matrix multiplication).

How is that?

IIUC, you’ll need to transpose the matrix of left eigenvectors:

A = rand (2);
[v, lambda, w] = eig (A);

# right eigenvectors satisfy A*v == v*lambda
abs (A * v - v * lambda) < eps ()
# left eigenvectors enter via the (conjugate) transpose: w'*A == lambda*w'
abs (w' * A - lambda * w') < eps ()

The last two expressions should each result in a 2x2 matrix of true values (possibly with a larger tolerance than eps, depending on the random input).
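For what it’s worth, the same relation can be checked in plain numpy (a minimal sketch, not Octave’s actual implementation): if A = V D V⁻¹, then the rows of V⁻¹ are (unnormalized) left eigenvectors, so they satisfy the transposed relation directly. Note that Octave’s W is scaled differently, but it spans the same vectors.

```python
import numpy as np

# Hypothetical small example: a seeded random 2x2 matrix.
rng = np.random.default_rng(0)
A = rng.random((2, 2))

lam, V = np.linalg.eig(A)   # right eigenvectors: A @ V == V @ D
D = np.diag(lam)
Wh = np.linalg.inv(V)       # rows of inv(V) act as left eigenvectors

print(np.allclose(A @ V, V @ D))    # right relation  -> True
print(np.allclose(Wh @ A, D @ Wh))  # left relation   -> True
```

The key point is the same as above: the left-eigenvector relation only holds once the matrix is transposed, with lambda multiplying from the left.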

Some examples in the docstring might indeed be helpful.
