# Eigenvalues and Eigenvectors in Python

In this article, we introduce eigenvalues and eigenvectors, which play a very important role in many applications in science and engineering. "Eigen" is a German word meaning "characteristic".

They have many applications: to name a few, finding the natural frequencies and mode shapes in dynamic systems, solving differential equations, and so on. Even Google’s famous search engine algorithm, PageRank, uses eigenvalues and eigenvectors to assign scores to pages and rank them in search results.

# Eigenvalues and eigenvectors

Let the matrix A act on a column vector x; that is, Ax is a linear transformation of x. There is a special transformation of the following form:

Ax = λx

Where A is an n×n matrix, x is an n×1 column vector (x ≠ 0), and λ is some scalar. Any λ that satisfies the above equation is known as an eigenvalue of the matrix A, while the associated vector x is called an eigenvector corresponding to λ.

# The motivation behind

The motivation behind eigenvalues and eigenvectors is that they help us understand the characteristics of a linear transformation, which makes things easier. We know that a vector x can be transformed into a different vector by multiplying it by A, giving Ax. The effect of the transformation is to scale the length of the vector and/or rotate it. The above equation points out that for some vectors, the effect of the transformation Ax is only a scaling (stretching, compressing, or flipping). The eigenvectors are the vectors that have this property, and the eigenvalues λ are the corresponding scale factors.
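The scaling-only property is easy to check numerically. Here is a minimal NumPy sketch using a hypothetical matrix chosen only for illustration (it is not one of the matrices used later in this article):

```python
import numpy as np

# A hypothetical 2x2 matrix used only for illustration.
A = np.array([[3, 1],
              [0, 2]])

# A generic vector is both rotated and stretched: [1, 1] maps to [4, 2],
# which points in a different direction.
u = np.array([1, 1])
print(A @ u)   # [4 2]

# An eigenvector is only scaled: [1, 0] maps to [3, 0] = 3 * [1, 0],
# so here the eigenvalue (scale factor) is 3.
v = np.array([1, 0])
print(A @ v)   # [3 0]
```

Comparing the two outputs shows the difference: A changes the direction of u, but only changes the length of v.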

Now let’s look at the following example.

Before we dig in, we would like to provide a plot function to see what’s happening:

```python
import numpy as np
import matplotlib.pyplot as plt

plt.style.use('seaborn-poster')
%matplotlib inline

def plot_vect(x, b, xlim, ylim):
    '''
    function to plot two vectors,
    x - the original vector
    b - the transformed vector
    xlim - the limit for x
    ylim - the limit for y
    '''
    plt.figure(figsize = (10, 6))
    plt.quiver(0, 0, x[0], x[1],
        color='k', angles='xy',
        scale_units='xy', scale=1,
        label='Original vector')
    plt.quiver(0, 0, b[0], b[1],
        color='g', angles='xy',
        scale_units='xy', scale=1,
        label='Transformed vector')
    plt.xlim(xlim)
    plt.ylim(ylim)
    plt.xlabel('X')
    plt.ylabel('Y')
    plt.legend()
    plt.show()
```

Now we plot the vector x = [1, 1] and the vector b = Ax, where A = [[2, 0], [0, 1]]:

```python
A = np.array([[2, 0],
              [0, 1]])
x = np.array([1, 1])
b = np.dot(A, x)
plot_vect(x, b, (0, 3), (0, 2))
```

The result will be:

We can see from the generated figure that the original vector x is rotated and stretched after being transformed by A. The vector [1, 1] is transformed to [2, 1].

Now let’s plot the vector x = [1, 0] and the vector b = Ax, where A = [[2, 0], [0, 1]]:

```python
x = np.array([1, 0])
b = np.dot(A, x)
plot_vect(x, b, (0, 3), (-0.5, 0.5))
```

The result will be:

Now we can see that with this new vector, the only thing that changes after the transformation is the length of the vector: it is stretched. The new vector is [2, 0], therefore the transformation is:

Ax = λx

with x = [1, 0] and λ = 2. The direction of the vector doesn’t change at all (no rotation). You can also verify that [0, 1] is another eigenvector; try it yourself.
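Since A is diagonal, both standard basis vectors are eigenvectors, and we can check this with a couple of lines of NumPy:

```python
import numpy as np

A = np.array([[2, 0],
              [0, 1]])

# The eigenvector from the plot: A x = 2 x for x = [1, 0].
x = np.array([1, 0])
print(A @ x)                      # [2 0], i.e. 2 * x

# [0, 1] is indeed another eigenvector; its eigenvalue is 1,
# so the transformation leaves it unchanged.
y = np.array([0, 1])
print(np.allclose(A @ y, 1 * y))  # True
```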

# The characteristic equation

In order to get the eigenvalues and eigenvectors, from Ax = λx we can derive the following form:

(A − λI)x = 0

Where I is the identity matrix with the same dimensions as A. If the matrix A − λI has an inverse, then multiplying both sides by (A − λI)^-1 gives only the trivial solution x = 0. Therefore, when A − λI is singular (no inverse exists), we have a nontrivial solution, which means that the determinant is zero:

det(A − λI) = 0

This equation is called the characteristic equation; it leads to a polynomial equation in λ, which we can then solve for the eigenvalues.

Let’s get the eigenvalues for the matrix [[0, 2], [2, 3]]:

The characteristic equation gives us

det(A − λI) = det([[−λ, 2], [2, 3 − λ]]) = 0

Expanding the determinant, we have

(−λ)(3 − λ) − 2·2 = λ^2 − 3λ − 4 = (λ − 4)(λ + 1) = 0

We get two eigenvalues:

λ1 = 4, λ2 = −1
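For a 2×2 matrix, det(A − λI) always expands to λ^2 − trace(A)·λ + det(A), so we can confirm the hand calculation numerically by finding the roots of that polynomial (a quick sketch, not how one would normally compute eigenvalues in practice):

```python
import numpy as np

A = np.array([[0, 2],
              [2, 3]])

# For a 2x2 matrix, det(A - lambda*I) expands to
# lambda^2 - trace(A)*lambda + det(A).
coeffs = [1, -np.trace(A), np.linalg.det(A)]

# The eigenvalues are the roots of this characteristic polynomial.
eigenvalues = np.roots(coeffs)
print(np.sort(eigenvalues))  # approximately -1 and 4
```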

Now let’s get the eigenvectors for the above two eigenvalues.

Let’s get the first eigenvector for λ1 = 4. We can simply insert it back into (A − λI)x = 0, where we have:

(A − 4I)x = [[−4, 2], [2, −1]] [x1, x2]^T = 0

Therefore, we have two equations:

−4x1 + 2x2 = 0
2x1 − x2 = 0

Both reduce to x2 = 2x1, therefore we can take the first eigenvector as

v1 = k1 [1, 2]^T

where k1 is a nonzero scalar (k1 ≠ 0).

By inserting λ2 = −1 similarly as above, we can get the other eigenvector as the following, where k2 ≠ 0:

v2 = k2 [−2, 1]^T
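The hand-derived eigenpairs are easy to check numerically; taking k1 = k2 = 1 gives v1 = [1, 2] (for λ1 = 4) and v2 = [−2, 1] (for λ2 = −1):

```python
import numpy as np

A = np.array([[0, 2],
              [2, 3]])

# First eigenvector (k1 = 1): A v1 should equal 4 * v1.
v1 = np.array([1, 2])
print(A @ v1)   # [4 8] = 4 * v1

# Second eigenvector (k2 = 1): A v2 should equal -1 * v2.
v2 = np.array([-2, 1])
print(A @ v2)   # [ 2 -1] = -1 * v2
```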

From the above example, we can see how to get the eigenvalues and eigenvectors of a matrix A, and that the choice of eigenvectors for a system is not unique. But things become really complicated with a larger matrix A, when you try to solve the n-th order polynomial characteristic equation. Luckily, we can use the Python programming language to solve the problem.

# Eigenvalues and Eigenvectors in Python

Though the methods we introduced so far look complicated, the actual calculation of eigenvalues and eigenvectors in Python is fairly easy. The main built-in function in Python to solve the eigenvalue/eigenvector problem for a square array is the eig function in numpy.linalg. Let’s see how we can use it.

Let’s calculate the eigenvalues and eigenvectors for the matrix below:

```python
import numpy as np
from numpy.linalg import eig

a = np.array([[0, 2],
              [2, 3]])
w, v = eig(a)
print('E-value:', w)
print('E-vector', v)
```

The result will be:

```
E-value: [-1.  4.]
E-vector [[-0.89442719 -0.4472136 ]
 [ 0.4472136  -0.89442719]]
```
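It is worth checking how this output is organized: each column of v is the eigenvector paired with the eigenvalue in the same position of w, and eig returns the eigenvectors scaled to unit length, i.e. one particular choice of the arbitrary scalars from the hand calculation. A short sketch:

```python
import numpy as np
from numpy.linalg import eig, norm

a = np.array([[0, 2],
              [2, 3]])
w, v = eig(a)

# Each column v[:, i] pairs with eigenvalue w[i],
# so a @ v[:, i] == w[i] * v[:, i] should hold for every i.
for i in range(len(w)):
    print(np.allclose(a @ v[:, i], w[i] * v[:, i]))  # True

# eig returns unit-length eigenvectors.
print(norm(v, axis=0))  # [1. 1.]
```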

Let’s try it with a bigger matrix, such as the one below:

```python
a = np.array([[2, 2, 4],
              [1, 3, 5],
              [2, 3, 4]])
w, v = eig(a)
print('E-value:', w)
print('E-vector', v)
```

The result will be:

```
E-value: [ 8.80916362  0.92620912 -0.73537273]
E-vector [[-0.52799324 -0.77557092 -0.36272811]
 [-0.604391    0.62277013 -0.7103262 ]
 [-0.59660259 -0.10318482  0.60321224]]
```

The programming skills reviewed above are useful in scientific computing for solving many problems in engineering. In the next articles we will follow up on these methods in the field of finance, trying to build quantitative algorithms.

We wish to raise public awareness of computer science topics. NetCoinCapital is a blockchain startup motivated to expand the bounds of science by tracking, researching, and developing state-of-the-art technologies for future concepts.