
What is the difference between Matrices and Linear Algebra?

Linear algebra is an extremely important part of mathematics and finds wide-scale application in many domains such as engineering, science, statistics and physics. Linear algebra is concerned with algebraic equations that are linear in nature; that is, every independent variable appears only to the first power, irrespective of the number of independent variables in the equation. A common representation of a linear equation is the following:

m₁x₁ + m₂x₂ + … + mₙxₙ = p
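A system of such linear equations can be solved numerically. Below is a minimal sketch using NumPy; the coefficients and right-hand sides are made-up illustrative values, not taken from the article.

```python
import numpy as np

# Coefficients m_i of two linear equations in two unknowns x1, x2
# (illustrative values only).
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
p = np.array([5.0, 10.0])  # right-hand sides of the two equations

x = np.linalg.solve(M, p)  # solve the linear system M @ x = p
print(x)                   # [1. 3.]
```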

Linear algebra is also fundamental for linear mappings and their representation in vector spaces through the use of matrices. In general, it can be said that linear algebra is essential for almost every branch of mathematics. For example, linear algebra plays a key role in the current understanding and study of geometry, including the definition of rudimentary objects such as planes and lines. The beauty of linear algebra and matrices is that they fit easily into any branch of science, because they are the simplest and yet a powerful way of representing any phenomenon through a mathematical model. Systems which are nonlinear in nature are difficult to model with linear algebra, but here too, first-order approximations are made using linear maps.
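To make that last point concrete, here is a minimal sketch of a first-order (linear) approximation of a nonlinear map; the function f, its Jacobian, and the point p are made-up illustrative choices, not from the article.

```python
import numpy as np

def f(v):
    """A made-up nonlinear map from R^2 to R^2 (illustrative only)."""
    x, y = v
    return np.array([np.sin(x) + y**2, x * y])

def jacobian(v):
    """Hand-computed Jacobian of f: the matrix of first partial derivatives."""
    x, y = v
    return np.array([[np.cos(x), 2 * y],
                     [y,         x    ]])

p = np.array([0.5, 1.0])
d = np.array([0.01, -0.02])          # a small displacement from p

exact  = f(p + d)
linear = f(p) + jacobian(p) @ d      # first-order (linear) approximation
print(exact, linear)                 # the two agree to first order in d
```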

A matrix, on the other hand, is a rectangular array of numbers, symbols or expressions, arranged in rows and columns. Matrices are used to represent mathematical objects or to highlight the properties of a mathematical function. An example of a matrix is as follows:

[ 1  2 ]
[ 3  4 ]

The above is a matrix having two rows and two columns, and is thus read as a two-by-two matrix, or a matrix of dimension 2×2. Matrices are an integral part of linear algebra, because matrices allow linear maps to be represented, thereby permitting computations in linear algebra to be carried out more easily. However, it must be noted that not every use of a matrix is connected to linear algebra; this is especially true in graph theory, where a matrix may simply record which vertices of a graph are joined by an edge. Square matrices (as in the example above) are matrices having the same number of rows and columns. These matrices play an important role in matrix theory. The square matrices of a given dimension form a noncommutative ring. The determinant of a square matrix is a number associated with the matrix, and this number is instrumental in the study of the matrix. Its importance can be seen from two facts: a square matrix is invertible if and only if its determinant is non-zero, and the eigenvalues of a square matrix are the roots of its characteristic polynomial, det(A − λI).

In geometry, matrices find widespread use for specifying and demonstrating geometric transformations and changes of coordinates. In numerical analysis, too, matrix computation is a very useful tool, reducing computational problems to matrix form. Matrices are used in many areas of mathematics and other scientific fields, either directly or through their use in numerical analysis and geometry.
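These determinant facts are easy to check numerically. Below is a minimal sketch using NumPy; the matrix entries are made-up illustrative values.

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

det = np.linalg.det(A)
print(det)                   # 10.0: non-zero, so A is invertible

print(np.linalg.inv(A) @ A)  # approximately the 2x2 identity matrix

# Eigenvalues are the roots of the characteristic polynomial det(A - lambda*I).
print(np.linalg.eigvals(A))  # [5. 2.]
```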

Matrices are basically objects that are used to hold data and that are operated on with one another in specific ways. One can say that matrices form a kind of algebraic structure, and some collections of matrices have even more structure, for example forming a group (such as the invertible square matrices under multiplication). Linear algebra is much more than knowing how to add and multiply matrices with one another. Now, the question is: why do matrices find such wide-scale use in linear algebra? The simple answer is that matrices are a suitable framework to work in because we already know from mathematics that any linear map from one finite-dimensional vector space to another finite-dimensional vector space can be represented by a matrix.
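This representation can be constructed directly: the columns of the matrix are the images of the basis vectors. Here is a minimal sketch in NumPy; the linear map T is a made-up illustrative example.

```python
import numpy as np

def T(v):
    """A made-up linear map from R^2 to R^3 (illustrative only)."""
    x, y = v
    return np.array([x + y, 2 * x, y - x])

# The matrix of T: its columns are the images of the standard basis vectors.
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
M = np.column_stack([T(e1), T(e2)])
print(M)

v = np.array([3.0, -1.0])
print(T(v), M @ v)  # both give the same vector: the matrix represents T
```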

Matrix theory can be considered a specialization of linear algebra to finite-dimensional vector spaces, and it is useful for doing explicit manipulations once a basis has been fixed. More precisely, the algebra of n×n matrices with coefficients in a field F is isomorphic to the algebra of F-linear homomorphisms from an n-dimensional vector space V over the same field F to itself, and the choice of such an isomorphism is precisely the choice of a basis for V. Sometimes one needs solid, concrete calculations, for which matrices are the tool of choice. But for understanding on a conceptual level, for application to wider contexts, and for overall mathematical simplicity, the abstract approach of vector spaces and linear transformations has been found to be more suitable. This second approach also carries linear algebra over to more general settings such as modules over rings (PIDs, for instance), functional analysis, homological algebra, representation theory, etc.
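The dependence on the choice of basis can be seen concretely: the same linear map has different matrices in different bases, related by conjugation with the change-of-basis matrix. A minimal sketch in NumPy, with made-up illustrative values:

```python
import numpy as np

# A linear map on R^2, written as a matrix in the standard basis.
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# A different basis for R^2: the columns of P are the new basis vectors.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# The matrix of the *same* linear map, expressed in the new basis.
M_new = np.linalg.inv(P) @ M @ P
print(M_new)  # [[2. 0.], [0. 3.]]: same map, different matrix
```

Choosing a basis is exactly what picks out one particular isomorphism between linear maps and matrices; change the basis and the matrix changes, while the underlying map does not.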

 

Can you solve this question? If

A = [ 1  0  0 ]
    [ 0  1  0 ]
    [ a  b −1 ]

then A² is equal to (a) a null matrix, (b) a unit matrix, (c) −A, (d) A.
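As a quick check, the product can be computed symbolically; a minimal sketch using SymPy, where the symbols a and b are the entries of A above:

```python
import sympy as sp

a, b = sp.symbols('a b')
A = sp.Matrix([[1, 0, 0],
               [0, 1, 0],
               [a, b, -1]])

A2 = A * A
print(A2)               # Matrix([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
print(A2 == sp.eye(3))  # True: A squared is the 3x3 unit matrix
```

The third row of A² is (a·1 − 1·a, b·1 − 1·b, (−1)·(−1)) = (0, 0, 1), so A² is the identity matrix and the answer is (b), a unit matrix.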

 
