If possible, using elementary row transformations, find the inverse of the matrices.

\(\begin{bmatrix} 2 & 0 & -1 \\[0.3em] 5 & 1 & 0 \\[0.3em] 0 & 1 & 3 \end{bmatrix}\)

1 Answer

Best answer

Let A = \(\begin{bmatrix} 2 & 0 & -1 \\[0.3em] 5 & 1 & 0 \\[0.3em] 0 & 1 & 3 \end{bmatrix}\)

To apply elementary row transformations, we write:

A = IA, where I is the identity matrix.

We then apply row operations so that the A on the left-hand side is reduced to I; applying the same operations to the I on the right-hand side produces a matrix X such that

I = XA

This X is the inverse of A, i.e. X = \(A^{-1}\).

Note: never apply row and column transformations simultaneously to the same matrix.

So we have:

Applying \(R_2 \to R_2 - 5R_3\)
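The remaining row operations are cut off above. As a cross-check on the method, here is a minimal sketch of the same idea in Python: augment A with I, row-reduce the left block to I, and read off \(A^{-1}\) from the right block. The function name and the use of exact fractions are choices for this illustration, not part of the original answer.

```python
from fractions import Fraction

def inverse_gauss_jordan(a):
    """Invert a square matrix by row-reducing the augmented matrix [A | I]."""
    n = len(a)
    # Augment A with the identity: the row operations that turn A into I
    # simultaneously turn I into A^-1 (this is the A = IA bookkeeping).
    aug = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(n)]
           for i, row in enumerate(a)]
    for col in range(n):
        # Pick a row with a nonzero entry in this column (avoids dividing by zero).
        pivot = next(r for r in range(col, n) if aug[r][col] != 0)
        aug[col], aug[pivot] = aug[pivot], aug[col]
        # Scale the pivot row so the pivot entry becomes 1.
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        # Eliminate this column's entry from every other row.
        for r in range(n):
            if r != col and aug[r][col] != 0:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    # The right block of the reduced augmented matrix is A^-1.
    return [row[n:] for row in aug]

A = [[2, 0, -1],
     [5, 1, 0],
     [0, 1, 3]]
inv = inverse_gauss_jordan(A)
print(inv)  # det(A) = 1 here, so every entry is a whole number
```

For this matrix the reduction gives \(A^{-1} = \begin{bmatrix} 3 & -1 & 1 \\[0.3em] -15 & 6 & -5 \\[0.3em] 5 & -2 & 2 \end{bmatrix}\), which you can confirm by checking that \(AA^{-1} = I\).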


...