# Matrices in JavaScript: Multi-Dimensional Arrays


A multi-dimensional array is an array with more than one level or dimension. It is commonly used to store data for mathematical computations, image processing, and record management.

For example, a standard chess board has 8 rows and 8 columns, which can easily be represented as a multi-dimensional array of size 8 by 8 (8 rows, each with the capacity to store 8 elements).
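As a quick sketch of that idea, we can build an empty 8-by-8 board as an array of 8 row arrays (the variable name `board` and the string `"rook"` are just illustrative choices here):

```javascript
// An 8×8 chess board built as an array of 8 rows,
// each row being an array of 8 squares (null = empty square).
const board = Array.from({ length: 8 }, () => new Array(8).fill(null));

board[0][0] = "rook"; // place a piece at row 0, column 0

console.log(board.length);    // 8 rows
console.log(board[0].length); // 8 columns in the first row
```

Note that `Array.from` calls the callback once per row, so each row is a distinct array rather than eight references to the same one.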

However, JavaScript does not support multi-dimensional arrays natively.

Some programming languages, like C# and Java, let you declare multi-dimensional arrays directly. Not JavaScript; in JavaScript you have to build them yourself!
Having to build them helps us understand how they really work.

We will be focusing on two-dimensional Arrays in this blog.

## Defining Matrices in JavaScript

A two-dimensional array is a collection of items that are organized as a matrix of rows and columns.

Picture a 3-by-3 grid of letters, a through i. If we look closely, we can see that the first row of letters is just a normal JS array: firstRow = [a, b, c]. The second and third rows are one-dimensional arrays as well:
secRow = [d, e, f]
thirdRow = [g, h, i]

We can think of the columns the same way:

firstColumn = [a,d,g]
and so on…

To put it simply, we can create a matrix by having a one-dimensional Array, where each entry is another one-dimensional Array!
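The idea above can be sketched directly in code, reusing the row names from the example (firstRow, secRow, thirdRow):

```javascript
// Each row of the letter grid is an ordinary one-dimensional array.
const firstRow = ["a", "b", "c"];
const secRow   = ["d", "e", "f"];
const thirdRow = ["g", "h", "i"];

// The matrix itself is a one-dimensional array whose
// entries are the three row arrays.
const matrix = [firstRow, secRow, thirdRow];

console.log(matrix[1]); // ["d", "e", "f"] — the second row
```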

### It’s just an Array of Arrays!

Just as we need indices to reference items of a one-dimensional array, the same applies to two-dimensional arrays.
For instance, we can access the letter “a” by specifying both its row index and its column index: it sits in the first row and the first column.

Basically, if we want to reference an item from a 2D array, it goes as follows:

`arrayName[rowIndex][columnIndex]`

Let’s code it and see how it works!
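Here is a minimal sketch of that indexing pattern, using the letter matrix from above (remember that JS indices start at 0, so the first row is row 0):

```javascript
const matrix = [
  ["a", "b", "c"],
  ["d", "e", "f"],
  ["g", "h", "i"],
];

// arrayName[rowIndex][columnIndex]
console.log(matrix[0][0]); // "a" — first row, first column
console.log(matrix[2][1]); // "h" — third row, second column
```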