By definition, matrix multiplication is possible only if the number of columns of the first factor equals the number of rows of the second. Therefore a row vector can be multiplied only by a matrix whose number of rows equals the number of elements in the row vector. Similarly, a column vector can be multiplied only by a matrix whose number of columns equals the number of elements in the column vector.
Matrix multiplication is non-commutative: if A and B are matrices, then in general A*B ≠ B*A. Moreover, the existence of the product A*B does not guarantee the existence of the product B*A. For example, if matrix A is 3*4 and matrix B is 4*5, then the product A*B is a matrix of size 3*5, while B*A is not defined.
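The dimension rule above can be sketched as a small shape check in plain Python; the function name product_shape is hypothetical, introduced here only for illustration.

```python
def product_shape(shape_a, shape_b):
    """Return the shape of A*B given the shapes (rows, cols) of A and B,
    or None if the product is not defined."""
    rows_a, cols_a = shape_a
    rows_b, cols_b = shape_b
    # The product exists only when columns of the first factor
    # match rows of the second.
    if cols_a != rows_b:
        return None
    return (rows_a, cols_b)

print(product_shape((3, 4), (4, 5)))  # (3, 5): A*B is defined
print(product_shape((4, 5), (3, 4)))  # None: B*A is not defined
```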
Suppose we are given a row vector A = [a1, a2, a3 ... an] and a matrix B of dimension n*m with elements:
[b11, b12, b13, ... b1m;
b21, b22, b23, ... b2m;
...
bn1, bn2, bn3, ... bnm].
Then the product A*B is a row vector of dimension 1*m whose elements are:

cj = ∑ ai*bij (sum over i = 1 ... n, for each j = 1 ... m).

In other words, to find the jth element of the product, multiply each element of the row vector by the corresponding element of the jth column of the matrix and sum these products.
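The procedure just described can be sketched in plain Python; the helper name row_times_matrix is hypothetical, and vectors and matrices are represented as lists and nested lists under that assumption.

```python
def row_times_matrix(a, B):
    """Multiply a row vector a (length n) by a matrix B (n rows, m columns)."""
    n, m = len(B), len(B[0])
    # Columns of the first factor must equal rows of the second.
    assert len(a) == n
    # c_j = sum over i of a_i * b_ij
    return [sum(a[i] * B[i][j] for i in range(n)) for j in range(m)]

print(row_times_matrix([1, 2], [[3, 4, 5],
                                [6, 7, 8]]))  # [15, 18, 21]
```

Each output element pairs the row vector with one column of the matrix, exactly as in the summation formula above.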
Similarly, if we are given a matrix A of dimension m*n and a column vector B of dimension n*1, then their product is a column vector of dimension m*1 whose ith element equals the sum of the products of the elements of the column vector B and the corresponding elements of the ith row of matrix A.
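The matrix-times-column case can be sketched the same way; the helper name matrix_times_column is hypothetical, with the column vector stored as a flat list for simplicity.

```python
def matrix_times_column(A, b):
    """Multiply a matrix A (m rows, n columns) by a column vector b (length n)."""
    m, n = len(A), len(A[0])
    # Columns of the matrix must equal the length of the column vector.
    assert len(b) == n
    # The ith result element pairs row i of A with the column vector b.
    return [sum(A[i][j] * b[j] for j in range(n)) for i in range(m)]

print(matrix_times_column([[1, 2],
                           [3, 4],
                           [5, 6]], [10, 100]))  # [210, 430, 650]
```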
If A is a row vector of dimension 1*n and B is a column vector of dimension n*1, then the product A*B is a number equal to the sum of the products of the corresponding elements of the vectors:

c = ∑ai*bi (i = 1 ... n).

This number is called the scalar product, or inner (dot) product, of the vectors.
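The scalar product reduces to a single sum; a minimal sketch, with the hypothetical function name dot:

```python
def dot(a, b):
    """Scalar (inner) product of two vectors of equal length."""
    assert len(a) == len(b)
    # c = sum over i of a_i * b_i
    return sum(x * y for x, y in zip(a, b))

print(dot([1, 2, 3], [4, 5, 6]))  # 32 = 1*4 + 2*5 + 3*6
```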
The result of the multiplication B*A is a square matrix of dimension n*n with elements:

cij = bi*aj (i = 1 ... n, j = 1 ... n).

Such a matrix is called the outer product of the vectors.
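The outer product can be sketched as a nested list comprehension; the function name outer is hypothetical, with the column vector b and row vector a again stored as flat lists.

```python
def outer(b, a):
    """Outer product B*A of a column vector b (n*1) and a row vector a (1*n),
    returned as an n*n nested list."""
    # c_ij = b_i * a_j: each row of the result is the row vector a
    # scaled by one element of the column vector b.
    return [[bi * aj for aj in a] for bi in b]

print(outer([1, 2], [3, 4, 5]))  # [[3, 4, 5], [6, 8, 10]]
```

Note that every row of the result is a multiple of the row vector a, which is why the outer product of two nonzero vectors always has rank 1.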