Ok, first the Einstein summation convention.
Whenever the same index appears in an expression once raised and once lowered, a summation over that index is implied. For example, if we take the "dot product" of two vectors [math]\vec{A}[/math] and [math]\vec{B}[/math] in Euclidean space:
[math]\sum_iA^iB_i=A^iB_i[/math], where i denotes a spatial index (as it does in SR).
Here the index is the same in both factors, with one raised and one lowered. The following, however, are not summations over the index:
[math]A^{\alpha}B^{\alpha}[/math], [math]A_{\alpha}B_{\alpha}[/math], [math]A^{\alpha}B_{i}[/math]
etc.
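If you want to see the convention in action numerically, numpy's einsum function implements exactly this "repeated index is summed" rule. A minimal sketch (the component values are just made up):

[code]
import numpy as np

A = np.array([1.0, 2.0, 3.0])  # components A^i
B = np.array([4.0, 5.0, 6.0])  # components B_i

# 'i,i->': the index i appears twice, so it is summed over;
# the empty output slot means the result is a scalar.
dot = np.einsum('i,i->', A, B)
print(dot)  # 32.0, the same as sum_i A^i B_i
[/code]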
Vector components have their index raised and the i-th basis vector has its index lowered. One-form components have a lowered index and the i-th basis one-form has its index raised.
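Written out, then, a vector expands over the basis vectors and a one-form over the basis one-forms (here [math]\vec{e}_i[/math] and [math]\tilde{\omega}^i[/math] are just my choice of notation for the bases):

[math]\vec{A}=A^i\vec{e}_i, \qquad \tilde{p}=p_i\tilde{\omega}^i[/math]

with the summation convention applying in both cases, since each expression has the same index once raised and once lowered.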
I shall digress slightly before discussing the metric to describe some aspects of tensor algebra. The type of a tensor is usually written as a pair of numbers, one above the other: the number on top denotes the number of contravariant indices and the number on the bottom the number of covariant indices; their sum is the rank. A tensor of type (M N) is a linear mapping of M one-forms (covariant vectors) and N vectors (contravariant vectors) into a scalar. The old-fashioned terms covariant and contravariant arose because those quantities transform in the same manner as (co), or oppositely (contra) to, the vector basis (that's just a usual vector basis, such as [math]\vec{i}, \vec{j}, \vec{k}[/math]). So a (2 0) tensor (such as the stress-energy tensor) is a linear mapping of 2 one-forms into a scalar. It is very important that you understand that a tensor is a linear mapping.
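To make "linear mapping" concrete, here is a sketch of a (2 0) tensor eating two one-forms, [math]T(\tilde{p},\tilde{q})=T^{\alpha\beta}p_{\alpha}q_{\beta}[/math]. The component values of T, p and q below are invented purely for illustration (this is not the actual stress-energy tensor of anything):

[code]
import numpy as np

T = np.arange(16.0).reshape(4, 4)   # components T^{ab}, a,b = 0..3
p = np.array([1.0, 0.0, 0.0, 0.0])  # one-form components p_a
q = np.array([0.0, 1.0, 0.0, 0.0])  # one-form components q_b

# Both slots filled: the repeated indices a and b are summed over.
scalar = np.einsum('ab,a,b->', T, p, q)
print(scalar)  # 1.0, which picks out the component T^{01}

# Linearity in each slot is easy to check numerically:
assert np.isclose(np.einsum('ab,a,b->', T, 2 * p, q), 2 * scalar)
[/code]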
Now for the metric. The metric you will be familiar with, and yet not familiar with due to its nature, is the Euclidean metric. Its signature is 3 in E³ (the signature being the trace of the metric in its canonical diagonal form), and its components are given by the Kronecker delta, [math]\delta_{ij}[/math], which equals 1 if [math]i=j[/math] and 0 otherwise (as a matrix, the identity matrix). The metric is best defined as a symmetric (0 2) tensor, i.e. a linear mapping of two vectors into a scalar, and allows one to define a distance. The linear map is defined as the dot product of the two vectors the tensor operates on, and the components of the metric are the values of the dot products of the basis vectors. Let's give an example. In Cartesian 2-space we have as the basis vectors [math]\hat{i}, \hat{j}[/math]; then the components of the metric are [math]\hat{i}\cdot\hat{j}=0, \hat{j}\cdot\hat{i}=0, \hat{i}\cdot\hat{i}=1[/math] and [math]\hat{j}\cdot\hat{j}=1[/math]. And so the dot product of two vectors [math]\vec{A}[/math] and [math]\vec{B}[/math] is simply [math]g_{ij}A^iB^j=\sum_iA^iB^i[/math]. Different spaces have different basis vectors and hence different metrics; I mentioned the Schwarzschild metric earlier.
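The same idea as a numerical sketch, assuming numpy again; the polar-coordinate example at the end shows how a different basis gives different metric components:

[code]
import numpy as np

# Cartesian 2-space: g_ij is the Kronecker delta (the identity matrix),
# so g(A, B) = g_ij A^i B^j reduces to the familiar dot product.
g_cart = np.eye(2)
A = np.array([3.0, 4.0])  # components A^i
B = np.array([1.0, 2.0])  # components B^j
print(np.einsum('ij,i,j->', g_cart, A, B))  # 11.0 = 3*1 + 4*2

# Plane polar coordinates (r, theta): the metric components are
# diag(1, r^2), since ds^2 = dr^2 + r^2 dtheta^2.
r = 2.0
g_polar = np.diag([1.0, r**2])
V = np.array([1.0, 1.0])  # components (V^r, V^theta)
print(np.einsum('ij,i,j->', g_polar, V, V))  # 5.0 = 1 + r^2
[/code]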
In SR the components of the (Lorentz) metric are [math]\eta_{\alpha\beta}[/math], which is the same as the metric of ordinary Euclidean 4-space except that the 00 (time-time) component is -1 rather than +1. Research metrics for a detailed and much better definition; mine is slightly weak and lacks detail, the price of brevity I suppose.
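A final sketch, with a made-up displacement, just to show that -1 at work:

[code]
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])  # Lorentz metric eta_{alpha beta}
dx = np.array([2.0, 1.0, 0.0, 0.0])   # (c*dt, dx, dy, dz), invented values

# The invariant interval ds^2 = eta_{ab} dx^a dx^b
interval = np.einsum('ab,a,b->', eta, dx, dx)
print(interval)  # -3.0: negative, so this separation is timelike
[/code]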