In a static, spatially infinite and eternal universe with a homogeneous distribution of eternal stars (or a constant homogeneous stellar population), the sum of the fluxes (the amount of energy reaching a surface) of all stars at each point of the universe would be infinite. This is known as Olbers’ paradox. Note that stars (or the stellar population) are not eternal, since the universe is not eternal in the past; this fact alone suffices to solve Olbers’ paradox.
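To see the divergence in the simplest way (a sketch, assuming for illustration a uniform number density $n$ of stars, each with the same luminosity $L$): a spherical shell of radius $r$ and thickness $dr$ around the observer contributes a flux

$$dF = \frac{L}{4\pi r^2}\, n\, 4\pi r^2\, dr = nL\, dr,$$

so the total flux $\int_0^\infty nL\, dr$ diverges: the $1/r^2$ dilution of each star’s light is exactly compensated by the $r^2$ growth in the number of stars per shell.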
There is, however, another way to solve this paradox that does not require a temporally finite universe (or a finite lifetime of the homogeneous stellar population). If one considers a spatially infinite and eternal universe in which space expands (a de Sitter model, with a constant Hubble parameter and hence a strongly accelerated expansion), Olbers’ paradox is also solved. Any radiation background due to the electromagnetic emission of stars (or to anything else, such as the CMB) would lose enough energy through the strong redshift. In such a model the integral (the sum of the flux due to all stars) would be finite, even though it receives a contribution from an infinite number of stars!
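A sketch of why the integral converges (assuming, for illustration, a constant comoving number density $n$ of sources of luminosity $L$, with scale factor $a(t) \propto e^{Ht}$ and $a_0 = 1$): in a de Sitter model the comoving distance to a source at redshift $z$ is $\chi = cz/H$, and the bolometric flux of one source is $L/[4\pi(1+z)^2\chi^2]$, the two factors of $(1+z)$ coming from the energy redshift and from the time dilation of the photon arrival rate. A comoving shell then contributes $dF = nL\,d\chi/(1+z)^2$, and with $d\chi = (c/H)\,dz$,

$$F_{\rm tot} = \frac{nLc}{H}\int_0^\infty \frac{dz}{(1+z)^2} = \frac{nLc}{H},$$

which is finite even though an infinite number of sources contribute.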
In other expanding models this is not true. First, they are not eternal (the de Sitter model is the only one without an initial singularity). But, apart from this, one could consider a situation in the very far future in which a (non-de Sitter) universe is very old and the flux of a great many stars is reaching each point (assuming again a constant stellar population). In such a case the expansion does redshift the radiation background, but not enough to “dilute” the total energy flux, which would keep increasing with time. This is because the Hubble parameter decreases with time in every model that is not de Sitter.
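One way to see this (a sketch, assuming the stars inject radiation at a constant rate $nL$ per unit proper volume): the energy density $u$ of the stellar radiation background obeys

$$\frac{du}{dt} + 4H(t)\,u = nL,$$

since free radiation dilutes as $a^{-4}$. With $H$ constant (de Sitter) this settles to the finite equilibrium value $u = nL/(4H)$; but if $H(t)$ decreases toward zero at late times, as in matter- or radiation-dominated models, the loss term keeps weakening and $u$ grows without bound, so the total flux at each point increases with time.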