A recent paper on this subject is "Discovering the Capacity of Human Memory," Wang et al., 2003, Brain and Mind, vol. 4, no. 2, pp. 189-198.
The authors estimate the human brain's memory capacity at 10^8432 bits (yes, that's not a typo).
Their basis for this: each possible neural connection path constitutes a memory bit. Once a given pathway is activated, a persistent change is somehow made such that re-activation triggers recall of the memory element (bit?). Thus the maximum capacity is given by the total possible number of connection paths.
Estimates of the number of neurons vary from about 100 to 500 billion, and estimates of the average number of synapses per neuron vary from about 3,000 to 7,000. Each possible pathway from any synapse to any other synapse in the brain constitutes a potential unique memory element. The formula for calculating this is simple -- just count the number of combinations. We'll conservatively assume 100 billion neurons with 3,000 synapses each:
connection possibilities (unique pathways) = n! / (m! * (n - m)!), where
n = number of neurons
m = average number of synapses per neuron
= 10^11! / (3000! * (10^11 - 3000)!)
It takes a special program to calculate such large factorials exactly, but the result the paper reports is 10^8432. (As an aside, plugging m = 3,000 into the formula above actually gives roughly 10^23869; the published 10^8432 corresponds to m = 1,000.)
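You don't actually need a special bignum program if all you want is the order of magnitude -- the log-gamma function handles it in ordinary floating point. A minimal sketch in Python, assuming n = 10^11 and m = 10^3 (the inputs that reproduce the paper's 10^8432):

```python
import math

def log10_binomial(n, m):
    """log10 of C(n, m) = n! / (m! * (n - m)!), computed via log-gamma
    so the huge factorials never have to be formed explicitly."""
    return (math.lgamma(n + 1) - math.lgamma(m + 1)
            - math.lgamma(n - m + 1)) / math.log(10)

n = 10**11  # neurons
m = 10**3   # synapses per neuron (the value that matches 10^8432)
print(f"C(n, m) is about 10^{log10_binomial(n, m):.0f}")
```

Since lgamma returns the natural log of the gamma function, dividing the difference by ln(10) gives the base-10 exponent directly; the factorials themselves would have thousands of digits.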
While I agree that this many potential neural paths exist, I'm not sure 10^8432 bits of storage is possible. The brain contains very roughly 10^26 atoms. Even if you assume each atom has six degrees of freedom, and that memory bits are stored at the atomic level, that's only about 10^27 bits.
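The atom-counting bound above can be checked in a couple of lines. This is only a back-of-the-envelope sketch under my stated assumptions (10^26 atoms, one bit per degree of freedom):

```python
import math

atoms = 1e26        # very rough atom count for the brain
bits_per_atom = 6   # generously, one bit per degree of freedom
total_bits = atoms * bits_per_atom

# log10(6e26) is about 26.8, i.e. on the order of 10^27 bits
print(f"atomic-level storage bound: about 10^{round(math.log10(total_bits))} bits")
```

The gap between this bound and the paper's figure is a factor of roughly 10^8405, which is the crux of my question.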
Is there any way to store more data than you have equivalent storage bits? Is there any conceivable way the brain could store data more densely than the atomic level? If not, it would seem that 10^27 bits, not 10^8432, forms the theoretical upper limit on memory capacity.
I'd appreciate any comments on this; I've tried to figure it out but I'm stumped.