# Maximum theoretical data density

+ 10 like - 0 dislike
1864 views

Our ability to store data on or in physical media continues to grow, with the maximum amount of data you can store in a given volume increasing exponentially from year to year. Storage devices keep getting smaller while their capacities keep getting bigger.

This can't continue forever, though, I would imagine. "Things" can only get so small; but what about information? How small can a single bit of information be?

Put another way: given a limited physical space -- say 1 cubic centimeter -- and without assuming more dimensions than we currently have access to, what is the maximum amount of information that can be stored in that space? At what point does the exponential growth of storage density come to such a conclusive and final halt that we have no reason to even attempt to increase it further?

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user tylerl
this is a great question having to do with Bousso's covariant entropy bound - see my answer

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
a hydrogen atom has infinitely many energy eigenstates...

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Mark Eichenlaub
@MarkEichenlaub But surely the higher and higher energy eigenstates fill up more and more space: IIRC there is no bound on the eigenstate "size" as you go higher in energy.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user WetSavannaAnimal aka Rod Vance

+ 9 like - 0 dislike

The answer is given by the covariant entropy bound (CEB), also referred to as the Bousso bound after Raphael Bousso, who first suggested it. The CEB sounds very similar to the holographic principle (HP) in that both relate the dynamics of a system to what happens on its boundary, but the similarity ends there.

The HP suggests that the physics (specifically supergravity, or SUGRA) in a $d$-dimensional spacetime can be mapped to the physics of a conformal field theory living on its $(d-1)$-dimensional boundary.

The CEB is more along the lines of the Bekenstein-Hawking entropy, which says that the entropy of a black hole is proportional to the area of its horizon:

$$S = \frac{k_B A}{4 \, l_{pl}^2}$$

To cut a long story short, the maximum information you can store in $1\,\text{cc} = 10^{-6}\ m^3$ of space is proportional to the area of its boundary. For a spherical volume, that area is of order

$$A \sim V^{2/3} = 10^{-4}\ m^2$$

Therefore the maximum information (number of bits) you can store is approximately given by

$$S \sim \frac{A}{A_{pl}}$$

where $A_{pl}$ is the Planck area, $\sim 10^{-70}\ m^2$. For our $1\,\text{cc}$ volume this gives $S_{max} \sim 10^{66}$ bits.

Of course, this is a rough order-of-magnitude estimate, but it lies in the general ballpark and gives you an idea of the limit you are asking about. As you can see, we still have decades if not centuries before our technology can saturate this bound!
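As a rough numerical check of the estimate above (the constants are the standard defined/CODATA values; the sphere area differs from the bare $V^{2/3}$ only by an $O(1)$ factor, which is irrelevant at this level of precision):

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newton's constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# Planck length and Planck area
l_pl = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m
A_pl = l_pl**2                      # ~2.6e-70 m^2

# Sphere of volume 1 cc = 1e-6 m^3
V = 1e-6
r = (3 * V / (4 * math.pi)) ** (1 / 3)
A = 4 * math.pi * r**2              # ~4.8e-4 m^2

# Holographic bound on the bit count, S ~ A / A_pl
S_max = A / A_pl
print(f"l_pl  ~ {l_pl:.2e} m")
print(f"A     ~ {A:.2e} m^2")
print(f"S_max ~ {S_max:.2e} bits")
```

Running this gives $S_{max}$ of order $10^{66}$, confirming the back-of-the-envelope figure.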

Cheers,


Edit: Thanks to @mark for pointing out that $1 cc = 10^{-6} m^3$ and not $10^{-9} m^3$. Changes final result by three orders of magnitude.

On Entropy and Planck Area

In response to @david's observations in the comments let me elaborate on two issues.

1. Planck Area: From LQG (loop quantum gravity), and also string theory, we know that geometric observables such as area and volume are quantized in a theory of gravity. This result holds at the kinematical level and is independent of the actual dynamics. The quantum of area, as one would expect, is of order $\sim l_{pl}^2$, where $l_{pl}$ is the Planck length. In quantum gravity the dynamical entities are precisely these area elements, to each of which one associates a spin variable $j$, generically in the lowest representation of SU(2), $j = 1/2$. Each such spin can carry a single qubit of information. Thus it is natural to associate a Planck area with a single unit of information.

2. Entropy as a measure of Information: There is a widespread misunderstanding in the physics community regarding the relationship between entropy $S$ -- usually described as a measure of disorder -- and useful information $I$, such as that stored on a chip, an abacus, or any other device. In fact they are one and the same. I remember being laughed out of a physics chat room once for saying this, so I don't expect anyone to take it at face value. Recall Boltzmann's formula:

$$S = k_B \ln(N)$$

where $k_B$ is Boltzmann's constant and $N$ is the number of microscopic configurations of the system. For a gas in a box, for example, $N$ counts the number of different ways to distribute the molecules in the given volume. If we could actually use a gas chamber as an information-storage device, each of these configurations would correspond to a unit of memory. Or consider a spin chain with $m$ spins. Each spin can take two (classical) values, $\pm 1/2$. Using each spin to represent a bit, we see that a chain of length $m$ can encode $2^m$ different numbers. The corresponding entropy is

$$S \sim \ln(2^m) = m \ln 2 \sim \textrm{number of bits}$$

since we have identified each spin with a bit (more precisely, a qubit). Therefore we can safely say that the entropy of a system is proportional to the number of bits required to describe it, and hence to its storage capacity.
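The spin-chain counting can be verified directly; the chain length chosen here is an arbitrary illustration:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

m = 8                    # spins in the chain (arbitrary example)
N = 2**m                 # number of distinct configurations
S = k_B * math.log(N)    # Boltzmann entropy S = k_B ln N

# Entropy expressed in units of k_B ln 2, i.e. in bits
bits = S / (k_B * math.log(2))
print(bits)  # 8.0 -> exactly one bit per spin
```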

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
answered Dec 27, 2010 by (1,985 points)
typo: should be $1cc = 10^{-6}m^3$, $A = 10^{-4}m^2$

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Mark Eichenlaub
@Bruce: $V_2$ obviously; the whole point of holography is that volume doesn't matter at all, only area does :-) Of course, I am not sure to what degree this has been proved (as in calculated microscopically) for generic surfaces (not smooth even?) rather than for horizons of quite generic BH.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Marek
This is a nice answer, but I have to wonder, how much of this information can actually be exploited. It's clear that you have given absolute lower bound on that information. But in reality, we wouldn't be able to modify and read bits from BH's horizon. So I guess a bigger lower bound should exist. Or are you suggesting that all of that holographic information can somehow be managed, even in principle?

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Marek
@marek - good point (two comments up). I was thinking along similar lines. Interestingly this line of reasoning sheds light on the geometric nature of the entropy bound, so is worth pursuing in greater detail.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
@marek - this is not a lower bound. It is an upper bound. It determines the maximum amount of information that you can store in a given region. Or am I misunderstanding you? Secondly, I'm not suggesting anything about how such information can be managed. That is a separate question that will lead us to consider limits on information processing and transfer as opposed to storage. There is an interesting article on this by JB Pendry available here

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
In the mathematical discipline of information theory, the entropy of a message is the expected value of the information contained in that message. The formulas are the same, so it shouldn't be surprising that entropy is a measure of information content in physical systems as well.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Nick Alger
Is the answer also true shortly after the big bang ?

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user jjcale
+ 2 like - 0 dislike

OK then: let's say we have a given volume and nano-molecular data-retrieval technology. Assuming you want the data safe, retrievable, and made of long-term-stable atoms, what is the maximum amount of data that can usefully be stored?

First, we need half of the total volume for a single molecular layer of the chosen molecule; this will be the "platter" for our "hard drive".

Onto this you place the atoms that represent bits, so the total number of storage sites is your volume divided by the volume of your chosen molecule or element, divided by two.

But with molecular storage you could use different species and have, for example:

- no atom = 0
- gold = 1
- platinum = 2
- silver = 3

That gives four states per site (two bits) without much loss in size. Throw in some carbon-12 and carbon-13 and you're up to six states; find a couple more stable elements and you're up to eight states (three bits per site), and so on.

Of course, data retrieval would be terribly slow, but for long-term, small-size storage you're talking quadrillions of bits per cm³.
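The counting above can be sketched as follows. Note that $n$ distinguishable states per site give $\log_2 n$ bits per site, so four species store two bits per site, not four (the particular species -- an empty site, Au, Pt, Ag, C-12, C-13 -- are just illustrative):

```python
import math

def bits_per_site(states):
    # A site that can be in one of `states` distinguishable
    # configurations stores log2(states) bits.
    return math.log2(states)

for states in (2, 4, 6, 8):
    print(f"{states} states/site -> {bits_per_site(states):.2f} bits/site")
```

The gain from adding more species is only logarithmic: doubling the alphabet adds one bit per site.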

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user david mcgowan
answered Sep 7, 2012 by (20 points)
Is this more than the Bekenstein-style bound of $10^{66}$ bits per $1\,\text{cc}$ (provided in the answer two years ago), or not? If not, then what is the point?

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Val
+ 0 like - 0 dislike

I'm not a physicist, but I do know computer science, and keep up with the basics of physics, so let me give another answer for this:

We don't know yet. As long as there are smaller things that can be found, changed, and observed, we can use them to store information.

For example, if a new quantum property is found which can be in state A or state B, that's a new bit for every atom that carries it. If it's present in every one of a billion atoms of something, that's a billion more bits of data. If we then learn to manipulate that property into two additional states (say, right-way-out and inside-out), each such atom carries two bits instead of one, squaring the number of distinguishable configurations.

So, the problem is that we're still learning what matter and spacetime are made of. Until we come up with a provably correct, unified theory, we don't know how many independently variable things there are within any material. Given that every single additional binary property per site doubles the bit count (and squares the number of configurations), it's fairly useless to give "ballpark" figures until we know more. So it's probably better to state something like Moore's Law: a prediction that we'll double the storage every so often, until we run out of new discoveries and technologies.
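As a toy illustration of such a Moore's-law projection (the starting capacity of $10^{13}$ bits per cc and the two-year doubling period are arbitrary assumptions, not measurements), one can ask how long a steady doubling trend would take to reach the $\sim 10^{66}$-bit holographic bound discussed in the accepted answer:

```python
import math

# Assumed starting point: ~1 TB (1e13 bits) stored in 1 cc
start_bits = 1e13
# Holographic bound for 1 cc, from the covariant entropy bound
bound_bits = 1e66
# Assumed Moore's-law cadence: capacity doubles every two years
years_per_doubling = 2

doublings = math.log2(bound_bits / start_bits)
years = years_per_doubling * doublings
print(f"{doublings:.0f} doublings, ~{years:.0f} years")
```

Even under these generous assumptions the bound sits several centuries out, consistent with the "decades if not centuries" remark above.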

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Lee
answered Sep 30, 2013 by (0 points)
