# Sub- and super-multiplicativity of norms for understanding non-locality

+ 8 like - 0 dislike
75 views

In relation to various problems in understanding entanglement and non-locality, I have come across the following mathematical problem. It is by far most concise to state it in purely mathematical form without going into the background much. However, I hope people interested in entanglement theory will see how the problem is interesting and useful.

Here goes. I have two finite-dimensional vector spaces $A$ and $B$, each equipped with a norm (so they are Banach spaces), $||...||: A \rightarrow \mathbb{R}$ and $||...||: B \rightarrow \mathbb{R}$. Both the vector spaces and the norms are isomorphic to each other. My question concerns norms on the tensor product of these spaces (for simplicity, let's say just the algebraic tensor product) $A \otimes B$ and the dual norms. First let me state something I know to be true.

Lemma 1:
If a norm $||...||$ on $A \otimes B$ satisfies
$||a \otimes b || \leq ||a|| \cdot ||b||$ (sub-multiplicativity),
then the dual norm satisfies
$||a \otimes b ||_{D} \geq ||a||_{D} \cdot ||b||_{D}$ (super-multiplicativity),

where we define the dual of a norm in the usual way as
$|| a ||_{D}= \sup \{ |b^{\dagger}a| \;;\; ||b|| \leq 1 \}$

This lemma crops up often, for example in Horn and Johnson's *Matrix Analysis*, where it is used to prove the duality theorem (that in finite dimensions the bidual equals the original norm, $||...||_{DD}=||...||$).
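A quick numerical illustration of the lemma (a sketch, not a proof; the choice of norms is mine, not from the post): take the 2-norm on the factors $A=B=\mathbb{R}^3$ and the $\ell_\infty$ norm on $A\otimes B$, which is sub-multiplicative against the factor 2-norms; its dual, the $\ell_1$ norm, then comes out super-multiplicative against the (self-dual) 2-norm, exactly as the lemma predicts.

```python
import numpy as np

def lemma1_holds(a, b, tol=1e-12):
    """Check both inequalities of Lemma 1 for one pair (a, b).

    Primal: ||a⊗b||_inf <= ||a||_2 ||b||_2  (sub-multiplicative),
    Dual:   ||a⊗b||_1   >= ||a||_2 ||b||_2  (super-multiplicative),
    using that the dual of l-infinity is l1 and the 2-norm is self-dual.
    """
    ab = np.kron(a, b)                       # tensor product as a Kronecker product
    factor = np.linalg.norm(a) * np.linalg.norm(b)
    sub = np.linalg.norm(ab, np.inf) <= factor + tol
    sup = np.linalg.norm(ab, 1) >= factor - tol
    return sub and sup

rng = np.random.default_rng(0)
assert all(lemma1_holds(rng.standard_normal(3), rng.standard_normal(3))
           for _ in range(1000))
```

This is of course only a spot-check of one concrete norm pair, not a substitute for the general proof.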

I wish to know the status of the converse, which I conjecture will be answered in the affirmative:

Conjecture:
If a norm $||...||$ on $A \otimes B$ satisfies:
$||a \otimes b || \geq ||a|| . ||b||$ (super-multiplicativity)
then the dual norm satisfies
$||a \otimes b ||_{D} \leq ||a||_{D} . ||b||_{D}$ (sub-multiplicativity)

My question is simply: "Is my conjecture true, or does anyone have a counterexample?"

Although I am inclined to think the conjecture is true, it is certainly not as easy to prove as the lemma above (which has a 3-4 line proof). The asymmetry enters through the definition of the dual norm, which allows us to "guess" a separable answer at the cost of underestimating the size of the norm, but we cannot so easily overestimate it!
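For completeness, here is a sketch of that short argument, using only the definition of the dual norm above (in finite dimensions the suprema are attained):

```latex
\text{Pick } x, y \text{ with } ||x|| \le 1,\ |x^{\dagger}a| = ||a||_{D}
\text{ and } ||y|| \le 1,\ |y^{\dagger}b| = ||b||_{D}.
\text{Sub-multiplicativity gives } ||x \otimes y|| \le ||x||\,||y|| \le 1,
\text{ so } x \otimes y \text{ is admissible in the supremum defining } ||a \otimes b||_{D}:
||a \otimes b||_{D} \;\ge\; |(x \otimes y)^{\dagger}(a \otimes b)|
 \;=\; |x^{\dagger}a|\,|y^{\dagger}b| \;=\; ||a||_{D}\,||b||_{D}.
```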

This post has been migrated from (A51.SE)

+ 4 like - 0 dislike

The converse is obviously not true. The asymmetry between the super-multiplicativity and sub-multiplicativity arises because the dual norm is always defined as a supremum and never as an infimum.

To see a counterexample, choose a direction in $A\otimes B$, for example a direction of vectors of the form $a\otimes b$, and in a very small "ray" vicinity of this direction, define the norm on the tensor product space as $$||v|| = 1000\, ||a||\cdot ||b||$$ Super-multiplicativity will still obviously hold because we have increased the norm somewhere on the tensor product space while keeping it constant on the rest of it.

However, the dual norm skyrockets under this tiny change because it is a supremum over all $c$ with $||c|| \leq 1$, which includes $c\approx a\otimes b$, where the norm was amplified. Correspondingly, the dual norm of certain dual vectors has essentially been increased to 1,000 times what it was before and is no longer sub-multiplicative.

Warning: the argument above is wrong. I misinterpreted $|b^\dagger a|$ as something that depends on the original norm, but it doesn't. The reversed implication is likely to be right, at least for some "convex" norms for which switching between the norm and the dual norm is fully reversible. Please post more complete answers if you can construct them.

OK, I think the basic argument may still be easily fixed. Take a natural norm and redefine $$||v|| = 0.001\, ||a||\cdot ||b||$$ just for $v$ of the form $C\cdot M(a\otimes b)$, where $a,b$ are generic vectors, $M$ is a transformation close to the identity that cannot be factorized into a tensor product of transformations on the two spaces, and $C\in{\mathbb R}$. This reduction of the norm doesn't spoil super-multiplicativity because that condition only constrains product vectors, and this is not one. However, on the dual space, $||a\otimes b||_{D}$ will fail to be sub-multiplicative, because it is affected even by "nearby" vectors in the original space, and we have allowed some vectors that are very long (according to the original norm) to influence the supremum.
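The deflated-ray construction can be sketched numerically (all numbers here are hypothetical toy values; note that the function $f$ below is NOT a genuine norm, since the triangle inequality fails on the deflated ray, which is exactly the loophole the follow-up comments discuss):

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([1.0, 0.0])
# w is "nearly" the product a⊗b but is itself not a product vector
w = np.kron(a, b) + 0.1 * np.array([0.0, 1.0, 1.0, 0.0])
w /= np.linalg.norm(w)

def f(v):
    """2-norm, deflated by a factor 1000 on the single ray spanned by w."""
    if np.isclose(abs(v @ w), np.linalg.norm(v)):  # v lies on the ray through w
        return 0.001 * np.linalg.norm(v)
    return np.linalg.norm(v)

# One admissible candidate in the supremum defining the dual "norm" of u = a⊗b:
u = np.kron(a, b)
candidate = abs(u @ w) / f(w)
# With Euclidean duals ||a||_D * ||b||_D = 1, yet this single candidate
# already pushes the supremum far above 1, breaking sub-multiplicativity.
print(candidate > 1.0)
```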

So this won't hold for sufficiently unusual norms. Some kind of convexity guaranteeing that the dualization procedure squares to one could be enough to ensure that your reverse statement is valid.

answered Oct 3, 2011 by (10,258 points)
Thanks for your answer. Though I'm not sure I understand why you say this causes the dual norm to skyrocket upwards; I would have thought it causes the norm of certain dual vectors to shrink by a factor of 1,000. If we set $||a||=||b||=1$ then $||a \otimes b ||=1000$ is not less than 1 and so does not fall into the unit ball over which the supremum is evaluated. More simply, if we equivalently formulate the dual norm as a sup of $|u^{\dagger}v| / ||v||$ over $v$, then it looks like any ad hoc increase of $||v||$ is only going to decrease the dual norm.

To expand on the above. Assume that the base norms are the 2-norm, and that on $A \otimes B$ we have a norm s.t. $||v|| \geq ||v||_{2}$. When $v=a \otimes b$, using $||a \otimes b ||_{2}=||a||_{2}||b||_{2}$, super-multiplicativity follows. Sub-multiplicativity of the dual then follows, as $||v||_{D}= \sup_{u} \{ |u^{\dagger}v| / ||u|| \} \leq \sup_{u} \{ |u^{\dagger}v| / ||u||_{2} \} = ||v||_{2}$. Putting $v=a \otimes b$ and using multiplicativity of the 2-norm, we get sub-multiplicativity.

To make your example more explicit: say $u=\sum_{j,k}c_{j,k}\, a_{j}\otimes b_{k}$ is a decomposition in a specific basis, and define $||u||= \sqrt{ \sum_{j,k} d_{j,k} c_{j,k}^{2} }$ where $d_{j,k}=1000$ when $j=k=1$ and $d_{j,k}=1$ otherwise. Then $||u|| \geq ||u||_{2}$, and so the dual must really be sub-multiplicative!
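This weighted 2-norm example is easy to spot-check numerically (a sketch; the closed form used for the dual of a weighted 2-norm simply inverts the weights):

```python
import numpy as np

# Weights d_{j,k} on R^3 ⊗ R^3, flattened to length 9; d_{1,1} = 1000, else 1.
d = np.ones(9)
d[0] = 1000.0

def norm_w(u):
    """Weighted 2-norm; since every weight is >= 1, norm_w(u) >= ||u||_2."""
    return np.sqrt(np.sum(d * u**2))

def dual_norm_w(u):
    """Dual of a weighted 2-norm: the weights get inverted."""
    return np.sqrt(np.sum(u**2 / d))

rng = np.random.default_rng(1)
for _ in range(1000):
    a, b = rng.standard_normal(3), rng.standard_normal(3)
    ab = np.kron(a, b)
    factor = np.linalg.norm(a) * np.linalg.norm(b)
    assert norm_w(ab) >= factor - 1e-9        # primal: super-multiplicative
    assert dual_norm_w(ab) <= factor + 1e-9   # dual:   sub-multiplicative
```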

There may be a mistake in my argument, thanks for pointing it out. Will look at it again.

Dear @Earl, I think I have fixed the error in my argument and the conclusion is unchanged. Reduce the original norm to 1/1000 of its value on a ray of vectors that are "nearly" tensor-factorizable. This doesn't spoil super-multiplicativity because only strict tensor products are constrained. However, the dual norm will be affected by this change, even the dual norm of factorizable vectors, and it will jump by a factor of roughly 1,000, spoiling sub-multiplicativity. Agreed? Some convexity or triangle inequality for the norm could be enough to rule out variable norms of my type and revive your theorem.

Ah, I see. I think your correction works now. Let me work through an even more concrete example. Consider $u$ on the interval $u_{\lambda}=\lambda\, a_{0}\otimes b_{0}+(1-\lambda)\, a_{1}\otimes b_{1}$, and define a norm such that $||u_{\lambda}||= (2 \lambda-1)^{2} + \epsilon$, where $\epsilon$ is small but nonzero (e.g. 1/1000). This $||u||$ is super-multiplicative on tensor products and convex. Then $|| a_{0}\otimes b_{0} ||_{D} \geq |u_{\lambda=1/2}^{\dagger}(a_{0}\otimes b_{0})|/||u_{\lambda=1/2}|| = 1/(2 \epsilon)$, which can be made arbitrarily large.
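In numbers, the segment example works out as follows (a sketch assuming orthonormal product vectors, so the pairing at $\lambda=1/2$ is exactly $1/2$):

```python
import numpy as np

eps = 1e-3                                   # the small but nonzero epsilon
a0 = np.array([1.0, 0.0]); b0 = np.array([1.0, 0.0])
a1 = np.array([0.0, 1.0]); b1 = np.array([0.0, 1.0])

def u(lam):
    """Point on the segment between the two product vectors."""
    return lam * np.kron(a0, b0) + (1 - lam) * np.kron(a1, b1)

def norm_seg(lam):
    """The norm prescribed on the segment: (2*lam - 1)^2 + eps."""
    return (2 * lam - 1) ** 2 + eps

# Lower bound on the dual norm of a0⊗b0 from the single point lam = 1/2:
# pairing = 1/2, norm = eps, so the bound is 1/(2*eps), dwarfing
# ||a0||_D * ||b0||_D = 1 and breaking sub-multiplicativity.
lower = abs(u(0.5) @ np.kron(a0, b0)) / norm_seg(0.5)
print(lower)   # 500.0 for eps = 1e-3
```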

Finally, one more comment. I think the conditions under which the converse does hold are precisely when there exists a cross norm $\eta(u)$ (e.g. the smallest cross norm) such that $||u|| \geq \eta (u)$. Then one can follow the argument I used a few comments up for the more specific case of the 2-norm. However, your counterexamples are so deeply convex that they achieve lower values than any cross norm can.
