Ok, here's what I figured out after you asked me the question:

Recall how (the spinor representation of) Lorentz transformations act on gamma matrices:

\[ S^{-1}(\Lambda)\gamma^{\mu}S(\Lambda)=\Lambda^\mu_{\ \ \nu}\gamma^\nu\cdots(1),\]

where according to Peskin and Schroeder,

\[S(\Lambda)=\begin{bmatrix} S_L(\Lambda) & 0\\0& S_R(\Lambda) \end{bmatrix}\cdots(2),\]

where $S_L$ and $S_R$ are the transformations acting on the left-handed spinor $\psi_L$ and the right-handed spinor $\psi_R$, respectively (see P&S's equation (3.37)). And

\[\gamma^\mu=\begin{bmatrix} 0 & \sigma^\mu\\ \bar{\sigma}^\mu& 0 \end{bmatrix}\cdots(3).\]

Plugging (2) and (3) into (1), the upper-right block immediately gives

\[S_L^{-1}\sigma^\mu S_R=\Lambda^\mu_{\ \ \nu}\sigma^\nu\cdots(4).\]
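If you want to see (4) in action, here is a quick numerical sanity check (a sketch, assuming P&S conventions $\sigma^\mu=(1,\vec\sigma)$, $\bar\sigma^\mu=(1,-\vec\sigma)$, and taking as an example a boost with rapidity $\eta$ along $z$, for which (3.37) gives $S_L=e^{-\eta\sigma^3/2}$, $S_R=e^{+\eta\sigma^3/2}$). Using $\mathrm{tr}(\bar\sigma^\rho\sigma^\nu)=2\eta^{\rho\nu}$ we can extract $\Lambda^\mu_{\ \ \rho}$ from the LHS of (4) and check that it really is a Lorentz transformation:

```python
import numpy as np

# Pauli matrices; sigma^mu = (1, +sigma_i), sigmabar^mu = (1, -sigma_i)
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
sigma = [np.eye(2, dtype=complex), s1, s2, s3]
sigmabar = [np.eye(2, dtype=complex), -s1, -s2, -s3]
eta_metric = np.diag([1.0, -1.0, -1.0, -1.0])

# Example boost along z with rapidity rap: S_L = exp(-rap*s3/2), S_R = exp(+rap*s3/2)
rap = 0.7
SL = np.diag([np.exp(-rap / 2), np.exp(rap / 2)]).astype(complex)
SR = np.diag([np.exp(rap / 2), np.exp(-rap / 2)]).astype(complex)

# Extract Lambda^mu_rho = (1/2) tr(sigmabar_rho S_L^{-1} sigma^mu S_R),
# using tr(sigmabar^rho sigma^nu) = 2 eta^{rho nu}
Lam = np.zeros((4, 4))
for mu in range(4):
    M = np.linalg.inv(SL) @ sigma[mu] @ SR
    for rho in range(4):
        Lam[mu, rho] = np.real(0.5 * np.trace(eta_metric[rho, rho] * sigmabar[rho] @ M))

print(np.allclose(Lam.T @ eta_metric @ Lam, eta_metric))  # Lambda preserves the metric
print(np.isclose(Lam[0, 0], np.cosh(rap)))                # and has the expected boost form
```

The extraction formula is just (4) contracted with $\bar\sigma_\rho$ and traced, so nothing here depends on guessing the sign conventions of $\Lambda$ in advance.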

Note that in (4) $S_L^{-1}$ acts on the row index while $S_R$ acts on the column index, and this is the meaning of

> ...that the indices $\alpha, \gamma$ transform in the Lorentz representation of $\psi_L$, while $\beta, \delta$ transform in the separate representation of $\psi_R$...

Clearly this implies that $\sigma^\mu\otimes\sigma_\mu$ is invariant when the transformation on the LHS of (4) is applied to each factor, since the contraction over $\mu$ collapses the two $\Lambda$'s: $\Lambda^\mu_{\ \ \nu}\Lambda_\mu^{\ \ \rho}=\delta_\nu^{\ \rho}$. In terms of matrix entries, if we define

\[I_{\alpha\gamma\beta\delta}:=(\sigma^\mu)_{\alpha\beta}(\sigma_\mu)_{\gamma\delta}\cdots(5),\]

then

\[(S^{-1}_L)_{\alpha'\alpha}(S^{-1}_L)_{\gamma'\gamma}I_{\alpha\gamma\beta\delta}(S_R)_{\beta\beta'}(S_R)_{\delta\delta'}=I_{\alpha'\gamma'\beta'\delta'}\cdots(6).\]
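Equation (6) is easy to verify numerically. In fact the invariance only uses $\det S_L = \det S_R = 1$, so we can even test it with two *independent* random $SL(2,\mathbb{C})$ matrices (a sketch in numpy; `random_sl2c` is just a helper I'm making up here):

```python
import numpy as np

def random_sl2c(rng):
    # Random 2x2 complex matrix rescaled to unit determinant
    A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    return A / np.sqrt(np.linalg.det(A))

s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
sigma_up = [I2, s1, s2, s3]     # sigma^mu = (1, +sigma_i)
sigma_dn = [I2, -s1, -s2, -s3]  # sigma_mu = (1, -sigma_i)

# I_{alpha gamma beta delta} = (sigma^mu)_{alpha beta} (sigma_mu)_{gamma delta}
Itensor = sum(np.einsum('ab,gd->agbd', su, sd) for su, sd in zip(sigma_up, sigma_dn))

rng = np.random.default_rng(0)
SL, SR = random_sl2c(rng), random_sl2c(rng)
SLinv = np.linalg.inv(SL)

# LHS of (6): (S_L^{-1})_{a'a} (S_L^{-1})_{g'g} I_{agbd} (S_R)_{bb'} (S_R)_{dd'}
lhs = np.einsum('xa,yg,agbd,bz,dw->xyzw', SLinv, SLinv, Itensor, SR, SR)
print(np.allclose(lhs, Itensor))
```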

We need to solve for $I_{\alpha\gamma\beta\delta}$. Now clearly $\epsilon_{\alpha \gamma} \epsilon_{\beta \delta}$ is a solution, because of the identity $\epsilon_{ij}A_{li}A_{kj}=\det(A)\epsilon_{lk}$, and our $S_L, S_R$ both have determinant 1. The proportionality constant 2 can be obtained by comparing appropriate entries on both sides of the equation $(\sigma^\mu)_{\alpha\beta}(\sigma_\mu)_{\gamma\delta}=\text{const}\times\epsilon_{\alpha \gamma} \epsilon_{\beta \delta}$: for instance, $\alpha=\beta=1,\ \gamma=\delta=2$ gives $(\sigma^0)_{11}(\sigma_0)_{22}+(\sigma^3)_{11}(\sigma_3)_{22}=2$ on the left and $\epsilon_{12}\epsilon_{12}=1$ on the right.
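Both claims — the $\epsilon$ identity for unit-determinant matrices and the constant 2 — can be confirmed numerically in a few lines (a sketch, with the conventions $\sigma_\mu=(1,-\vec\sigma)$ and $\epsilon_{12}=+1$; the sign convention for $\epsilon$ drops out since it appears twice):

```python
import numpy as np

s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
sigma_up = [I2, s1, s2, s3]               # sigma^mu = (1, +sigma_i)
sigma_dn = [I2, -s1, -s2, -s3]            # sigma_mu = (1, -sigma_i)
eps = np.array([[0.0, 1.0], [-1.0, 0.0]])  # epsilon_{12} = +1

# epsilon identity: eps_{ij} A_{li} A_{kj} = det(A) eps_{lk}, for any 2x2 A
rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
print(np.allclose(np.einsum('ij,li,kj->lk', eps, A, A), np.linalg.det(A) * eps))

# I_{alpha gamma beta delta} = (sigma^mu)_{alpha beta} (sigma_mu)_{gamma delta}
Itensor = sum(np.einsum('ab,gd->agbd', su, sd) for su, sd in zip(sigma_up, sigma_dn))
print(np.allclose(Itensor, 2 * np.einsum('ag,bd->agbd', eps, eps)))  # const = 2
```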

Now the only remaining gap is the uniqueness of the solution. To prove uniqueness it is convenient to rewrite (6) as a matrix equation:\[I(S_R\otimes S_R)=(S_L \otimes S_L)I \cdots(7),\]

where $I$ and the $S\otimes S$'s are $4\times 4$ matrices; in particular, the row index of $I$ is the pair $\alpha\gamma$ and its column index is the pair $\beta\delta$. We are going to apply Schur's lemma to (7).

Recall that in the standard representation-theoretic analysis of the Lorentz group, $S_L$ is in the $(\frac{1}{2}, 0)$ representation and $S_R$ is in the $(0,\frac{1}{2})$ representation, hence $S_L \otimes S_L\approx (1,0)\oplus (0,0)$ and $S_R \otimes S_R\approx (0,1)\oplus (0,0)$. Note that they share only the 1-dimensional representation $(0, 0)$. Equation (7) says $I$ intertwines these two representations, so Schur's lemma tells us that in a suitable basis $I$ is block diagonal with a $3\times 3$ block and a $1\times 1$ block; the $3\times 3$ block must vanish because $(1,0)$ and $(0,1)$ are inequivalent, while the $1\times 1$ block is of course unique up to scaling. Returning to the original basis, we conclude that the matrix $I$ must be unique up to scaling.
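The $(0,0)$ piece can be exhibited concretely: for any $S\in SL(2,\mathbb{C})$ we have $S\epsilon S^T=\det(S)\,\epsilon=\epsilon$, so $\mathrm{vec}(\epsilon)$ spans a one-dimensional invariant subspace of $S\otimes S$ — exactly the block that Schur's lemma picks out. A quick numerical illustration (sketch):

```python
import numpy as np

# Random SL(2,C) element: rescale a random complex matrix to unit determinant
rng = np.random.default_rng(1)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
S = A / np.sqrt(np.linalg.det(A))

eps = np.array([[0, 1], [-1, 0]], dtype=complex)
v = eps.reshape(4)  # vec(eps), with the index pair (alpha, gamma) flattened
M = np.kron(S, S)   # (S x S)_{(alpha gamma)(beta delta)} = S_{alpha beta} S_{gamma delta}

print(np.allclose(M @ v, v))  # the singlet vector is fixed by S x S
```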

Q.E.D.

A small caveat: the choice of basis (to achieve the block diagonalization) is only unique up to an arbitrary linear combination within each invariant subspace. This freedom amounts to multiplying your original similarity transformation by a 3+1 block-diagonal matrix, and you can easily show it does not affect the uniqueness.