Hilbert's Sixth Problem is not the same as finding the theory of everything and then making the maths rigorous. This is a very common misconception, and it has led people to think that making renormalisation in QFT rigorous was the main thing to do.
But in fact Hilbert stated explicitly that it would be just as important to axiomatise false physical theories. I interpret this as: well, QM is false since it is not generally covariant, and GR is false since it is not quantum, but it is still important to see whether or not they can be axiomatised.
Archimedes, Newton, Maxwell, and Hertz were all true physicists who published axiomatic treatments of a branch of Physics. Ironically, although Hertz and Maxwell are most famous for their contributions to Electricity and Magnetism, they published axiomatisations of Mechanics alone.
Another misconception is that Kolmogoroff solved the part of Hilbert's problem related to probabilities. This misconception was not shared by Kolmogoroff! He well knew that axiomatising the purely mathematical theory of probabilities was merely a useful preliminary: what Hilbert really wanted was to axiomatise the concept of physical probability. Within physics, is 'probability' a new, primitive concept to be added to Hertz's list, along with mass and time, or can it be precisely defined in terms of mass, time, etc.?
Unless grand unification or renormalisation throws up new axiomatic difficulties, the only two things left to do to solve Hilbert's Sixth Problem are: a) the problem which Wigner pointed out, about the concept of measurement in QM (Bell analysed the problem the same way Wigner did: http://www.chicuadro.es/BellAgainstMeasurement.pdf), and b) the definition of physical probability, i.e., the concept of probability which occurs in QM. Hilbert himself was worried about causality in GR, but solved that problem himself. Hilbert also pointed to the lack of clarity in the relation between Mechanics and Stat Mech, but Darwin and Fowler solved that in the 1920s.
Many physicists have pointed to the possibility of fixing the 'measurement' problem Wigner was worried about, notably H.S. Green in "Observation in Quantum Mechanics," Nuovo Cimento vol. 9 (1958) no. 5, pp. 880-889 (posted by me at http://www.chicuadro.es/Green1958.ps.zip), and, with more realistic models, Allahverdyan, Balian, and Nieuwenhuizen (arXiv:1003.0453): they have analysed the physical behaviour of a measurement apparatus and shown that the measurement axioms of QM follow, approximately, from the wave equation. They do this in a logically circular and sloppy way, but the logic can be fixed.
Physical probability can be defined in QM, and its definition there is parallel to its definition in Classical Mechanics: each involves the use of a new kind of thermodynamic limit (in the quantum case http://arxiv.org/abs/quant-ph/0507017, one in which not only does the number of degrees of freedom of the measurement apparatus increase without bound, but Planck's constant goes to zero).
So the people who did the most important work are: Hilbert, Wiener, Weyl, Schroedinger, Darwin, Fowler, Kolmogoroff, Wigner, Khintchine, H.S. Green, Bell, Prof. Jan von Plato, and myself. (Schroedinger could be included twice: he and Debye helped Weyl formulate the first axiomatisation of QM. Later, he influenced H.S. Green in his treatment of measurement as a phase transition.)
More specifically, as to your particular concerns:
Goedelisation

Speaking historically, Goedel's incompleteness theorem has had no influence on those working on this problem. Whether this was myopia or higher wisdom will now be addressed.
Since Physics is about the real world, there are no real worries about its consistency. What is not clear is whether it needs to contain Peano arithmetic. Sets are not physically real, so numbers are not either. It is not even clear whether Physics needs the second-order parts of Logic that produce incompleteness. The usual axioms of QM contain a typical Hamiltonian dynamics, so all physical questions of the form «If the system begins in state $\psi_0$ at time $t_0$, what will be its state at time $t$?» are answerable in closed form, and computable to any degree of approximation desired; the system is physically complete, so to speak. Note that all these questions are essentially first-order questions.
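To illustrate the sense in which such questions are computable: in a finite-dimensional toy model, the answer to «what is the state at time $t$?» is $\psi(t) = e^{-iHt}\psi_0$, which can be computed to any desired accuracy by diagonalising the Hamiltonian. A minimal sketch (the Hamiltonian and function name here are my own illustration, in units with $\hbar = 1$):

```python
import numpy as np

# Toy two-level system: a Hermitian Hamiltonian H and an initial state psi0.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])                  # a typical Hermitian Hamiltonian
psi0 = np.array([1.0, 0.0], dtype=complex)

def evolve(H, psi0, t):
    """Return exp(-i H t) psi0 via the spectral decomposition of H."""
    evals, evecs = np.linalg.eigh(H)
    return evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))

psi_t = evolve(H, psi0, 2.0)
# Unitarity: the norm of the state is preserved for all t.
print(abs(np.linalg.norm(psi_t) - 1.0) < 1e-12)
```

The point is only that the dynamical question is answered by a terminating, first-order computation, with error bounds controlled by the numerical linear algebra.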
As someone else pointed out, relative consistency is just as interesting as consistency, and there are no real worries about that, either.
Hilbert himself explicitly pointed to his own axiomatisation of Euclidean Geometry as an example for Physics. Those axioms do not allow one to define sets or to construct all the real numbers.
Some have tried to argue that since one can physically build a computer (or even a Turing machine), then the axioms of Physics must imply everything that the theory of computation implies, including its own incompleteness. But this is obviously false: it is physically impossible to build a noiseless digital computer. The Boolean world can only be approximately realised by physically constructed machines. But the proofs of incompleteness are invalidated once you introduce the notion of approximation. No one has even formulated a theory of physically realisable devices that would be parallel to the idealised theory of computation which mathematicians invented.
And conversely: others have tried to argue the other way, that since Physics (certainly QM) is approximately computable, it must therefore be incomplete. To me this seems merely confused. Not every computable theory satisfies the hypotheses of Goedel's incompleteness theorem: First-order Logic is consistent and complete (theorems of Goedel and Herbrand), and its theorems can be computably enumerated.
Undecidable problems in Maths are not physical
An example of an undecidable problem in Maths: given any finite set of generators and relations, decide whether the group they determine is non-trivial or not.
Well, Physics doesn't use generators and relations.
Involving Hilbert Spaces: I do not know whether it is undecidable, but it is certainly a wild problem to classify, up to unitary equivalence, pairs of operators on a given Hilbert space. But in QM, because of relativity, not every subspace of a Hilbert space is physical. Let $G$ be the Lorentz group and $K$ be a maximal compact subgroup of $G$. The only physically significant Hilbert spaces are those with a dense subspace of $K$-finite vectors. So the only physically significant subspaces $V$ of a given Hilbert space are those whose intersection with the $K$-finite vectors is dense in $V$. So no operator whose image does not satisfy this property can be «physical». This tames the problem considerably, essentially reducing it to algebra instead of analysis.
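The contrast between the analytic problem and its algebraic shadow can be seen already in finite dimensions, where unitary equivalence of a single matrix is decidable by comparing finitely many trace invariants (Specht's criterion: two $n \times n$ matrices are unitarily equivalent iff the traces of all words in the matrix and its adjoint agree, and words up to a length depending only on $n$ suffice). A minimal sketch, with function names of my own invention:

```python
import itertools
import numpy as np

def word_traces(A, max_len):
    """Traces of all words of length <= max_len in A and its adjoint A*.
    By Specht's criterion, two n x n matrices are unitarily equivalent
    iff all such traces agree (a word length depending on n suffices)."""
    letters = (A, A.conj().T)
    traces = []
    for length in range(1, max_len + 1):
        for word in itertools.product(letters, repeat=length):
            prod = np.eye(A.shape[0], dtype=complex)
            for M in word:
                prod = prod @ M
            traces.append(np.trace(prod))
    return np.array(traces)

# A and U A U* are unitarily equivalent, so their invariants agree.
A = np.array([[0, 1], [2, 3]], dtype=complex)
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]], dtype=complex)
B = U @ A @ U.conj().T
print(np.allclose(word_traces(A, 4), word_traces(B, 4)))
```

No such finite list of invariants exists for the wild problem of pairs of operators on an infinite-dimensional space, which is what makes the restriction to $K$-finite-compatible subspaces a genuine taming.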
The Halting Problem: many people on this site have already tried to argue that since it involves infinite time behaviour, this is an unphysical problem. To me, this objection seems too easy and philosophical. The stronger objection is that there are no digital computers in the real world. No Turing machines. Because all we can make are noisy approximations to a digital computer or a Turing machine. There are no exactly self-reproducing units in Nature, only approximately reproducing units. (This makes a big difference as to the probabilities involved.) Now, since the theoretical conclusions about incompleteness etc. depend on the precise behaviour of these idealisations, there is no reason to think they hold good for actual noisy machines which halt on their own because they get tired...
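The point about noise can be made quantitative with a toy model (entirely my own illustration, not from any of the references): a machine whose program says «loop forever», but whose hardware suffers a fault with some small probability per step, halts almost surely, with a geometrically distributed running time whose mean is the reciprocal of the fault probability.

```python
import random

def noisy_loop(p_fail, rng, max_steps=10**7):
    """An intended infinite loop on hardware where each step has
    probability p_fail of a fault that halts the machine.
    Returns the number of steps actually executed."""
    steps = 0
    while steps < max_steps:          # the 'program' says: loop forever
        steps += 1
        if rng.random() < p_fail:     # hardware fault: the machine 'tires'
            return steps
    return max_steps

rng = random.Random(0)
runs = [noisy_loop(1e-3, rng) for _ in range(1000)]
# Mean halting time is approximately 1 / p_fail = 1000 steps.
print(sum(runs) / len(runs))
```

The idealised halting problem asks a question about the exact, noiseless version of this machine; the physically realisable version halts on its own, which is exactly why conclusions drawn from the idealisation need not transfer.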
What would it take to make a Goedel-style revolution in Physics?
Many physicists have already decided that it has taken place, but they are not the ones working on Hilbert's Sixth Problem. Wigner and Bell were capable of understanding Hilbert's axiomatic attitude, and Wigner's analysis of the problem with the axioms of QM is thoroughly in Hilbert's spirit. If the problem Wigner pointed to could not be solved, and if QM stays (in this respect) a fundamental part of Physics (as both Steven Weinberg and I are convinced it will, unlike J.S. Bell, who was convinced the problem was insoluble and that QM would therefore be reformed in such a way as to remove the difficulty), then Hilbert's Sixth Problem will have suffered the same blow Goedel dealt to his Second Problem. Many physicists have decided, by anticipation, that this is the case.
But there are at least two mainstream views on which Wigner's problem can be resolved by demoting the measurement axioms to approximations deducible from the other axioms. The decoherence theory is not yet the consensus of the Physics community, but it would save Hilbert's bacon. There are many posts on this forum about the decoherence theory. The line of reasoning I prefer, initiated by H.S. Green and made more realistic by Allahverdyan et al., referred to above, does the same (even though they were not concerned with Hilbert's concerns, and thus do not do things in a logically clear way: they make free use of all six axioms while analysing the physics of a measurement apparatus). Feynman was of the opinion that something like this could be done.
The differences between the decoherence approach and the phase-transition approach are physical and should, eventually, be susceptible to experimental tests, to rule out one or the other approach.
This post imported from StackExchange Physics at 2015-12-27 11:28 (UTC), posted by SE-user joseph f. johnson