What irritates me the most about these conversations is the misuse of the word “random”. The uncertainty principle is not the “randomness” principle. It’s a simple proposition: the more accurately you know the position of a particle, the less accurately you can know its momentum (or vice versa). You will note that this is fundamentally a statistical statement (and indeed, in physics these spreads are measured as standard deviations). But like all statistics, the scatter washes out as the sample grows. In other words, while working with a single electron makes the uncertainty principle very pronounced, working with a hundred or a thousand of them means things start to even out.
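To make the “statistical statement” part concrete, here’s a minimal numerical sketch of my own (not part of the original argument): for a Gaussian wavepacket, the standard deviation of position times the standard deviation of momentum comes out to exactly ħ/2, the floor set by the uncertainty relation. The units (ħ = 1), grid, and packet width are arbitrary choices for illustration; any non-Gaussian packet you feed in would land strictly above ħ/2.

```python
import numpy as np

hbar = 1.0
sigma = 0.7   # chosen width of the packet (arbitrary)

# Position grid and a normalized Gaussian wavefunction centered at x = 0
x = np.linspace(-40, 40, 2**14)
dx = x[1] - x[0]
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))

# Spread in position, straight from |psi(x)|^2
prob_x = np.abs(psi) ** 2
sigma_x = np.sqrt(np.sum(x**2 * prob_x) * dx)    # mean is 0 by symmetry

# Spread in momentum, from the FFT of psi (p = hbar * k)
p = 2 * np.pi * hbar * np.fft.fftshift(np.fft.fftfreq(x.size, d=dx))
dp = p[1] - p[0]
prob_p = np.abs(np.fft.fftshift(np.fft.fft(psi))) ** 2
prob_p /= np.sum(prob_p) * dp                    # normalize the distribution
sigma_p = np.sqrt(np.sum(p**2 * prob_p) * dp)    # mean is 0 by symmetry

print(f"sigma_x * sigma_p = {sigma_x * sigma_p:.4f}, hbar/2 = {hbar / 2}")
```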
A very good example of this is radioactive decay. If I have a single atom of carbon-14 and I sit around waiting for it to decay, it might happen tomorrow or not for tens of thousands of years, and there’s absolutely no way for me to predict when that atom will decay into nitrogen-14. However, if I have a large quantity of carbon-14 (say, something with 10,000 carbon-14 atoms), all of a sudden the statistics become dependable, and we start observing that about half of the C-14 atoms convert to N-14 every 5,730 years or so (the half-life).
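Here’s a quick simulation of that, a sketch of my own rather than anything from the original point, using numpy’s exponential distribution as the lifetime model: one atom gives you a single unpredictable number, while ten thousand of them land almost exactly on the half-life.

```python
import numpy as np

rng = np.random.default_rng(0)
half_life = 5730.0                     # C-14 half-life in years
tau = half_life / np.log(2)            # mean lifetime of a single atom

# One atom: a single draw from the exponential distribution. Run it again
# with a different seed and you get a wildly different answer.
print("one atom decayed after", round(rng.exponential(tau)), "years")

# 10,000 atoms: count how many have decayed after one half-life.
lifetimes = rng.exponential(tau, size=10_000)
decayed = int(np.sum(lifetimes <= half_life))
print(decayed, "of 10,000 atoms decayed within", half_life, "years")  # ~5,000
```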
So, in other words, for very small sample sizes (like, say, a single quark or a couple of photons), the uncertainty principle is a very big deal. Get a whole lot of those particles together, and you find that things begin to fit into more predictable patterns.
Except, of course, at the beginning of the Universe, where the extreme densities and pressures meant even relatively minor quantum effects would have been writ large. But that’s a whole other conversation.