What is NOT Random?



What will happen tomorrow is not random. In other words, it's at least somewhat predictable. Not entirely, to be sure, but some things will happen for certain, and other things definitely won't. For example, the sun will rise, water will still freeze at zero degrees Celsius...
- And you won't become Michael Stevens.
- We know this because everything in the universe is made of 12 fundamental particles, and they interact in four predictable ways.
- What if I were able to determine the positions and velocities of every single one of these particles in the universe?
- Well, you would be the intelligence envisioned by Laplace, who thought that if you could really figure out where everything is and how fast it's moving, you would know the entire future of the universe, because you know how every particle interacts with every other particle.
- Wow, so nothing would be unpredictable, which means nothing would be random.
- Not even human behavior. Since we are made of the same fundamental particles as everything else in the universe, everything we will ever do, or have ever done, would be determined by the information in the state of the universe at any one time.
- But what is information?

Well, it seems to be fundamentally about order. The order of molecules in your DNA contains the information needed to make you. It is the order of zeros and ones streaming through the internet that contains all the information required to play this video. It is the order of letters that makes a word, and the order of words that makes a sentence, that carries information. So fundamentally, information seems to be about order. Regularity. That is, until you really think about it. Does every letter of a word carry the same amount of information? No. After a "Q", you know almost for certain that the next letter will be a "U". After a "Th", there will probably be an "E". So these letters carry very little information, because you could predict them beforehand. They are redundant. In fact, the founder of information theory, Claude Shannon, estimated the redundancy of English at about 75%, which is why we can make sense of things like this. So English can be compressed, because it is not random. It has patterns.

Similarly, this video is compressible because of its regularities. In each frame, pixels of similar color cluster together. Plus, from frame to frame, most of the pixels don't change, so you only need to record the ones that do.
- You can take advantage of this technique to create some trippy effects known as datamoshing: applying the movement data from one video to the pixels of another.
- It also means that an average video can be compressed to just one thousandth of its original size.
- But what is the most you can compress something?
- Well, anything that is not random, any patterns or regularities, can be reduced, because they are predictable. So you can keep shrinking a file down until what you're left with is totally random.
- And that will contain all of the information of the original item, but distilled. Pure information.
- So pure information is randomness. If you want to know how much information something contains, you need to know how random it is. And randomness is disorder... what we also call...
- [Both] Entropy.
- So information, fundamentally, is entropy.

This makes sense if you consider a string of binary digits. For example, this string is perfectly ordered: it has very low entropy, and it contains no information. That's the state of an erased hard drive. Now, this string contains slightly more information, but again, the regularities allow it to be easily compressed. So the string that contains the maximum amount of information is just... a random set of zeros and ones. It has maximum entropy because it's totally disordered. You could not predict any of those digits by looking at any of the other digits. And if you wanted to send this information to someone, you would have no other option but to send the whole string of digits. There's no way to compress it.
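To make the compression idea concrete, here is a minimal Python sketch (my own illustration, not something from the video): it measures the empirical Shannon entropy of three byte strings and how much zlib can shrink them. The exact ratios depend on the compressor, but the pattern is the point: ordered data compresses enormously, while random data cannot be compressed at all.

```python
# A rough illustration (not from the video): Shannon entropy per symbol and
# zlib-compressed size for an all-zeros string, a simple repeating pattern,
# and uniformly random bytes. Ordered data has low entropy and shrinks a lot;
# random data has close to 8 bits/byte of entropy and does not shrink.
import math
import os
import zlib
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Empirical Shannon entropy H = -sum(p * log2 p) over byte frequencies."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

samples = {
    "all zeros":    bytes(10_000),        # an 'erased hard drive'
    "repeating 01": b"01" * 5_000,        # ordered, easily predicted
    "random bytes": os.urandom(10_000),   # maximum-entropy noise
}

for name, data in samples.items():
    compressed = zlib.compress(data, level=9)
    print(f"{name:>12}: entropy ~ {entropy_bits_per_byte(data):4.2f} bits/byte, "
          f"compressed to {len(compressed) / len(data):6.1%} of original size")
```

On a typical run you should see the ordered strings shrink to a tiny fraction of their size, while the random bytes come out no smaller than they went in (zlib even adds a few bytes of overhead).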
But here's the thing about any object that contains maximum information: for us as human beings, it carries no meaning. For example, a video containing maximum information would look like this. It is just white noise. The color of each pixel is independent of all the other pixels, and they all change randomly. This video could not be compressed, because it's already totally random. A random sequence of DNA would not make an organism, and a random string of letters does not generally make a word. We are drawn to things that are neither perfectly ordered, containing no information, nor perfectly disordered, containing maximum information. Somewhere in the middle, we can recognize complex patterns, and that is where we derive meaning: in music, poetry, and ideas.

It is this search for meaning that leads us to propose scientific theories, which, if you think about it, are really our way of compressing the universe. For example, general relativity, our current theory of gravity, compresses into one short equation everything from how an apple falls to the Earth, to how the Moon orbits the Earth, how all the planets orbit the Sun, how the Sun orbits a supermassive black hole at the center of our galaxy, how black holes form and behave, and how the whole universe expands out from the Big Bang. Now that we have this theory, the future is more predictable. We can predict eclipses thousands of years into the future.

So, with all of our scientific theories, does that mean that the universe is completely not random? That it is perfectly predictable? Well, let's assume for a second that Laplace was right, and that knowing the state of the universe at any one time means you also know its state at every other time as well. That would mean that the information in our universe is constant. But if information is entropy, that would mean the entropy of the universe is also constant. And that does not appear to be the universe we live in. The second law of thermodynamics states that the entropy of the universe increases with time. Or, in other words, things don't un-stir themselves. But if entropy is going up, that means the information in our universe is constantly increasing. That makes sense, because it would take more information to specify the state of the universe now than right after the Big Bang.

So where is this new information coming from? My best bet is quantum mechanics. Quantum mechanics describes how the 12 fundamental particles behave, and as spectacularly successful as it is, it is only a probabilistic theory, meaning that you cannot predict with absolute certainty where an electron, say, will be at some later time. You can only calculate the probabilities of where you are likely to find it. So when you do interact with it and locate the electron at a particular point, you have gained information. You now know something that you couldn't have predicted with certainty beforehand.
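As a toy illustration of that last point, and very much my own sketch rather than the video's example, here is an idealized two-outcome quantum measurement: the theory only supplies probabilities, so each outcome is information you could not have known in advance.

```python
# A toy sketch (my assumption for illustration, not the video's own example):
# quantum mechanics only gives probabilities, so each measurement outcome
# carries information you could not have predicted. For an equal superposition
# the outcome distribution has exactly 1 bit of Shannon entropy per measurement.
import math
import random

def measure(p_zero: float) -> int:
    """Simulate one projective measurement: Born rule gives P(0) = |alpha|^2."""
    return 0 if random.random() < p_zero else 1

def outcome_entropy(p_zero: float) -> float:
    """Shannon entropy (in bits) of a single measurement's outcome."""
    probs = [p for p in (p_zero, 1.0 - p_zero) if p > 0]
    return -sum(p * math.log2(p) for p in probs)

alpha = 1 / math.sqrt(2)        # equal superposition (|0> + |1>) / sqrt(2)
p0 = alpha ** 2                 # Born rule: probability of measuring 0

outcomes = [measure(p0) for _ in range(10)]
print("measured bits:", outcomes)   # e.g. [0, 1, 1, 0, ...], different every run
print(f"information gained per measurement: {outcome_entropy(p0):.2f} bit")
```

Pushing p0 toward 0 or 1 makes the outcomes more predictable, and the entropy, the information gained per measurement, drops toward zero.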
This drove Einstein crazy. He said, "God does not play dice," referring to this. He wished that we could compress our theory of quantum mechanics further, so that we could really figure out where these particles were going to be. But maybe the reason we haven't been able to compress quantum mechanics further is that, fundamentally, it's random. Fundamentally, new information is being generated every time a quantum event like that occurs. In that case, it could be these quantum measurements which are driving up the entropy of the universe. They are creating new information all the time, and that means the disorder in our universe must go up. This is what we observe as the second law of thermodynamics.

You know, we often think about the second law as a curse, as though everything which is ordered is heading towards disorder. But maybe it's only in a universe where this law is obeyed that the truly unexpected can occur, that the future can be genuinely undetermined. For us to really have free will, we need the second law of thermodynamics.

Now, you might think that these quantum events are too small to have any meaningful impact on the evolution of the universe, but that is not true. That's because there are physical systems which are so dependent on, so sensitive to, their initial conditions that any tiny change will end up making a big difference later down the track. That's called "chaos," but it's also known as "the butterfly effect." So you and I could be such physical systems. Chaotic systems. And our free will could come from quantum events in our brains. So it looks as though we live in a universe where the future is yet to be determined. That is to say, it is at least somewhat random.

- But Derek, what is the most random thing possible in the universe?
- That's a good question, Michael.
- You know, it's such a good question, I'm talking about it over on Vsauce. Do you wanna go find out about randomness with me?
- Let's go check it out.
- Alright.
- And you can decide whether to click over or not.
- Oh, that's nice.
- Yeah!
- Yeah.
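As a footnote to the chaos point above, here is a minimal sketch (my own example, using the logistic map, which the video does not mention): two trajectories that start a billionth apart track each other briefly and then diverge completely.

```python
# A minimal sketch of the 'butterfly effect' mentioned above (my example, not
# the video's): the logistic map x -> r*x*(1-x) in its chaotic regime (r = 4).
# Two trajectories that start a billionth apart agree at first, then diverge
# completely, so long-term prediction would require impossibly precise
# knowledge of the initial conditions.
def logistic_trajectory(x0: float, r: float = 4.0, steps: int = 50) -> list[float]:
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000000)   # one starting point
b = logistic_trajectory(0.200000001)   # a nearby one, differing by 1e-9

for step in (0, 10, 20, 30, 40, 50):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}  "
          f"(difference {abs(a[step] - b[step]):.6f})")
```

By roughly step 30 the two trajectories bear no resemblance to each other, even though their starting points agree to eight decimal places.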
