• OBJECTION!@lemmy.ml · 16 days ago · +12/−3

    Entropy is a record of everything that’s ever happened in the universe. The units for it are joules per kelvin (J/K).
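    (A quick sketch of where those units come from, using Boltzmann’s formula rather than anything said in this thread: S = k_B ln Ω, where Ω counts the microstates.)

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by SI definition)

def boltzmann_entropy(num_microstates):
    """S = k_B * ln(Omega): entropy in joules per kelvin."""
    return K_B * math.log(num_microstates)

# 100 coins, each heads or tails: Omega = 2**100 microstates
s = boltzmann_entropy(2**100)
print(s)  # ~9.57e-22 J/K
```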

      • OBJECTION!@lemmy.ml · 15 days ago (edited) · +11

        That’s an interpretation from physics in the context of information theory. Leonard Susskind’s The Black Hole War explains some of these concepts, and I researched it further for a project while getting my B.S. in physics. It’s been a while, but I’ll do my best to break it down.

        Imagine that a murder has happened and the murderer wants to cover up the crime. The gun and body contain physical information that the cops could use to reconstruct the murder, so the killer throws them in the river.

        Why is the river useful to the killer? Because it’s a chaotic (entropic) system that contains a bunch of particles doing all sorts of things. The information contained in the gun isn’t actually lost or destroyed, it’s just made harder to access by mixing it with this “junk” information. Likewise, the blood of the body mixes with the water and is diluted to the point of being impossible to find, but it isn’t actually destroyed, it’s just mixed in.

        Suppose we could freeze time and examine that river down to the particle level. If we found a single particle of blood, we could look at its position and momentum, and that of every particle it interacted with, and we could trace it all the way back to the body (this might be easier to understand if instead of a river, we say the water is crystal clear and uniform). Obviously, this isn’t something that could be done feasibly, but theoretically, there’s no reason you couldn’t put Humpty Dumpty together again.
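        A toy sketch of that idea (my own illustration, with free frictionless particles, nothing from the book): deterministic dynamics can be run backwards, so knowing every position and momentum really is enough to rewind.

```python
# Toy illustration: deterministic dynamics can be run in reverse.
# Free particles ("blood specks in perfectly still water"), no forces.

def step(positions, velocities, dt):
    return [x + v * dt for x, v in zip(positions, velocities)], velocities

x0 = [0.0, 1.0, -2.5]   # where the particles started
v = [0.3, -1.1, 0.7]

# Let the system evolve ("the evidence disperses").
x = x0
for _ in range(1000):
    x, v = step(x, v, 0.01)

# Knowing every position and momentum, flip the velocities and
# evolve again: the original configuration comes back.
v = [-vi for vi in v]
for _ in range(1000):
    x, v = step(x, v, 0.01)

print(all(abs(a - b) < 1e-9 for a, b in zip(x, x0)))  # True
```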

        When you put two liquids in a flask and shake it, the information of precisely how you shook it is contained in the particles in that liquid. Every particle now has a story: before, it was sitting around with all its particle friends, but then the shake happened and everyone wound up in a slightly different location because of the precise way the shake affected them as opposed to their neighbors. The information about the shake is there in the particles in the flask.
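        As a toy model (mine, not Susskind’s), you can treat the shake as a deterministic scramble: every particle ends up somewhere new, but the rule itself is never lost, so it can be undone.

```python
import random

# The "shake" as a deterministic, invertible scramble of positions.
particles = list(range(10))

rng = random.Random(42)          # made-up seed standing in for the precise shake
shake = list(range(10))
rng.shuffle(shake)               # shake[i] = which old slot lands in slot i

shaken = [particles[shake[i]] for i in range(10)]

# Invert the permutation: the "story" of the shake, read backwards.
unshake = [0] * 10
for i, j in enumerate(shake):
    unshake[j] = i
restored = [shaken[unshake[i]] for i in range(10)]
print(restored == particles)  # True
```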

        To say that a system is more entropic is to say that more physical events have happened in that system. The particles become more dispersed because more things have happened to them, leaving behind physical impressions that get harder and harder to trace back as the number of things that have happened to the system increases, because there’s more information to sort out. This is where we can think of entropy as “a record of everything that’s ever happened.”
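        Here’s a toy illustration of that (my own, using a made-up lazy random walk): each “event” spreads things out a bit more, and the missing information, measured as Shannon entropy, only ever grows.

```python
import math

def shannon_entropy(p):
    # Missing information (in bits) about where the particle is.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def walk_step(p):
    # One "event": probability leaks to the neighbouring sites.
    q = [0.0] * len(p)
    for i, pi in enumerate(p):
        q[i] += pi / 2
        q[(i - 1) % len(p)] += pi / 4
        q[(i + 1) % len(p)] += pi / 4
    return q

p = [0.0] * 41
p[20] = 1.0  # we start knowing exactly where the particle is

entropies = []
for _ in range(21):
    entropies.append(shannon_entropy(p))
    p = walk_step(p)

print([round(e, 2) for e in entropies[::5]])  # steadily increasing
```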


        Still with me? Ready for extra credit?

        Can physical information ever actually be destroyed, erased from the universe entirely, as opposed to just being scrambled? That’s the question at the heart of one of the biggest unresolved paradoxes in the modern understanding of physics: the Black Hole Information Paradox.

        If you commit a murder on a spaceship and then fly that spaceship into a black hole, is there any way, even theoretically, to recover the information of what happened on that spaceship? Has the information merely been scrambled like the body in the river, or is it truly destroyed and erased from the universe entirely? If it is just scrambled, then where is that information contained?

        This is what Susskind’s book I mentioned before is all about. Stephen Hawking once maintained that the information was completely erased, while Susskind argued that this was incompatible with quantum mechanics and the second law of thermodynamics: erasing information would mean decreasing the entropy of a closed system. Hawking later changed his position and conceded that he’d been mistaken. Today, it’s believed that the information is scrambled, but nobody knows where the information is contained, and every proposed solution seems to contradict some fundamental aspect of our understanding of physics.
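        Related context, not from the thread: the standard way to put numbers on “erasing information costs entropy” is Landauer’s bound, which says erasing one bit must dump at least k_B T ln 2 of heat into the environment.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer's bound: erasing one bit costs at least k_B * ln(2) of
# entropy, paid as heat, so total entropy never goes down -- the
# tension with the second law that Susskind pointed to.
delta_s = K_B * math.log(2)   # entropy cost per erased bit, J/K
q_min = T * delta_s           # minimum heat released, J
print(delta_s, q_min)  # ~9.57e-24 J/K, ~2.87e-21 J
```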

        • cockmushroom@reddthat.com · 15 days ago · +1

          what nobody knows is where the information is contained

          I expect such musings to be an affront to the other fellow who commented to answer my question by saying that information is conserved microscopically and not macroscopically, but, if you’ll hear me out: has anybody looked into the possibility that the amount of information retained by the … carrier … about some specific event approaches zero as it undergoes more interactions and acquires new state/information pertaining to other events? That is, the system retains everything, but no single part holds more than an increasingly negligible trace. So while, theoretically, you could hunt down the participants in all of its interactions and deduce meaningful evidence about its history, the technical act of doing so is practically impossible because the problem of discernment grows with the age of every other carrier within some volume, making it intractable or, at the very least, prohibitively wasteful. Could this agree with what is usually meant by “entropy tending to disorder”?

          All that said, I often think this way of speaking of entropy is somewhat unhelpful in that there are many forms of entropy and not all should obey the second law. Some are constants, others vary with measurement, most are mutually unrelated, and some are in disguise. Take position entropy. One way to look at it is to see how many things are in the universe at different locations; if you count all that up you have a measurement (aka volume); maybe divide by the number of things for comparability’s sake and would you look at that? It’s density. Another chap chimes in saying something to the effect of “can’t fool me, position entropy’s just ħ/2Δp summed over all event participants”. Call me pedantic, but it’s not obvious that these measurements must agree; yet they’re both physically and thermodynamically significant. Honestly, I really don’t know, but when I look at the second law, I really wonder if that’s the whole picture. Does entropy really contain history, or is it just a byproduct of the generation of information?
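          (For what it’s worth, the ħ/2Δp being invoked is the Heisenberg bound Δx ≥ ħ/(2Δp); plugging in numbers, with a made-up momentum spread:)

```python
HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def min_position_spread(delta_p):
    """Heisenberg bound: delta_x >= hbar / (2 * delta_p)."""
    return HBAR / (2.0 * delta_p)

# e.g. momentum pinned down to 1e-24 kg*m/s (illustrative value)
print(min_position_spread(1e-24))  # ~5.3e-11 m, about an atomic radius
```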

          • OBJECTION!@lemmy.ml · 15 days ago · +1

            So while, theoretically, you could hunt down the participants in all of its interactions and deduce meaningful evidence about its history, the technical act of doing so is practically impossible because the problem of discernment grows with the age of every other carrier within some volume, making it intractable or, at the very least, prohibitively wasteful.

            When we’re talking about things like following blood particles back through a river, we have fully left behind the realm of practicality and wastefulness. Information can be scrambled well beyond the point of making recovery feasible, but we’re talking about whether it theoretically exists.

            One way to look at it is to ask where the “cut-off” point would be. While the blood is streaming away from the body, we can see exactly where it’s coming from. What makes us lose track of it are the limitations of our instruments.

            If you can find a trace of blood and reconstruct where it was even a second ago, then there’s no reason (apart from the practical ones) you couldn’t repeat that process and get the location a second before that, and so on.

            All that said, I often think this way of speaking of entropy is somewhat unhelpful in that there are many forms of entropy and not all should obey the second law.

            I don’t think there’s any way of getting around the second law, period.

            Take position entropy. One way to look at it is to see how many things are in the universe at different locations; if you count all that up you have a measurement (aka volume); maybe divide by the number of things for comparability’s sake and would you look at that? It’s density. Another chap chimes in saying something to the effect of “can’t fool me, position entropy’s just ħ/2Δp summed over all event participants”. Call me pedantic, but it’s not obvious that these measurements must agree

            Ngl, you lost me. Like I said, I’m rusty with this stuff.

      • CanadaPlus@futurology.today · 15 days ago · +2

        Wild guess, as not-OP: Information is conserved at the microscopic level, while it’s really not macroscopically, and entropy is (the logarithm of) the missing piece. Ergo, it contains a record of everything that’s happened that’s no longer directly visible.
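        A toy version of “entropy as the log of the missing piece” (my own numbers): count how many microstates are compatible with a single macrostate.

```python
import math

# How many microstates (exact head/tail patterns) are compatible with
# one macrostate (just the total number of heads)? The log of that
# count is the "missing" microscopic information.
N = 100
for heads in (0, 25, 50):
    omega = math.comb(N, heads)        # microstates in this macrostate
    missing_bits = math.log2(omega)    # bits needed to pin one down
    print(heads, round(missing_bits, 1))
```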

    • CanadaPlus@futurology.today · 15 days ago · +3

      Which honestly says more about what temperature is than about entropy or information. (It can be defined as energy per unit entropy.)
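      A toy check of that definition (my own made-up two-level system, illustrative numbers only): build S(E) from microstate counts and read off T = dE/dS.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
EPS = 1.0e-21        # energy of one excitation, J (made-up value)
N = 1000             # number of two-level particles

def S(n):
    # Entropy of the macrostate with n excited particles:
    # S = k_B * ln(number of ways to pick which n are excited).
    return K_B * math.log(math.comb(N, n))

# 1/T = dS/dE, so T = dE/dS (finite-difference estimate).
n = 100
T = EPS / (S(n + 1) - S(n))
print(T)  # tens of kelvin for these made-up numbers
```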