Those meanings of entropy and that relationship cannot be applied literally in every physics setting, yet they are nonetheless useful, as we will see in several contexts. There is also a familiar everyday analogy: a 1 TB hard drive has the capacity to store 1 TB of data, yet when it is new or freshly erased it contains almost zero information.
Consider 4 gas molecules in a box. [I picked the small number 4 to avoid invoking thermodynamic definitions.] The number of possible arrangements is proportional to the number of molecules.
If the 4 molecules were squeezed into one corner, then we have better knowledge of their positions than if they are distributed throughout the box. The point is that entropy (and therefore knowledge) can vary with the state of the system, whereas the number of microstates (information) does not vary with the state.
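The corner-versus-spread-out distinction can be made concrete with a toy count of microstates. The sketch below assumes an (arbitrary, illustrative) box divided into 8 cells with 4 distinguishable molecules, and uses log2 of the microstate count as a stand-in for entropy in bits; the specific numbers are my choices, not from the original example.

```python
import math

# Toy model: a box divided into 8 cells containing 4 distinguishable
# gas molecules. A microstate assigns each molecule to one cell.
# (Cell and molecule counts are arbitrary choices for illustration.)
CELLS = 8
MOLECULES = 4

# Information: the total number of microstates, independent of state.
total_microstates = CELLS ** MOLECULES        # 8^4 = 4096

# Knowledge depends on the macrostate: if we know the molecules are
# crowded into a 2-cell corner, far fewer microstates are compatible.
corner_microstates = 2 ** MOLECULES           # 2^4 = 16

# Entropy (in bits) = log2 of the number of compatible microstates.
entropy_spread = math.log2(total_microstates)  # 12.0 bits
entropy_corner = math.log2(corner_microstates) # 4.0 bits

print(total_microstates, corner_microstates)   # 4096 16
print(entropy_spread, entropy_corner)          # 12.0 4.0
```

The total count (4096) never changes, while the entropy we assign depends on which macrostate we know the system is in, mirroring the distinction drawn above.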
That leads me to definitions that we can use in the rest of this article: knowledge and entropy are properties of the system and the state of the system (and possibly external factors as well). Information is a property of the system and independent of the state.
In my opinion, lack of attention to that distinction between information and knowledge is the origin of many misconceptions regarding information in physics.
Microstates, Macrostates, and Thermodynamics
A definition of information consistent with our simple examples is "the number of microstates." Susskind says[v] that conservation of information could also be described as conservation of distinction: distinct states never evolve into indistinct states.
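Conservation of distinction can be sketched with a toy state space. The example below is my own illustration, not Susskind's: a reversible dynamics is a permutation of microstates (one-to-one), so distinct states remain distinct, whereas a hypothetical many-to-one map merges states and destroys information.

```python
# Sketch of "conservation of distinction" on a 4-state toy system.
states = [0, 1, 2, 3]

def reversible_step(s):
    # Example dynamics: a cyclic shift. It is invertible (a
    # permutation), so no two states map to the same successor.
    return (s + 1) % 4

def irreversible_step(s):
    # A many-to-one map: states 2 and 3 both land on 0.
    # Such dynamics would destroy distinctions (and information).
    return 0 if s >= 2 else s

after_reversible = {reversible_step(s) for s in states}
after_irreversible = {irreversible_step(s) for s in states}

print(len(after_reversible))    # 4 -- all distinctions preserved
print(len(after_irreversible))  # 2 -- distinctions (information) lost
```

Counting the distinct successor states shows why only invertible dynamics conserve information: the reversible step keeps all 4 states distinguishable, while the many-to-one step collapses them to 2.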