We evaluate the scarcity of resources intuitively: the rarer something is, the more valuable it is. But scarcity in economics is measured by supply and demand, so it is not a unit with a clear definition. We can obtain the probability of a resource being generated on the map, but that probability alone is not enough to evaluate an agent; it needs additional processing.
Shannon entropy is, by nature, a mathematical quantity well known for measuring the "uncertainty of a phenomenon":[1]
$$H(X) = E[I(X)] = E[-\ln P(X)]$$
where $I$ is the Shannon information (self-information), defined as:
$$I(X) = -\log_2 P(X)$$
Note the meaning of $-\log_2 p_i$: the lower the probability of an event, the greater the information carried when that event occurs. Shannon entropy is therefore the expected value of the Shannon information, as expressed above.
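The two definitions above can be sketched in a few lines of Python (the function names are my own):

```python
import math

def information(p: float) -> float:
    """Shannon information (self-information) of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(probs: list[float]) -> float:
    """Shannon entropy: the expected value of the information, in bits."""
    return sum(p * information(p) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy; a rarer event carries more information.
print(entropy([0.5, 0.5]))  # 1.0
print(information(0.25))    # 2.0
```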
To conveniently describe how items change through crafting or other in-game mechanisms, I borrow the form of a chemical equation:
$$aX + bY \xrightarrow{P} cZ$$
$a, b, c$: amounts of materials or products
$X, Y$: materials; items that vanish after the crafting process
$Z$: products
$P$: the path of crafting; item(s) that do not vanish after the crafting process
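This crafting-equation form can be represented as a small data structure; the class and field names below are my own illustration:

```python
from dataclasses import dataclass, field

@dataclass
class CraftingEquation:
    """aX + bY --P--> cZ: materials vanish after crafting, the path P does not."""
    materials: dict[str, int]                      # consumed, e.g. {"X": a, "Y": b}
    products: dict[str, int]                       # produced, e.g. {"Z": c}
    path: list[str] = field(default_factory=list)  # survives crafting, e.g. ["P"]

# Example recipe: 3 planks + 2 sticks --crafting table--> 1 wooden pickaxe
pickaxe = CraftingEquation(
    materials={"plank": 3, "stick": 2},
    products={"wooden_pickaxe": 1},
    path=["crafting_table"],
)
```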
The Minecraft world is, by nature, a bunch of digital data, so it is reasonable to treat it as a string of information and quantify it as Shannon entropy or information. For a block whose state is uncertain, the value is:
$$H(X) = E(I) = -\sum_i p_i \log_2 p_i$$
where $p_i$ is the probability that a block of type $i$ is generated in the map; the information of a block of a certain type is therefore $I_i = -\log_2 p_i$.
The entropy of natural resources can be calculated from their probability of being generated in the map, but most items in Minecraft are obtained by crafting (by "crafting" I mean any mechanism that can be expressed with the crafting formula above). The entropy of non-natural items therefore has to be obtained through calculation.
Let us work through a crafting example:
$$X + Y \xrightarrow{W} Z$$
$X$, $Y$, $W$ represent three different types of blocks and can be treated as three independent random events whose probabilities of existing in the map are $p_X$, $p_Y$, $p_W$. The probability of event $Z$ is the product of these probabilities, so the information of $Z$ is the sum of their informations: $p_Z = p_X\, p_Y\, p_W$, hence $I(Z) = I(X) + I(Y) + I(W)$.
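Under the independence assumption, the calculation looks like this (a sketch; the probabilities are made up for illustration):

```python
import math

def information(p: float) -> float:
    """Shannon information of an event with probability p, in bits."""
    return -math.log2(p)

# Hypothetical generation probabilities for X, Y and the crafting path W.
p_X, p_Y, p_W = 0.5, 0.25, 0.125

# P(Z) is the product of the independent probabilities...
p_Z = p_X * p_Y * p_W

# ...so I(Z) is the sum of the individual informations: 1 + 2 + 3 bits.
print(information(p_Z))  # 6.0
```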
We must define the "system" before quantifying the information of a system: we need to define a range of space and which blocks or items that space contains. Three commonly used definitions of a system are given below:
Complete System
Blocks, items in container blocks, dropped item entities, and items in the player's inventory, all contained in a limited space.
Block System
Blocks and items in container blocks, contained in a limited space.
Player System
Items in the player's inventory.
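Whichever definition is chosen, the system's entropy can be estimated from the empirical frequencies of the block and item types it contains; a minimal sketch, with invented contents:

```python
import math
from collections import Counter

def system_entropy(contents: list[str]) -> float:
    """Estimate the Shannon entropy of a system, in bits, from the
    empirical frequencies of the block/item types it contains."""
    counts = Counter(contents)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A hypothetical 'player system': the items in the player's inventory.
inventory = ["dirt"] * 32 + ["stone"] * 16 + ["iron_ore"] * 16
print(system_entropy(inventory))  # 1.5
```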
With the rules established above we can quantify the entropy of a system, and the intervening variables can be obtained by gathering statistics on the map. We can expect the entropy of the environment to change through the agent's behavior, so those changes can serve as one index for evaluating the agent.
The following information values are for reference only: not every item in the game is included, and the process is not fully rigorous. The goal is to demonstrate the theory of this article; anyone interested can complete the table themselves.
Cuberite 1.7.X-linux64[2] was used to generate maps, and a plugin[3] gathered the statistics. Sampling was done 10 times, deleting the map file each time to generate a new seed. To simplify the data, I merged yellow and red flowers, and removed flowing water, flowing lava, cobblestone, torches, fire, monster spawners, chests, and dead bushes, replacing them with air.
This is a summary of the values calculated or mentioned in this article. As stated above, this article does not calculate the information of every item in the game, so the table is unfinished:
Connecting to opencv server with bitmap… average duration: 0.0010088205337524414 s
Connecting to opencv server with png… average duration: 0.008921647071838379 s
That is the question many people working on artificial intelligence would ask. Let us look at it from the standpoint of natural science.
The arrow of time is the "one-way direction" or "asymmetry" of time. The thermodynamic arrow of time is provided by the second law of thermodynamics, which says that in an isolated system, entropy tends to increase with time. Entropy can be thought of as a measure of microscopic disorder; thus the second law implies that time is asymmetrical with respect to the amount of order in an isolated system: as a system advances through time, it becomes more statistically disordered. This asymmetry can be used empirically to distinguish between future and past, though measuring entropy does not accurately measure time.[1] There is therefore a consequent order between past and future, between "cause" and "effect".
An intelligent creature is just a logical system reacting to environmental information; the nature of intelligence is a mapping relationship in which the information from the environment is the cause and the reactions the creature makes are the effect. We can therefore express the intelligence of a creature as a Boolean function, albeit a huge and complicated one.
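This cause-to-effect mapping can be illustrated with a tiny Boolean function; the inputs and the reaction below are invented examples, not anything from the game:

```python
def reaction(sees_enemy: bool, low_health: bool, has_weapon: bool) -> bool:
    """A toy 'intelligence': environmental information in, one reaction out.
    The creature flees when it sees an enemy and cannot fight back."""
    return sees_enemy and (low_health or not has_weapon)

print(reaction(True, True, True))   # True: flee
print(reaction(True, False, True))  # False: stand and fight
```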
It is undeniable that artificial neural networks have the flexibility to create many kinds of logical mapping relationships, but an artificial neural network based on floating-point arithmetic loses a great deal of information, and the computer spends too much time and energy calculating information we do not need.
Of course, information is also lost in Boolean calculation, but that is a necessary part of extracting valuable information for agents. Unlike artificial neural networks, Boolean algebra has more interpretability: we can trace the relation between input and output.
Therefore, why not use both artificial neural networks and Boolean algebra? This is the formula of a neuron of an artificial neural network:
$$y = \sum_{i=1}^{n} x_i w_i + b$$
and here is the corresponding formula for a neuron of a BNN (Boolean neural network):
This is because a BNN has some properties that a logic array does not:
A logic array cannot contain recurrent structure (memory units). But the major problem is interpretability: while the hidden layers of an artificial neural network can be explained as extracting features and providing more specific concepts, a logic array cannot provide a tree structure to do the same.
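One common BNN formulation, given here only as an assumption since the article does not fix the exact formula, binarizes inputs and weights to ±1 and takes the sign of the weighted sum (the per-element multiply then behaves like an XNOR):

```python
def bnn_neuron(x: list[int], w: list[int], b: int = 0) -> int:
    """A binarized neuron: x_i, w_i in {-1, +1}; the output is the sign
    of the weighted sum, so the multiply is effectively an XNOR gate."""
    s = sum(xi * wi for xi, wi in zip(x, w)) + b
    return 1 if s >= 0 else -1

print(bnn_neuron([1, -1, 1], [1, 1, 1]))    # sign(1 - 1 + 1) = +1
print(bnn_neuron([1, -1, 1], [-1, 1, -1]))  # sign(-1 - 1 - 1) = -1
```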
Just as in biology, we observe subjects and build an inductive scientific system to study them.
We might reasonably expect some patterns in a BNN to occur repeatedly and be observed; those patterns are GLUs (generic logic units). Once we have created a knowledge base of GLUs, it will allow us to explain how a BNN "thinks".