|}

In the medium of these unassuming examples, we begin to see the activities of logical inference and methodical inquiry as ''information clarifying operations''.

First, we drew a distinction between information preserving and information reducing processes and we noted the related distinction between equational and implicational inferences. I will use the acronyms EROI and IROI, respectively, for the equational and implicational analogues of the various rules of inference.

For example, we considered the brands of ''information fusion'' that are involved in a couple of standard rules of inference, taken in both their equational and their illative variants.

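Since the rules in question are not restated at this point, take modus ponens purely as an illustrative stand-in; the choice of rule and the Python phrasing below are supplied here, not fixed by the text. Its implicational form passes from <math>p \land (p \Rightarrow q)</math> to <math>q,\!</math> while its equational analogue records <math>p \land (p \Rightarrow q) = p \land q.</math> A brute force check over the relevant universe shows the sense in which the first variant reduces information and the second preserves it.

<pre>
from itertools import product

def imp(a, b):
    """Boolean implication a => b."""
    return (not a) or b

B2 = list(product([False, True], repeat=2))      # the universe B x B of pairs (p, q)

# Equational analogue:  p & (p => q)  =  p & q  holds at every point,
# so the step is reversible and no information is lost.
equational_ok = all((p and imp(p, q)) == (p and q) for p, q in B2)

# Implicational form (modus ponens):  every model of the premiss
# p & (p => q) is a model of the conclusion q, but not conversely,
# so replacing the premiss by the conclusion enlarges the set of
# open possibilities and thereby surrenders information.
premiss_models    = {(p, q) for p, q in B2 if p and imp(p, q)}
conclusion_models = {(p, q) for p, q in B2 if q}

print(equational_ok)                          # True
print(premiss_models < conclusion_models)     # True: proper subset
</pre>

In the terms introduced above, the equational step behaves as an EROI and the implicational step as an IROI.
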
In particular, let us assume that we begin from a state of uncertainty about the universe of discourse <math>X = \mathbb{B}^3</math> that is standardly represented by a uniform distribution <math>u : X \to \mathbb{B}</math> such that <math>u(x) = 1\!</math> for all <math>x\!</math> in <math>X,\!</math> in short, by the constant proposition <math>1 : X \to \mathbb{B}.</math> This amounts to the ''maximum entropy sign state'' (MESS). As a measure of uncertainty, let us use either the multiplicative measure given by the cardinality of <math>X,\!</math> commonly notated as <math>|X|,\!</math> or else the additive measure given by <math>\log_2 |X|.\!</math> In this frame we have <math>|X| = 8\!</math> and <math>\log_2 |X| = 3,\!</math> to wit, 3 bits of doubt.

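To make the arithmetic concrete, here is a minimal sketch that reproduces these figures by direct counting; the helper name <code>doubt</code> and the representation of propositions as Python predicates on triples are devices of the sketch, not of the text.

<pre>
from itertools import product
from math import log2

X = list(product([False, True], repeat=3))    # the universe of discourse B^3

def doubt(f):
    """Additive measure of uncertainty: log2 of the number of cells
    of X that the proposition f : X -> B leaves open."""
    return log2(sum(1 for x in X if f(*x)))

one = lambda p, q, r: True                    # the constant proposition 1, the MESS

print(len(X))        # 8   : multiplicative measure |X|
print(doubt(one))    # 3.0 : additive measure log2 |X|, that is, 3 bits of doubt
</pre>
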
Let us now consider the various rules of inference for transitivity in the light of their performance as information-developing actions.

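By way of a preview, and only as a sketch under the assumption that transitivity is framed over three variables <math>p, q, r\!</math> ranging as above, the two variants can be weighed on the scale just introduced. The implicational rule passes from <math>(p \Rightarrow q) \land (q \Rightarrow r)</math> to <math>p \Rightarrow r</math> alone, while one natural equational analogue records <math>(p \Rightarrow q) \land (q \Rightarrow r) = (p \Rightarrow q) \land (q \Rightarrow r) \land (p \Rightarrow r).</math>

<pre>
from itertools import product
from math import log2

X   = list(product([False, True], repeat=3))   # (p, q, r) ranges over B^3
imp = lambda a, b: (not a) or b                # boolean implication

def doubt(f):
    """Bits of doubt left open by the proposition f : X -> B."""
    return log2(sum(1 for x in X if f(*x)))

premiss    = lambda p, q, r: imp(p, q) and imp(q, r)                 # (p=>q)(q=>r)
conclusion = lambda p, q, r: imp(p, r)                               # p=>r
equational = lambda p, q, r: imp(p, q) and imp(q, r) and imp(p, r)   # full record

print(doubt(premiss))      # 2.0     : the premiss narrows X from 8 cells to 4
print(doubt(conclusion))   # 2.58... : keeping only p=>r leaves 6 cells open
print(doubt(equational))   # 2.0     : identical to the premiss, nothing is lost
</pre>

On this accounting the implicational step, taken as a replacement of the premiss by its conclusion, gives back roughly 0.585 of the single bit that the premiss gains over the maximum entropy state, whereas the equational step preserves that bit in full.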