Logic of information

The '''logic of information''', or the ''logical theory of information'', considers the information content of logical [[semiotics|signs]] — everything from bits to books and beyond — along the lines initially developed by [[Charles Sanders Peirce]].  In this line of development the concept of information serves to integrate the aspects of logical signs that are separately covered by the concepts of [[denotation]] and [[connotation]], or, in roughly equivalent terms, by the concepts of [[extension]] and [[comprehension (logic)|comprehension]].
  
 
Peirce began to develop these ideas in his lectures "On the Logic of Science" at [[Harvard University]] (1865) and the [[Lowell Institute]] (1866).  Here is one of the starting points:
 
{| align="center" cellpadding="8" width="90%"
|
 
<p>Let us now return to the information.  The information of a term is the measure of its superfluous [[comprehension (logic)|comprehension]].  That is to say that the proper office of the comprehension is to determine the [[extension (semantics)|extension]] of the term.  For instance, you and I are men because we possess those attributes — having two legs, being rational, &tc. — which make up the comprehension of ''man''.  Every addition to the comprehension of a term lessens its extension up to a certain point, after that further additions increase the information instead.</p>
 
<p>Thus, let us commence with the term colour;  add to the comprehension of this term, that of red.  Red colour has considerably less extension than colour;  add to this the comprehension of dark;  dark red colour has still less [extension].  Add to this the comprehension of non-blue;  non-blue dark red colour has the same extension as dark red colour, so that the non-blue here performs a work of supererogation;  it tells us that no dark red colour is blue, but does none of the proper business of connotation, that of diminishing the extension at all.</p>

<p>Thus information measures the superfluous comprehension.  And, hence, whenever we make a symbol to express any thing or any attribute we cannot make it so empty that it shall have no superfluous comprehension.  I am going, next, to show that inference is symbolization and that the puzzle of the validity of scientific inference lies merely in this superfluous comprehension and is therefore entirely removed by a consideration of the laws of ''information''.  (C.S. Peirce, "The Logic of Science, or, Induction and Hypothesis" (1866), CE 1, 467).</p>
 
|}
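
The quoted passage turns on a small combinatorial point:  each attribute added to a term's comprehension either narrows the term's extension or, if it narrows nothing, contributes what Peirce calls information.  The following sketch is purely illustrative and not part of Peirce's text;  the universe of objects, the attribute names, and the helper function are made-up assumptions used only to restate the colour example in executable form.

<pre>
# Illustrative sketch only (not Peirce's own formalism): a term's comprehension is a
# set of attributes, and its extension is the set of objects in a made-up universe
# that possess every one of those attributes.

universe = {
    "o1": {"coloured", "red", "dark", "non-blue"},
    "o2": {"coloured", "red", "non-blue"},
    "o3": {"coloured", "blue", "dark"},
    "o4": {"coloured", "green", "non-blue"},
}

def extension(comprehension):
    """Objects whose attribute sets include every attribute in the comprehension."""
    return {name for name, attrs in universe.items() if comprehension <= attrs}

colour = {"coloured"}
red_colour = colour | {"red"}                     # adding "red" narrows the extension
dark_red_colour = red_colour | {"dark"}           # adding "dark" narrows it further
non_blue_dark_red_colour = dark_red_colour | {"non-blue"}

print(extension(colour))                    # all four objects
print(extension(red_colour))                # {'o1', 'o2'}
print(extension(dark_red_colour))           # {'o1'}
print(extension(non_blue_dark_red_colour))  # {'o1'} -- same extension: "non-blue" is
                                            # superfluous comprehension (information)
</pre>

In this toy model the last two calls return the same extension, which is the sense in which non-blue does "none of the proper business of connotation" and yet records a fact, namely that no dark red object in the universe is blue.
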
  
 
==References==
 
* De Tienne, André (2006), "Peirce's Logic of Information", Seminario del Grupo de Estudios Peirceanos, Universidad de Navarra, 28 Sep 2006.  Eprint.
* Peirce, C.S. (1867), "Upon Logical Comprehension and Extension", Eprint.

==See also==
{{col-begin}}
{{col-break}}
* [[Information theory]]
* [[Inquiry]]
* [[Pragmatic theory of information]]
* [[Pragmatic theory of truth]]
{{col-break}}
* [[Pragmaticism]]
* [[Pragmatism]]
* [[Semeiotic]]
* [[Semiosis]]
{{col-break}}
* [[Semiotics]]
* [[Semiotic information theory]]
* [[Sign relational complex]]
* [[Triadic relation]]
{{col-end}}
  
 
==External links==

Aficionados