{{DISPLAYTITLE:Cactus Language}}

'''Author: [[User:Jon Awbrey|Jon Awbrey]]'''
<div class="nonumtoc">__TOC__</div>
      
==The Cactus Patch==
 
<p>Thus, what looks to us like a sphere of scientific knowledge more accurately should be represented as the inside of a highly irregular and spiky object, like a pincushion or porcupine, with very sharp extensions in certain directions, and virtually no knowledge in immediately adjacent areas.  If our intellectual gaze could shift slightly, it would alter each quill's direction, and suddenly our entire reality would change.</p>
 
|-
 
| align="right" | &mdash; Herbert J. Bernstein, &ldquo;Idols of Modern Science&rdquo;, [HJB, 38]
 
|}
 
In the usual way of proceeding on formal grounds, meaning is added by giving each grammatical sentence, or each syntactically distinguished string, an interpretation as a logically meaningful sentence, in effect, equipping or providing each abstractly well-formed sentence with a logical proposition for it to denote.  A semantic interpretation of the cactus language is carried out in Subsection 1.3.10.12.
 
===The Cactus Language : Syntax===
    
{| align="center" cellpadding="0" cellspacing="0" width="90%"
 
{| align="center" cellpadding="0" cellspacing="0" width="90%"
Line 239: Line 238:  
{| align="center" cellpadding="4" style="text-align:center" width="90%"
 
{| align="center" cellpadding="4" style="text-align:center" width="90%"
 
|-
 
|-
| <math>\varepsilon</math>
+
| <math>\varepsilon\!</math>
 
| =
 
| =
| <math>^{\backprime\backprime\prime\prime}</math>
+
| <math>{}^{\backprime\backprime\prime\prime}\!</math>
 
| =
 
| =
 
| align="left" | the empty string.
 
| align="left" | the empty string.
 
|-
 
|-
| <math>\underline\varepsilon</math>
+
| <math>\underline\varepsilon\!</math>
 
| =
 
| =
| <math>\{ \varepsilon \}</math>
+
| <math>\{ \varepsilon \}\!</math>
 
| =
 
| =
 
| align="left" | the language consisting of a single empty string.
 
| align="left" | the language consisting of a single empty string.
Line 310: Line 309:  
</ol>
 
The easiest way to define the language <math>\mathfrak{C}(\mathfrak{P})\!</math> is to indicate the general sorts of operations that suffice to construct the greater share of its sentences from the specified few of its sentences that require a special election.  In accord with this manner of proceeding, I introduce a family of operations on strings of <math>\mathfrak{A}^*\!</math> that are called ''syntactic connectives''.  If the strings on which they operate are exclusively sentences of <math>\mathfrak{C}(\mathfrak{P}),\!</math> then these operations are tantamount to ''sentential connectives'', and if the syntactic sentences, considered as abstract strings of meaningless signs, are given a semantics in which they denote propositions, considered as indicator functions over some universe, then these operations amount to ''propositional connectives''.
    
Rather than presenting the most concise description of these languages right from the beginning, it serves comprehension to develop a picture of their forms in gradual stages, starting from the most natural ways of viewing their elements, if somewhat at a distance, and working through the most easily grasped impressions of their structures, if not always the sharpest acquaintances with their details.
 
<p>The ''concatenation'' of one string <math>s_1\!</math> is just the string <math>s_1.\!</math></p>
 
<p>The ''concatenation'' of two strings <math>s_1, s_2\!</math> is the string <math>{s_1 \cdot s_2}.\!</math></p>

<p>The ''concatenation'' of the <math>k\!</math> strings <math>(s_j)_{j = 1}^k\!</math> is the string of the form <math>{s_1 \cdot \ldots \cdot s_k}.\!</math></p></li>
    
<li>
 
| 6. || <math>s\!</math> is the ''surcatenation'' of the <math>k\!</math> strings <math>s_1, \ldots, s_k\!</math> in <math>\mathfrak{L},</math>
 
|-
| &nbsp; || if and only if <math>s_j\!</math> is a sentence of <math>\mathfrak{L},</math> for all <math>{j = 1 \ldots k},\!</math> and
|-
| &nbsp; || <math>s \ = \ \operatorname{Surc}_{j=1}^k s_j \ = \ ^{\backprime\backprime} \, \operatorname{(} \, ^{\prime\prime} \, \cdot \, s_1 \, \cdot \, ^{\backprime\backprime} \, \operatorname{,} \, ^{\prime\prime} \, \cdot \, \ldots \, \cdot \, ^{\backprime\backprime} \, \operatorname{,} \, ^{\prime\prime} \, \cdot \, s_k \, \cdot \, ^{\backprime\backprime} \, \operatorname{)} \, ^{\prime\prime}.</math>
|}
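
The concatenation and surcatenation operations set out above translate directly into code.  The following sketch is only an illustration: the choice of Python and the particular function names are conveniences of the sketch, not notations used in the text.

<syntaxhighlight lang="python">
# A sketch of the two syntactic connectives as operations on strings.
# Concatenation joins the given strings end to end; surcatenation joins
# them with commas and encloses the whole in a pair of parentheses.

def conc(*strings):
    """Concatenation of any number of strings: s_1 . s_2 . ... . s_k."""
    return "".join(strings)

def surc(*strings):
    """Surcatenation: an open paren, the strings separated by commas, a close paren."""
    return "(" + ",".join(strings) + ")"

print(conc("a", "b", "c"))   # abc
print(surc("a", "b", "c"))   # (a,b,c)
print(surc())                # ()  -- the case of an empty list of strings
</syntaxhighlight>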
 
As usual, saying that <math>s\!</math> is a sentence is just a conventional way of stating that the string <math>s\!</math> belongs to the relevant formal language <math>\mathfrak{L}.</math>  An individual sentence of <math>\mathfrak{C} (\mathfrak{P}),\!</math> for any palette <math>\mathfrak{P},</math> is referred to as a ''painted and rooted cactus expression'' (PARCE) on the palette <math>\mathfrak{P},</math> or a ''cactus expression'', for short.  Anticipating the forms that the parse graphs of these PARCE's will take, to be described in the next Subsection, the language <math>\mathfrak{L} = \mathfrak{C} (\mathfrak{P})</math> is also described as the set <math>\operatorname{PARCE} (\mathfrak{P})</math> of PARCE's on the palette <math>\mathfrak{P},</math> or, more generically, as the PARCE's that constitute the language <math>\operatorname{PARCE}.</math>

A ''bare'' PARCE, a bit loosely referred to as a ''bare cactus expression'', is a PARCE on the empty palette <math>\mathfrak{P} = \varnothing.</math>  A bare PARCE is a sentence in the ''bare cactus language'', <math>\mathfrak{C}^0 = \mathfrak{C} (\varnothing) = \operatorname{PARCE}^0 = \operatorname{PARCE} (\varnothing).</math>  This set of strings, regarded as a formal language in its own right, is a sublanguage of every cactus language <math>\mathfrak{C} (\mathfrak{P}).</math>  A bare cactus expression is commonly encountered in practice when one has occasion to start with an arbitrary PARCE and then finds a reason to delete or to erase all of its paints.
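
Since the passage from an arbitrary PARCE to a bare PARCE is described as deleting or erasing all of its paints, a one-line sketch may help to fix the idea.  The concrete choice of characters for the paints is an assumption of this sketch, not something fixed by the text.

<syntaxhighlight lang="python">
# Erasing the paints of a cactus expression leaves a bare cactus expression.
# Assumption: the paints p_j are written as lower case letters.

PAINTS = set("abcdefghijklmnopqrstuvwxyz")

def erase_paints(parce):
    """Delete every paint, leaving a sentence of the bare cactus language."""
    return "".join(c for c in parce if c not in PAINTS)

print(erase_paints("(a,(b,c),d)"))   # prints:  (,(,),)
</syntaxhighlight>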
    
Only one thing remains to cast this description of the cactus language into a form that is commonly found acceptable.  As presently formulated, the principle PC&nbsp;4 appears to be attempting to define an infinite number of new concepts all in a single step, at least, it appears to invoke the indefinitely long sequences of operators, <math>\operatorname{Conc}^k</math> and <math>\operatorname{Surc}^k,</math> for all <math>k > 0.\!</math>  As a general rule, one prefers to have an effectively finite description of
 
# To specify the intension or to signify the intention that every string that fits the conditions of the abstract type <math>T\!</math> must also fall under the grammatical heading of a sentence, as indicated by the type <math>S,\!</math> all within the target language <math>\mathfrak{L}.</math>
 
In these types of situation, the letter <math>^{\backprime\backprime} S \, ^{\prime\prime},</math> which signifies the type of a sentence in the language of interest, is called the ''initial symbol'' or the ''sentence symbol'' of a candidate formal grammar for the language, while any number of letters like <math>^{\backprime\backprime} T \, ^{\prime\prime},</math> signifying other types of strings that are necessary to a reasonable account or a rational reconstruction of the sentences that belong to the language, are collectively referred to as ''intermediate symbols''.

Combining the singleton set <math>\{ ^{\backprime\backprime} S \, ^{\prime\prime} \}</math> whose sole member is the initial symbol with the set <math>\mathfrak{Q}</math> that assembles together all of the intermediate symbols results in the set <math>\{ ^{\backprime\backprime} S \, ^{\prime\prime} \} \cup \mathfrak{Q}</math> of ''non-terminal symbols''.  Completing the package, the alphabet <math>\mathfrak{A}</math> of the language is also known as the set of ''terminal symbols''.  In this discussion, I will adopt the convention that <math>\mathfrak{Q}</math> is the set of ''intermediate symbols'', but I will often use <math>q\!</math> as a typical variable that ranges over all of the non-terminal symbols, <math>q \in \{ ^{\backprime\backprime} S \, ^{\prime\prime} \} \cup \mathfrak{Q}.</math>  Finally, it is convenient to refer to all of the symbols in <math>\{ ^{\backprime\backprime} S \, ^{\prime\prime} \} \cup \mathfrak{Q} \cup \mathfrak{A}</math> as the ''augmented alphabet'' of the prospective grammar for the language, and accordingly to describe the strings in <math>( \{ ^{\backprime\backprime} S \, ^{\prime\prime} \} \cup \mathfrak{Q} \cup \mathfrak{A} )^*</math> as the ''augmented strings'', in effect, expressing the forms that are superimposed on a language by one of its conceivable grammars.  In certain settings it becomes desirable to separate the augmented strings that contain the symbol <math>^{\backprime\backprime} S \, ^{\prime\prime}</math> from all other sorts of augmented strings.  In these situations the strings in the disjoint union <math>\{ ^{\backprime\backprime} S \, ^{\prime\prime} \} \cup (\mathfrak{Q} \cup \mathfrak{A} )^*</math> are known as the ''sentential forms'' of the associated grammar.
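
To keep the various alphabets straight, the following fragment lays them out for a small hypothetical case.  The concrete characters chosen for the blank, the marks, and the paints are assumptions of this sketch, suggested by the grammar displays below rather than stipulated at this point in the text.

<syntaxhighlight lang="python">
# Terminal, non-terminal, and augmented alphabets for a small example.
# Assumptions of this sketch: the blank m_1 is a space, the remaining
# marks are the two parentheses and the comma, and the palette P holds
# the three paints a, b, c.

marks         = {" ", "(", ",", ")"}
paints        = {"a", "b", "c"}
terminals     = marks | paints                  # the alphabet A

intermediates = {"T"}                           # Q, as in Grammar 2 below
non_terminals = {"S"} | intermediates           # {"S"} union Q
augmented     = non_terminals | terminals       # {"S"} union Q union A

print(sorted(augmented))
</syntaxhighlight>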
    
In forming a grammar for a language, statements of the form <math>W :> W',\!</math>
Employing the notion of a covering relation, it becomes possible to redescribe the cactus language <math>\mathfrak{L} = \mathfrak{C} (\mathfrak{P})</math> in the following ways.

====Grammar 1====

Grammar&nbsp;1 is something of a misnomer.  It is nowhere near exemplifying any kind of a standard form and it is only intended as a starting point for the initiation of more respectable grammars.  Such as it is, it uses the terminal alphabet <math>\mathfrak{A} = \mathfrak{M} \cup \mathfrak{P}</math> that comes with the territory of the cactus language <math>\mathfrak{C} (\mathfrak{P}),\!</math> it specifies <math>\mathfrak{Q} = \varnothing,</math> in other words, it employs no intermediate symbols, and it embodies the ''covering set'' <math>\mathfrak{K}</math> as listed in the following display.
    
<br>
 
<math>\mathfrak{C} (\mathfrak{P}) : \text{Grammar 1}\!</math>
 
| align="right" style="border-right:1px solid black;" width="50%" |
<math>\mathfrak{Q} = \varnothing</math>
 
|-
 
| colspan="2" style="border-top:1px solid black; border-bottom:1px solid black; border-left:1px solid black; border-right:1px solid black" |
In responding to these issues, it is advisable at first to proceed in a stepwise fashion, all the better to accommodate the chances of pursuing a series of parallel developments in the grammar, to allow for the possibility of reversing many steps in its development, indeed, to take into account the near certain necessity of having to revisit, to revise, and to reverse many decisions about how to proceed toward an optimal description or a satisfactory grammar for the language.  Doing all this means exploring the effects of various alterations and innovations as independently from each other as possible.
 
The degree of intermediate organization in a grammar is measured by how many intermediate symbols it has and by how they interact with each other by means of its productions.  With respect to this issue, Grammar&nbsp;1 has no intermediate symbols at all, <math>\mathfrak{Q} = \varnothing,</math> and therefore remains at an ostensibly trivial degree of intermediate organization.  Some additions to the list of intermediate symbols are practically obligatory in order to arrive at any reasonable grammar at all, other inclusions appear to have a more optional character, though obviously useful from the standpoints of clarity and ease of comprehension.
    
One of the troubles that is perceived to affect Grammar&nbsp;1 is that it wastes so much of the available potential for efficient description in recounting over and over again the simple fact that the empty string is present in the language.  This arises in part from the statement that <math>S :> S^*,\!</math> which implies that:
 
|}
 
There is nothing wrong with the more expansive pan of the covered equation, since it follows straightforwardly from the definition of the kleene star operation, but the covering statement to the effect that <math>S :> S^*\!</math> is not a very productive piece of information, in the sense of telling very much about the language that falls under the type of a sentence <math>S.\!</math>  In particular, since it implies that <math>S :> \underline\varepsilon,</math> and since <math>\underline\varepsilon \cdot \mathfrak{L} \, = \, \mathfrak{L} \cdot \underline\varepsilon \, = \, \mathfrak{L},</math> for any formal language <math>\mathfrak{L},</math> the empty string <math>\varepsilon\!</math> is counted over and over in every term of the union, and every non-empty sentence under <math>S\!</math> appears again and again in every term of the union that follows the initial appearance of <math>S.\!</math>  As a result, this style of characterization has to be classified as ''true but not very informative''.  If at all possible, one prefers to partition the language of interest into a disjoint union of subsets, thereby accounting for each sentence under its proper term, and one whose place under the sum serves as a useful parameter of its character or its complexity.  In general, this form of description is not always possible to achieve, but it is usually worth the trouble to actualize it whenever it is.
    
Suppose that one tries to deal with this problem by eliminating each use of the kleene star operation, by reducing it to a purely finitary set of steps, or by finding an alternative way to cover the sublanguage that it is used to generate.  This amounts, in effect, to ''recognizing a type'', a complex process that involves the following steps:
 
In sum, one introduces a non-terminal symbol for each type of sentence and each ''part of speech'' or sentential component that is generated by means of iteration or recursion under the ruling constraints of the grammar.  In order to do this one needs to analyze the iteration of each grammatical operation in a way that is analogous to a mathematically inductive definition, but further in a way that is not forced explicitly to recognize a distinct and separate type of expression merely to account for and to recount every increment in the parameter of iteration.
 
Returning to the case of the cactus language, the process of recognizing an iterative type or a recursive type can be illustrated in the following way.  The operative phrases in the simplest sort of recursive definition are its ''initial part'' and its ''generic part''.  For the cactus language <math>\mathfrak{C} (\mathfrak{P}),\!</math> one has the following definitions of concatenation as iterated precatenation and of surcatenation as iterated subcatenation, respectively:
    
{| align="center" cellpadding="8" width="90%"
 
{| align="center" cellpadding="8" width="90%"
Line 711: Line 710:  
# A grammar rule that invokes a notion of decatenation, deletion, erasure, or any other sort of retrograde production, is frequently considered to be lacking in elegance, and there is a style of critique for grammars that holds it preferable to avoid these types of operations if it is at all possible to do so.  Accordingly, contingent on the prescriptions of the informal rule in question, and pursuing the stylistic dictates that are writ in the realm of its aesthetic regime, it becomes necessary for us to backtrack a little bit, to temporarily withdraw the suggestion of employing these elliptical types of operations, but without, of course, eliding the record of doing so.

====Grammar 2====
    
One way to analyze the surcatenation of any number of sentences is to introduce an auxiliary type of string, not in general a sentence, but a proper component of any sentence that is formed by surcatenation.  Doing this brings one to the following definition:
 
<math>\mathfrak{C} (\mathfrak{P}) : \text{Grammar 2}\!</math>
 
| align="right" style="border-right:1px solid black;" width="50%" |
<math>\mathfrak{Q} = \{ \, ^{\backprime\backprime} T \, ^{\prime\prime} \, \}</math>
 
|-
 
| colspan="2" style="border-top:1px solid black; border-bottom:1px solid black; border-left:1px solid black; border-right:1px solid black" |
In this rendition, a string of type <math>T\!</math> is not in general a sentence itself but a proper ''part of speech'', that is, a strictly ''lesser'' component of a sentence in any suitable ordering of sentences and their components.  In order to see how the grammatical category <math>T\!</math> gets off the ground, that is, to detect its minimal strings and to discover how its ensuing generations get started from these, it is useful to observe that the covering rule <math>T :> S\!</math> means that <math>T\!</math> ''inherits'' all of the initial conditions of <math>S,\!</math> namely, <math>T \, :> \, \varepsilon, m_1, p_j.</math>  In accord with these simple beginnings it comes to parse that the rule <math>T \, :> \, T \, \cdot \, ^{\backprime\backprime} \operatorname{,} ^{\prime\prime} \, \cdot \, S,</math> with the substitutions <math>T = \varepsilon</math> and <math>S = \varepsilon</math> on the covered side of the rule, bears the germinal implication that <math>T \, :> \, ^{\backprime\backprime} \operatorname{,} ^{\prime\prime}.</math>
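
A tiny derivation may make the last remark concrete.  Assuming, in line with Rule 5 of Grammar&nbsp;4 below, that Grammar&nbsp;2 covers a sentence with a string of the form <math>^{\backprime\backprime} \, \operatorname{(} \, ^{\prime\prime} \, \cdot \, T \, \cdot \, ^{\backprime\backprime} \, \operatorname{)} \, ^{\prime\prime},</math> the minimal surcatenation of two sentences arises by taking both <math>T\!</math> and <math>S\!</math> to be empty:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{array}{lll}
S
& :> &
^{\backprime\backprime} \, \operatorname{(} \, ^{\prime\prime} \, \cdot \, T \, \cdot \, ^{\backprime\backprime} \, \operatorname{)} \, ^{\prime\prime}
\\
& :> &
^{\backprime\backprime} \, \operatorname{(} \, ^{\prime\prime} \, \cdot \, T \, \cdot \, ^{\backprime\backprime} \operatorname{,} ^{\prime\prime} \, \cdot \, S \, \cdot \, ^{\backprime\backprime} \, \operatorname{)} \, ^{\prime\prime}
\\
& :> &
^{\backprime\backprime} \, \operatorname{(} \, ^{\prime\prime} \, \cdot \, \varepsilon \, \cdot \, ^{\backprime\backprime} \operatorname{,} ^{\prime\prime} \, \cdot \, \varepsilon \, \cdot \, ^{\backprime\backprime} \, \operatorname{)} \, ^{\prime\prime}
\ = \
^{\backprime\backprime} \, \operatorname{(,)} \, ^{\prime\prime}
\\
\end{array}</math>
|}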
 
Grammar&nbsp;2 achieves a portion of its success through a higher degree of intermediate organization.  Roughly speaking, the level of organization can be seen as reflected in the cardinality of the intermediate alphabet <math>\mathfrak{Q} = \{ \, ^{\backprime\backprime} T \, ^{\prime\prime} \, \}</math> but it is clearly not explained by this simple circumstance alone, since it is taken for granted that the intermediate symbols serve a purpose, a purpose that is easily recognizable but that may not be so easy to pin down and to specify exactly.  Nevertheless, it is worth the trouble of exploring this aspect of organization and this direction of development a little further.
====Grammar 3====
    
Although it is not strictly necessary to do so, it is possible to organize the materials of our developing grammar in a slightly better fashion by recognizing two recurrent types of strings that appear in the typical cactus expression. In doing this, one arrives at the following two definitions:
 
|}
 
When there is no possibility of confusion, the letter <math>^{\backprime\backprime} R \, ^{\prime\prime}</math> can be used either as a string variable that ranges over the set of runes or else as a type name for the class of runes.  The latter reading amounts to the enlistment of a fresh intermediate symbol, <math>^{\backprime\backprime} R \, ^{\prime\prime} \in \mathfrak{Q},</math> as a part of a new grammar for <math>\mathfrak{C} (\mathfrak{P}).</math>  In effect, <math>^{\backprime\backprime} R \, ^{\prime\prime}</math> affords a grammatical recognition for any rune that forms a part of a sentence in <math>\mathfrak{C} (\mathfrak{P}).</math>  In situations where these variant usages are likely to be confused, the types of strings can be indicated by means of expressions like <math>r <: R\!</math> and <math>W <: R.\!</math>

A ''foil'' is a string of the form <math>{}^{\backprime\backprime} \, \operatorname{(} \, ^{\prime\prime} \, \cdot \, T \, \cdot \, ^{\backprime\backprime} \, \operatorname{)} \, ^{\prime\prime},\!</math> where <math>T\!</math> is a tract.  Thus, a typical foil <math>F\!</math> has the form:
    
{| align="center" cellpadding="8" width="90%"
 
{| align="center" cellpadding="8" width="90%"
 
|
 
|
<math>\begin{array}{lllllllllllllll}
+
<math>\begin{array}{*{15}{l}}
 
F
 
F
 
& =
 
& =
Line 829: Line 828:  
|}
 
This is just the surcatenation of the sentences <math>S_1, \ldots, S_k.\!</math>  Given the possibility that this sequence of sentences is empty, and thus that the tract <math>T\!</math> is the empty string, the minimum foil <math>F\!</math> is the expression <math>^{\backprime\backprime} \, \operatorname{()} \, ^{\prime\prime}.</math>  Explicitly marking each foil <math>F\!</math> that is embodied in a cactus expression is tantamount to recognizing another intermediate symbol, <math>^{\backprime\backprime} F \, ^{\prime\prime} \in \mathfrak{Q},</math> further articulating the structures of sentences and expanding the grammar for the language <math>\mathfrak{C} (\mathfrak{P}).\!</math>  All of the same remarks about the versatile uses of the intermediate symbols, as string variables and as type names, apply again to the letter <math>^{\backprime\backprime} F \, ^{\prime\prime}.</math>
      
<br>
 
<math>\mathfrak{C} (\mathfrak{P}) : \text{Grammar 3}\!</math>
 
| align="right" style="border-right:1px solid black;" width="50%" |
<math>\mathfrak{Q} = \{ \, ^{\backprime\backprime} F \, ^{\prime\prime}, \, ^{\backprime\backprime} R \, ^{\prime\prime}, \, ^{\backprime\backprime} T \, ^{\prime\prime} \, \}</math>
 
|-
 
| colspan="2" style="border-top:1px solid black; border-bottom:1px solid black; border-left:1px solid black; border-right:1px solid black" |
& T \, \cdot \, ^{\backprime\backprime} \operatorname{,} ^{\prime\prime} \, \cdot \, S
\\
\end{array}\!</math>
 
|}
 
    
<br>
 
In Grammar&nbsp;3, the first three Rules say that a sentence (a string of type <math>S\!</math>), is a rune (a string of type <math>R\!</math>), a foil (a string of type <math>F\!</math>), or an arbitrary concatenation of strings of these two types.  Rules&nbsp;4 through 7 specify that a rune <math>R\!</math> is an empty string <math>\varepsilon,</math> a blank symbol <math>m_1,\!</math> a paint <math>p_j,\!</math> or any concatenation of strings of these three types.  Rule&nbsp;8 characterizes a foil <math>F\!</math> as a string of the form <math>{}^{\backprime\backprime} \, \operatorname{(} \, ^{\prime\prime} \, \cdot \, T \, \cdot \, ^{\backprime\backprime} \, \operatorname{)} \, ^{\prime\prime},\!</math> where <math>T\!</math> is a tract.  The last two Rules say that a tract <math>T\!</math> is either a sentence <math>S\!</math> or else the concatenation of a tract, a comma, and a sentence, in that order.
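
The rules of Grammar&nbsp;3 lend themselves to a small recursive descent recognizer.  The sketch below is merely illustrative: rendering the blank symbol <math>m_1\!</math> as a space character and the paints <math>p_j\!</math> as lower case letters is an assumption of the sketch, not a stipulation of the text.

<syntaxhighlight lang="python">
# A rough transcription of Grammar 3 into a recursive descent recognizer.
# Assumptions: the blank symbol m_1 is a space and the paints p_j are
# the lower case letters a-z.

BLANK  = " "
PAINTS = set("abcdefghijklmnopqrstuvwxyz")

def parse_sentence(s, i=0):
    """Consume a maximal string of type S starting at position i.
    Rules 1-3: a sentence is a rune, a foil, or a concatenation of such."""
    while i < len(s):
        if s[i] == BLANK or s[i] in PAINTS:    # Rules 4-7: part of a rune
            i += 1
        elif s[i] == "(":                      # Rule 8: a foil "(" T ")"
            i = parse_tract(s, i + 1)
            if i >= len(s) or s[i] != ")":
                raise SyntaxError("unclosed foil")
            i += 1
        else:                                  # "," or ")" ends this sentence
            break
    return i    # the empty string is a sentence, so i may be unchanged

def parse_tract(s, i):
    """Rules 9-10: a tract is a sentence or a tract "," sentence."""
    i = parse_sentence(s, i)
    while i < len(s) and s[i] == ",":
        i = parse_sentence(s, i + 1)
    return i

def is_parce(s):
    """True if s is a painted and rooted cactus expression."""
    try:
        return parse_sentence(s) == len(s)
    except SyntaxError:
        return False

print(is_parce(""))          # True:  the empty sentence
print(is_parce("(a,(b),)"))  # True:  a foil surcatenating three sentences
print(is_parce("a,b"))       # False: a comma outside of any foil
print(is_parce("(a"))        # False: unbalanced parenthesis
</syntaxhighlight>

Nothing in the sketch depends on the particular palette; any other choice of paint characters would serve as well.
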
At this point in the succession of grammars for <math>\mathfrak{C} (\mathfrak{P}),\!</math> the explicit uses of indefinite iterations, like the kleene star operator, are now completely reduced to finite forms of concatenation, but the problems that some styles of analysis have with allowing non-terminal symbols to cover both themselves and the empty string are still present.
    
Any degree of reflection on this difficulty raises the general question:  What is a practical strategy for accounting for the empty string in the organization of any formal language that counts it among its sentences?  One answer that presents itself is this:  If the empty string belongs to a formal language, it suffices to count it once at the beginning of the formal account that enumerates its sentences and then to move on to more interesting materials.
 
Returning to the case of the cactus language <math>\mathfrak{C} (\mathfrak{P}),\!</math> in other words, the formal language <math>\operatorname{PARCE}\!</math> of ''painted and rooted cactus expressions'', it serves the purpose of efficient accounting to partition the language into the following couple of sublanguages:
    
<ol style="list-style-type:decimal">
 
</ol>
 
As a result of marking the distinction between empty and significant sentences, that is, by categorizing each of these three classes of strings as an entity unto itself and by conceptualizing the whole of its membership as falling under a distinctive symbol, one obtains an equation of sets that connects the three languages being marked:
{| align="center" cellpadding="8" width="90%"
| <math>\operatorname{SPARCE} \ = \ \operatorname{PARCE} \ - \ \operatorname{EPARCE}</math>
|}
    
In sum, one has the disjoint union:
 
{| align="center" cellpadding="8" width="90%"
| <math>\operatorname{PARCE} \ = \ \operatorname{EPARCE} \ \cup \ \operatorname{SPARCE}</math>
|}
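
As a quick illustration of this disjoint union, a couple of lines of Python suffice to sort a batch of expressions into the two sublanguages.  The function name and the list representation are merely conveniences of this sketch.

<syntaxhighlight lang="python">
# Split a collection of cactus expressions into the two sublanguages:
# EPARCE holds just the empty expression, SPARCE holds everything else.

def split_parce(expressions):
    eparce = [s for s in expressions if s == ""]
    sparce = [s for s in expressions if s != ""]
    return eparce, sparce

print(split_parce(["", "()", "(a,b)"]))   # ([''], ['()', '(a,b)'])
</syntaxhighlight>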
   −
For brevity in the present case, and to serve as a generic device in any similar array of situations, let <math>S\!</math> be the type of an arbitrary sentence, possibly empty, and let <math>S'\!</math> be the type of a specifically non-empty sentence.  In addition, let <math>\underline\varepsilon</math> be the type of the empty sentence, in effect, the language <math>\underline\varepsilon = \{ \varepsilon \}</math> that contains a single empty string, and let a plus sign <math>^{\backprime\backprime} + ^{\prime\prime}</math> signify a disjoint union of types.  In the most general type of situation, where the type <math>S\!</math> is permitted to include the empty string, one notes the following relation among types:

{| align="center" cellpadding="8" width="90%"
| <math>S \ = \ \underline\varepsilon \ + \ S'</math>
|}
   −
With the distinction between empty and significant expressions in mind, I return to the grasp of the cactus language <math>\mathfrak{L} = \mathfrak{C} (\mathfrak{P}) = \operatorname{PARCE} (\mathfrak{P})</math> that is afforded by Grammar&nbsp;2, and, taking that as a point of departure, explore other avenues of possible improvement in the comprehension of these expressions.  In order to observe the effects of this alteration as clearly as possible, in isolation from any other potential factors, it is useful to strip away the higher levels of intermediate organization that are present in Grammar&nbsp;3, and start again with a single intermediate symbol, as used in Grammar&nbsp;2.  One way of carrying out this strategy leads on to a grammar of the variety that will be articulated next.
====Grammar 4====
If one imposes the distinction between empty and significant types on each non-terminal symbol in Grammar&nbsp;2, then the non-terminal symbols <math>^{\backprime\backprime} S \, ^{\prime\prime}</math> and <math>^{\backprime\backprime} T \, ^{\prime\prime}</math> give rise to the expanded set of non-terminal symbols <math>^{\backprime\backprime} S \, ^{\prime\prime}, \, ^{\backprime\backprime} S' \, ^{\prime\prime}, \, ^{\backprime\backprime} T \, ^{\prime\prime}, \, ^{\backprime\backprime} T' \, ^{\prime\prime},</math> leaving the last three of these to form the new intermediate alphabet. Grammar&nbsp;4 has the intermediate alphabet <math>\mathfrak{Q} \, = \, \{ \, ^{\backprime\backprime} S' \, ^{\prime\prime}, \, ^{\backprime\backprime} T \, ^{\prime\prime}, \, ^{\backprime\backprime} T' \, ^{\prime\prime} \, \},</math> with the set <math>\mathfrak{K}</math> of covering rules as listed in the next display.

<br>

{| align="center" cellpadding="12" cellspacing="0" style="border-top:1px solid black" width="90%"
| align="left"  style="border-left:1px solid black;"  width="50%" |
<math>\mathfrak{C} (\mathfrak{P}) : \text{Grammar 4}\!</math>
| align="right" style="border-right:1px solid black;" width="50%" |
<math>\mathfrak{Q} = \{ \, ^{\backprime\backprime} S' \, ^{\prime\prime}, \, ^{\backprime\backprime} T \, ^{\prime\prime}, \, ^{\backprime\backprime} T' \, ^{\prime\prime} \, \}</math>
|-
| colspan="2" style="border-top:1px solid black; border-bottom:1px solid black; border-left:1px solid black; border-right:1px solid black" |
<math>\begin{array}{rcll}
1.
& S
& :>
& \varepsilon
\\
2.
& S
& :>
& S'
\\
3.
& S'
& :>
& m_1
\\
4.
& S'
& :>
& p_j, \, \text{for each} \, j \in J
\\
5.
& S'
& :>
& ^{\backprime\backprime} \, \operatorname{(} \, ^{\prime\prime} \, \cdot \, T \, \cdot \, ^{\backprime\backprime} \, \operatorname{)} \, ^{\prime\prime}
\\
6.
& S'
& :>
& S' \, \cdot \, S'
\\
7.
& T
& :>
& \varepsilon
\\
8.
& T
& :>
& T'
\\
9.
& T'
& :>
& T \, \cdot \, ^{\backprime\backprime} \operatorname{,} ^{\prime\prime} \, \cdot \, S
\\
\end{array}</math>
|}

<br>

In this version of a grammar for <math>\mathfrak{L} = \mathfrak{C} (\mathfrak{P}),</math> the intermediate type <math>T\!</math> is partitioned as <math>T = \underline\varepsilon + T',</math> thereby parsing the intermediate symbol <math>T\!</math> in parallel fashion with the division of its overlying type as <math>S = \underline\varepsilon + S'.</math>  This is an option that I will choose to close off for now, but leave it open to consider at a later point.  Thus, it suffices to give a brief discussion of what it involves, in the process of moving on to its chief alternative.
one is gradually led to a grammar for !L! = !C!(!P!)
  −
in which all of the covering productions have either
  −
one of the following two forms:
     −
| S  :>  !e!
+
There does not appear to be anything radically wrong with trying this approach to types. It is reasonable and consistent in its underlying principle, and it provides a rational and a homogeneous strategy toward all parts of speech, but it does require an extra amount of conceptual overhead, in that every non-trivial type has to be split into two parts and comprehended in two stages. Consequently, in view of the largely practical difficulties of making the requisite distinctions for every intermediate symbol, it is a common convention, whenever possible, to restrict intermediate types to covering exclusively non-empty strings.
For the sake of future reference, it is convenient to refer to this restriction on intermediate symbols as the ''intermediate significance'' constraintIt can be stated in a compact form as a condition on the relations between non-terminal symbols <math>q \in \{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup \mathfrak{Q}</math> and sentential forms <math>W \in \{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup (\mathfrak{Q} \cup \mathfrak{A})^*.</math>

<br>

{| align="center" cellpadding="12" cellspacing="0" style="border-top:1px solid black" width="90%"
| align="center" style="border-left:1px solid black; border-right:1px solid black" |
<math>\text{Condition On Intermediate Significance}\!</math>
|-
| style="border-top:1px solid black; border-bottom:1px solid black; border-left:1px solid black; border-right:1px solid black" |
<math>\begin{array}{lccc}
\text{If} & q & :> & W
\\
\text{and} & W & = & \varepsilon
\\
\text{then} & q & = & ^{\backprime\backprime} S \, ^{\prime\prime}
\\
\end{array}</math>
|}

<br>

If this is beginning to sound like a monotone condition, then it is not absurd to sharpen the resemblance and render the likeness more acuteThis is done by declaring a couple of ordering relations, denoting them under variant interpretations by the same sign, <math>^{\backprime\backprime}\!< \, ^{\prime\prime}.</math>
# The ordering <math>^{\backprime\backprime}\!< \, ^{\prime\prime}</math> on the set of non-terminal symbols, <math>q \in \{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup \mathfrak{Q},</math> ordains the initial symbol <math>^{\backprime\backprime} S \, ^{\prime\prime}</math> to be strictly prior to every intermediate symbolThis is tantamount to the axiom that <math>^{\backprime\backprime} S \, ^{\prime\prime} < q,</math> for all <math>q \in \mathfrak{Q}.</math>
# The ordering <math>^{\backprime\backprime}\!< \, ^{\prime\prime}</math> on the collection of sentential forms, <math>W \in \{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup (\mathfrak{Q} \cup \mathfrak{A})^*,</math> ordains the empty string to be strictly minor to every other sentential formThis is stipulated in the axiom that <math>\varepsilon < W,</math> for every non-empty sentential form <math>W.\!</math>
   −
Given these two orderings, the constraint in question on intermediate significance can be stated as follows:

<br>

{| align="center" cellpadding="12" cellspacing="0" style="border-top:1px solid black" width="90%"
| align="center" style="border-left:1px solid black; border-right:1px solid black" |
<math>\text{Condition On Intermediate Significance}\!</math>
|-
| style="border-top:1px solid black; border-bottom:1px solid black; border-left:1px solid black; border-right:1px solid black" |
<math>\begin{array}{lccc}
\text{If} & q & :> & W
\\
\text{and} & q & > & ^{\backprime\backprime} S \, ^{\prime\prime}
\\
\text{then} & W & > & \varepsilon
\\
\end{array}</math>
|}

<br>

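For readers who want to see the condition in executable form, here is a small sketch in Python.  The encoding of productions as pairs of strings and the use of <code>"S"</code> for the initial symbol are assumptions of the sketch, not part of the text above.

<pre>
# A small sketch of the intermediate significance constraint, assuming
# productions are encoded as (q, W) pairs of strings with initial symbol "S".

def satisfies_intermediate_significance(productions, start="S"):
    """True if every intermediate symbol covers only non-empty strings."""
    return all(w != "" for q, w in productions if q != start)

print(satisfies_intermediate_significance([("S", ""), ("E", "a"), ("T", ",")]))  # True
print(satisfies_intermediate_significance([("S", ""), ("T", "")]))               # False
</pre>
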
Achieving a grammar that respects this convention typically requires a more detailed account of the initial setting of a type, both with regard to the type of context that incites its appearance and also with respect to the minimal strings that arise under the type in question. In order to find covering productions that satisfy the intermediate significance condition, one must be prepared to consider a wider variety of calling contexts or inciting situations that can be noted to surround each recognized type, and also to enumerate a larger number of the smallest cases that can be observed to fall under each significant type.
====Grammar 5====
With the foregoing array of considerations in mind, one is gradually led to a grammar for <math>\mathfrak{L} = \mathfrak{C} (\mathfrak{P})</math> in which all of the covering productions have either one of the following two forms:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{array}{ccll}
S & :> & \varepsilon &
\\
q & :> & W, & \text{with} \ q \in \{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup \mathfrak{Q} \ \text{and} \ W \in (\mathfrak{Q} \cup \mathfrak{A})^+
\\
\end{array}</math>
|}

A grammar that fits into this mold is called a ''context-free grammar''The first type of rewrite rule is referred to as a ''special production'', while the second type of rewrite rule is called an ''ordinary production''.  An ''ordinary derivation'' is one that employs only ordinary productions.  In ordinary productions, those that have the form <math>q :> W,\!</math> the replacement string <math>W\!</math> is never the empty string, and so the lengths of the augmented strings or the sentential forms that follow one another in an ordinary derivation, on account of using the ordinary types of rewrite rules, never decrease at any stage of the process, up to and including the terminal string that is finally generated by the grammar. This type of feature is known as the ''non-contracting property'' of productions, derivations, and grammars. A grammar is said to have the property if all of its covering productions, with the possible exception of <math>S :> \varepsilon,</math> are non-contracting. In particular, context-free grammars are special cases of non-contracting grammars.  The presence of the non-contracting property within a formal grammar makes the length of the augmented string available as a parameter that can figure into mathematical inductions and motivate recursive proofs, and this handle on the generative process makes it possible to establish the kinds of results about the generated language that are not easy to achieve in more general cases, nor by any other means even in these brands of special cases.
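
The non-contracting property is easy to check mechanically.  The sketch below, in Python, assumes an encoding of productions as pairs of strings with single-character symbols, writing <math>S'\!</math> as <code>E</code>; these conventions belong to the sketch, not to the grammar.

<pre>
# A minimal sketch, assuming productions are (q, W) pairs of strings and that
# each symbol is written as a single character, with S' abbreviated to E.

def is_non_contracting(productions, start="S"):
    """True if every production, except possibly S :> empty, has a non-empty
    replacement string."""
    return all(w != "" for q, w in productions if not (q == start and w == ""))

def lengths_non_decreasing(derivation):
    """Ordinary (non-empty) rewrites can never shorten a sentential form."""
    return all(len(a) <= len(b) for a, b in zip(derivation, derivation[1:]))

sample = [("S", ""), ("S", "E"), ("E", "EE"), ("E", "(T)"), ("T", ",")]
print(is_non_contracting(sample))                                # True
print(lengths_non_decreasing(["S", "E", "EE", "E(T)", "E(,)"]))  # True
</pre>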
Grammar&nbsp;5 is a context-free grammar for the painted cactus language that uses <math>\mathfrak{Q} = \{ \, ^{\backprime\backprime} S' \, ^{\prime\prime}, \, ^{\backprime\backprime} T \, ^{\prime\prime} \, \},</math> with <math>\mathfrak{K}</math> as listed in the next display.

<br>

{| align="center" cellpadding="12" cellspacing="0" style="border-top:1px solid black" width="90%"
| align="left" style="border-left:1px solid black;" width="50%" |
<math>\mathfrak{C} (\mathfrak{P}) : \text{Grammar 5}\!</math>
| align="right" style="border-right:1px solid black;" width="50%" |
<math>\mathfrak{Q} = \{ \, ^{\backprime\backprime} S' \, ^{\prime\prime}, \, ^{\backprime\backprime} T \, ^{\prime\prime} \, \}</math>
|-
| colspan="2" style="border-top:1px solid black; border-bottom:1px solid black; border-left:1px solid black; border-right:1px solid black" |
<math>\begin{array}{rcll}
1. & S & :> & \varepsilon
\\
2. & S & :> & S'
\\
3. & S' & :> & m_1
\\
4. & S' & :> & p_j, \, \text{for each} \, j \in J
\\
5. & S' & :> & S' \, \cdot \, S'
\\
6. & S' & :> & ^{\backprime\backprime} \, \operatorname{()} \, ^{\prime\prime}
\\
7. & S' & :> & ^{\backprime\backprime} \, \operatorname{(} \, ^{\prime\prime} \, \cdot \, T \, \cdot \, ^{\backprime\backprime} \, \operatorname{)} \, ^{\prime\prime}
\\
8. & T & :> & ^{\backprime\backprime} \, \operatorname{,} \, ^{\prime\prime}
\\
9. & T & :> & S'
\\
10. & T & :> & T \, \cdot \, ^{\backprime\backprime} \, \operatorname{,} \, ^{\prime\prime}
\\
11. & T & :> & T \, \cdot \, ^{\backprime\backprime} \, \operatorname{,} \, ^{\prime\prime} \, \cdot \, S'
\\
\end{array}</math>
|}

<br>
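
For readers who like to see a grammar run, here is a small sketch of Grammar&nbsp;5 in Python.  The single-character stand-ins for the symbols &mdash; <code>E</code> for <math>S',\!</math> a space for the blank symbol <math>m_1,\!</math> and lower-case letters for the paints <math>p_j\!</math> &mdash; are conveniences of the sketch, not part of the grammar itself.

<pre>
# A sketch of Grammar 5, assuming single-character stand-ins for the symbols:
# "S" for the initial symbol, "E" for S', "T" for T, " " for the blank m_1,
# and "a", "b" for two paints p_j.

GRAMMAR_5 = [
    ("S", ""),      #  1.  S  :> empty
    ("S", "E"),     #  2.  S  :> S'
    ("E", " "),     #  3.  S' :> m_1
    ("E", "a"),     #  4.  S' :> p_j  (one rule per paint)
    ("E", "b"),
    ("E", "EE"),    #  5.  S' :> S' . S'
    ("E", "()"),    #  6.  S' :> "()"
    ("E", "(T)"),   #  7.  S' :> "(" . T . ")"
    ("T", ","),     #  8.  T  :> ","
    ("T", "E"),     #  9.  T  :> S'
    ("T", "T,"),    # 10.  T  :> T . ","
    ("T", "T,E"),   # 11.  T  :> T . "," . S'
]

def one_step(form):
    """All sentential forms obtainable from `form` by one rewrite."""
    results = []
    for q, w in GRAMMAR_5:
        for i, symbol in enumerate(form):
            if symbol == q:
                results.append(form[:i] + w + form[i + 1:])
    return results

# One derivation of the terminal sentence "(a,())".
steps = ["S", "E", "(T)", "(T,E)", "(E,E)", "(a,E)", "(a,())"]
for before, after in zip(steps, steps[1:]):
    assert after in one_step(before), (before, after)
print("derived:", steps[-1])
</pre>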
Finally, it is worth trying to bring together the advantages of these diverse styles of grammar, to whatever extent that they are compatible. To do this, a prospective grammar must be capable of maintaining a high level of intermediate organization, like that arrived at in Grammar&nbsp;2, while respecting the principle of intermediate significance, and thus accumulating all the benefits of the context-free format in Grammar&nbsp;5. A plausible synthesis of most of these features is given in Grammar&nbsp;6.
   −
====Grammar 6====
Grammar&nbsp;6 has the intermediate alphabet <math>\mathfrak{Q} = \{ \, ^{\backprime\backprime} S' \, ^{\prime\prime}, \, ^{\backprime\backprime} F \, ^{\prime\prime}, \, ^{\backprime\backprime} R \, ^{\prime\prime}, \, ^{\backprime\backprime} T \, ^{\prime\prime} \, \},</math> with the production set <math>\mathfrak{K}</math> as listed in the next display.
   −
<br>

{| align="center" cellpadding="12" cellspacing="0" style="border-top:1px solid black" width="90%"
| align="left" style="border-left:1px solid black;" width="50%" |
<math>{\mathfrak{C} (\mathfrak{P}) : \text{Grammar 6}}\!</math>
| align="right" style="border-right:1px solid black;" width="50%" |
<math>\mathfrak{Q} = \{ \, ^{\backprime\backprime} S' \, ^{\prime\prime}, \, ^{\backprime\backprime} F \, ^{\prime\prime}, \, ^{\backprime\backprime} R \, ^{\prime\prime}, \, ^{\backprime\backprime} T \, ^{\prime\prime} \, \}\!</math>
|-
| colspan="2" style="border-top:1px solid black; border-bottom:1px solid black; border-left:1px solid black; border-right:1px solid black" |
<math>\begin{array}{rcll}
1. & S & :> & \varepsilon
\\
2. & S & :> & S'
\\
3. & S' & :> & R
\\
4. & S' & :> & F
\\
5. & S' & :> & S' \, \cdot \, S'
\\
6. & R & :> & m_1
\\
7. & R & :> & p_j, \, \text{for each} \, j \in J
\\
8. & R & :> & R \, \cdot \, R
\\
9. & F & :> & ^{\backprime\backprime} \, \operatorname{()} \, ^{\prime\prime}
\\
10. & F & :> & ^{\backprime\backprime} \, \operatorname{(} \, ^{\prime\prime} \, \cdot \, T \, \cdot \, ^{\backprime\backprime} \, \operatorname{)} \, ^{\prime\prime}
\\
11. & T & :> & ^{\backprime\backprime} \, \operatorname{,} \, ^{\prime\prime}
\\
12. & T & :> & S'
\\
13. & T & :> & T \, \cdot \, ^{\backprime\backprime} \, \operatorname{,} \, ^{\prime\prime}
\\
14. & T & :> & T \, \cdot \, ^{\backprime\backprime} \, \operatorname{,} \, ^{\prime\prime} \, \cdot \, S'
\\
\end{array}</math>
|}

<br>
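
Grammar&nbsp;6 also suggests a direct recognition procedure for the language.  The following sketch follows the shape of its types &mdash; runs of paints and blanks for <math>R,\!</math> framed lists for <math>F,\!</math> and concatenations of the two for <math>S'\!</math> &mdash; but the concrete characters (lower-case letters for paints, a space for the blank, plain parentheses for the frame) are assumptions of the sketch rather than part of the grammar.

<pre>
# A sketch of a recognizer for the painted cactus language, in the spirit of
# Grammar 6.  Paints are lower-case letters, the blank symbol m_1 is a space,
# and the frame "-( ... )-" is written with plain parentheses.

PAINTS = set("abcdefghijklmnopqrstuvwxyz")

def parse_factor(s, i):
    """Parse one R-type or F-type factor starting at position i.
    Return the position just after it, or None if no factor starts here."""
    if i < len(s) and (s[i] in PAINTS or s[i] == " "):
        return i + 1                       # R :> m_1 | p_j
    if i < len(s) and s[i] == "(":
        j = parse_list(s, i + 1)           # F :> "(" . T . ")"  (or "()")
        if j < len(s) and s[j] == ")":
            return j + 1
    return None

def parse_list(s, i):
    """Parse the possibly empty, comma-separated contents of a frame."""
    j = parse_term(s, i)
    while j < len(s) and s[j] == ",":
        j = parse_term(s, j + 1)
    return j

def parse_term(s, i):
    """Parse a possibly empty S', that is, a run of zero or more factors."""
    while True:
        k = parse_factor(s, i)
        if k is None:
            return i
        i = k

def is_sentence(s):
    """A sentence is the empty string or a single complete S'."""
    return s == "" or parse_term(s, 0) == len(s)

for example in ["", "a", "()", "(,)", "(a,(),b)", "a(b, )c", "(", "a)b"]:
    print(repr(example), is_sentence(example))
</pre>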

The preceding development provides a typical example of how an initially effective and conceptually succinct description of a formal language, but one that is terse to the point of allowing its prospective interpreter to waste exorbitant amounts of energy in trying to unravel its implications, can be converted into a form that is more efficient from the operational point of view, even if slightly more ungainly in regard to its elegance.

The basic idea behind all of this machinery remains the same:  Besides the select body of formulas that are introduced as boundary conditions, it merely institutes the following general rule:

{| align="center" cellpadding="8" width="90%"
|-
| <math>\operatorname{If}</math>
| the strings <math>S_1, \ldots, S_k\!</math> are sentences,
|-
| <math>\operatorname{Then}</math>
| their concatenation in the form
|-
| &nbsp;
| <math>\operatorname{Conc}_{j=1}^k S_j \ = \ S_1 \, \cdot \, \ldots \, \cdot \, S_k</math>
|-
| &nbsp;
| is a sentence,
|-
| <math>\operatorname{And}</math>
| their surcatenation in the form
|-
| &nbsp;
| <math>\operatorname{Surc}_{j=1}^k S_j \ = \ ^{\backprime\backprime} \, \operatorname{(} \, ^{\prime\prime} \, \cdot \, S_1 \, \cdot \, ^{\backprime\backprime} \, \operatorname{,} \, ^{\prime\prime} \, \cdot \, \ldots \, \cdot \, ^{\backprime\backprime} \, \operatorname{,} \, ^{\prime\prime} \, \cdot \, S_k \, \cdot \, ^{\backprime\backprime} \, \operatorname{)} \, ^{\prime\prime}</math>
|-
| &nbsp;
| is a sentence.
|}
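
Stated in operational terms, the rule amounts to the following pair of string-building functions, sketched here in Python on the assumption that sentences are plain strings and that the frame is written with bare parentheses.

<pre>
# A minimal sketch of the two formation rules.

def conc(*sentences):
    """Concatenation:  Conc^k_j S_j  =  S_1 . ... . S_k"""
    return "".join(sentences)

def surc(*sentences):
    """Surcatenation:  Surc^k_j S_j  =  "(" . S_1 . "," . ... . "," . S_k . ")"."""
    return "(" + ",".join(sentences) + ")"

print(conc("a", "(b,)"))        # a(b,)
print(surc("a", "", "(b,c)"))   # (a,,(b,c))
</pre>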
  −
===Generalities About Formal Grammars===

It is fitting to wrap up the foregoing developments by summarizing the notion of a formal grammar that appeared to evolve in the present case.  For the sake of future reference and the chance of a wider application, it is also useful to try to extract the scheme of a formalization that potentially holds for any formal language.  The following presentation of the notion of a formal grammar is adapted, with minor modifications, from the treatment in (DDQ, 60&ndash;61).

A ''formal grammar'' <math>\mathfrak{G}</math> is given by a four-tuple <math>\mathfrak{G} = ( \, ^{\backprime\backprime} S \, ^{\prime\prime}, \, \mathfrak{Q}, \, \mathfrak{A}, \, \mathfrak{K} \, )</math> that takes the following form of description:

<ol style="list-style-type:decimal">

<li><math>^{\backprime\backprime} S \, ^{\prime\prime}</math> is the ''initial'', ''special'', ''start'', or ''sentence'' symbol.  Since the letter <math>^{\backprime\backprime} S \, ^{\prime\prime}</math> serves this function only in a special setting, its employment in this role need not create any confusion with its other typical uses as a string variable or as a sentence variable.</li>

<li><math>\mathfrak{Q} = \{ q_1, \ldots, q_m \}</math> is a finite set of ''intermediate symbols'', all distinct from <math>^{\backprime\backprime} S \, ^{\prime\prime}.</math></li>

<li><math>\mathfrak{A} = \{ a_1, \dots, a_n \}</math> is a finite set of ''terminal symbols'', also known as the ''alphabet'' of <math>\mathfrak{G},</math> all distinct from <math>^{\backprime\backprime} S \, ^{\prime\prime}</math> and disjoint from <math>\mathfrak{Q}.</math>  Depending on the particular conception of the language <math>\mathfrak{L}</math> that is ''covered'', ''generated'', ''governed'', or ''ruled'' by the grammar <math>\mathfrak{G},</math> that is, whether <math>\mathfrak{L}</math> is conceived to be a set of words, sentences, paragraphs, or more extended structures of discourse, it is usual to describe <math>\mathfrak{A}</math> as the ''alphabet'', ''lexicon'', ''vocabulary'', ''liturgy'', or ''phrase book'' of both the grammar <math>\mathfrak{G}</math> and the language <math>\mathfrak{L}</math> that it regulates.</li>

<li><math>\mathfrak{K}</math> is a finite set of ''characterizations''.  Depending on how they come into play, these are variously described as ''covering rules'', ''formations'', ''productions'', ''rewrite rules'', ''subsumptions'', ''transformations'', or ''typing rules''.</li>

</ol>

To describe the elements of <math>\mathfrak{K}</math> it helps to define some additional terms.  (A code sketch of these definitions appears just after the list.)

<ol style="list-style-type:lower-latin">

<li>The symbols in <math>\{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup \mathfrak{Q} \cup \mathfrak{A}</math> form the ''augmented alphabet'' of <math>\mathfrak{G}.</math></li>

<li>The symbols in <math>\{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup \mathfrak{Q}</math> are the ''non-terminal symbols'' of <math>\mathfrak{G}.</math></li>

<li>The symbols in <math>\mathfrak{Q} \cup \mathfrak{A}</math> are the ''non-initial symbols'' of <math>\mathfrak{G}.</math></li>

<li>The strings in <math>( \{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup \mathfrak{Q} \cup \mathfrak{A} )^*</math> are the ''augmented strings'' for <math>\mathfrak{G}.</math></li>

<li>The strings in <math>\{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup (\mathfrak{Q} \cup \mathfrak{A})^*</math> are the ''sentential forms'' for <math>\mathfrak{G}.</math></li>

</ol>
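
As promised above, here is a small sketch of these definitions in code.  The representation of symbols as single-character Python strings is an assumption of the sketch.

<pre>
# A sketch of the four-tuple G = ("S", Q, A, K) and the derived symbol classes.
from typing import NamedTuple, Set, List, Tuple

class FormalGrammar(NamedTuple):
    start: str                                # the initial symbol "S"
    intermediates: Set[str]                   # Q, the intermediate symbols
    alphabet: Set[str]                        # A, the terminal symbols
    characterizations: List[Tuple[str, str]]  # K, the rewrite rules (S_1, S_2)

def augmented_alphabet(g):        # (a)  {"S"} |_| Q |_| A
    return {g.start} | g.intermediates | g.alphabet

def non_terminals(g):             # (b)  {"S"} |_| Q
    return {g.start} | g.intermediates

def non_initials(g):              # (c)  Q |_| A
    return g.intermediates | g.alphabet

def is_sentential_form(g, w):     # (e)  "S" by itself, or a string of non-initials
    return w == g.start or all(c in non_initials(g) for c in w)

g5 = FormalGrammar("S", {"E", "T"}, set("ab(), "),
                   [("S", ""), ("S", "E"), ("E", "(T)"), ("T", ",")])
assert is_sentential_form(g5, "(T)") and not is_sentential_form(g5, "SS")
</pre>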
  −
Each characterization in <math>\mathfrak{K}</math> is an ordered pair of strings <math>(S_1, S_2)\!</math> that takes the following form:

{| align="center" cellpadding="8" width="90%"
| <math>S_1 \ = \ Q_1 \cdot q \cdot Q_2,</math>
|-
| <math>S_2 \ = \ Q_1 \cdot W \cdot Q_2.</math>
|}

In this scheme, <math>S_1\!</math> and <math>S_2\!</math> are members of the augmented strings for <math>\mathfrak{G},</math> more precisely, <math>S_1\!</math> is a non-empty string and a sentential form over <math>\mathfrak{G},</math> while <math>S_2\!</math> is a possibly empty string and also a sentential form over <math>\mathfrak{G}.</math>

Here also, <math>q\!</math> is a non-terminal symbol, that is, <math>q \in \{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup \mathfrak{Q},</math> while <math>Q_1, Q_2,\!</math> and <math>W\!</math> are possibly empty strings of non-initial symbols, a fact that can be expressed in the form, <math>Q_1, Q_2, W \in (\mathfrak{Q} \cup \mathfrak{A})^*.</math>

In practice, the couplets in <math>\mathfrak{K}</math> are used to ''derive'', to ''generate'', or to ''produce'' sentences of the corresponding language <math>\mathfrak{L} = \mathfrak{L} (\mathfrak{G}).</math>  The language <math>\mathfrak{L}</math> is then said to be ''governed'', ''licensed'', or ''regulated'' by the grammar <math>\mathfrak{G},</math> a circumstance that is expressed in the form <math>\mathfrak{L} = \langle \mathfrak{G} \rangle.</math>  In order to facilitate this active employment of the grammar, it is conventional to write the abstract characterization <math>(S_1, S_2)\!</math> and the specific characterization <math>(Q_1 \cdot q \cdot Q_2, \ Q_1 \cdot W \cdot Q_2)</math> in the following forms, respectively:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{array}{lll}
S_1 & :> & S_2
\\
Q_1 \cdot q \cdot Q_2 & :> & Q_1 \cdot W \cdot Q_2
\\
\end{array}</math>
|}

In this usage, the characterization <math>S_1 :> S_2\!</math> is tantamount to a grammatical license to transform a string of the form <math>Q_1 \cdot q \cdot Q_2</math> into a string of the form <math>Q_1 \cdot W \cdot Q_2,</math> in effect, replacing the non-terminal symbol <math>q\!</math> with the non-initial string <math>W\!</math> in any selected, preserved, and closely adjoining context of the form <math>Q_1 \cdot \ldots \cdot Q_2.</math>  In this application the notation <math>S_1 :> S_2\!</math> can be read to say that <math>S_1\!</math> ''produces'' <math>S_2\!</math> or that <math>S_1\!</math> ''transforms into'' <math>S_2.\!</math>

An ''immediate derivation'' in <math>\mathfrak{G}\!</math> is an ordered pair <math>(W, W^\prime)\!</math> of sentential forms in <math>\mathfrak{G}\!</math> such that:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{array}{llll}
W = Q_1 \cdot X \cdot Q_2,
& W' = Q_1 \cdot Y \cdot Q_2,
& \text{and}
& (X, Y) \in \mathfrak{K}.
\\
\end{array}</math>
|}

This relation is indicated by saying that <math>W\!</math> ''immediately derives'' <math>W',\!</math> that <math>W'\!</math> is ''immediately derived'' from <math>W\!</math> in <math>\mathfrak{G},</math> and also by writing <math>W ::> W'.\!</math>

A ''derivation'' in <math>\mathfrak{G}\!</math> is a finite sequence <math>(W_1, \ldots, W_k)\!</math> of sentential forms over <math>\mathfrak{G}\!</math> such that each adjacent pair <math>(W_j, W_{j+1})\!</math> of sentential forms in the sequence is an immediate derivation in <math>\mathfrak{G},</math> in other words, such that <math>W_j ::> W_{j+1},\!</math> for all <math>j = 1\!</math> to <math>k - 1.\!</math>

If there exists a derivation <math>(W_1, \ldots, W_k)\!</math> in <math>\mathfrak{G},</math> one says that <math>W_1\!</math> ''derives'' <math>W_k\!</math> in <math>\mathfrak{G},</math> conversely, that <math>W_k\!</math> is ''derivable'' from <math>W_1\!</math> in <math>\mathfrak{G},</math> and one typically summarizes the derivation by writing <math>W_1 \, :\!*\!:> \, W_k.\!</math>

The language <math>\mathfrak{L} = \mathfrak{L} (\mathfrak{G}) = \langle \mathfrak{G} \rangle</math> that is ''generated'' by the formal grammar <math>\mathfrak{G} = ( \, ^{\backprime\backprime} S \, ^{\prime\prime}, \, \mathfrak{Q}, \, \mathfrak{A}, \, \mathfrak{K} \, )</math> is the set of strings over the terminal alphabet <math>\mathfrak{A}</math> that are derivable from the initial symbol <math>^{\backprime\backprime} S \, ^{\prime\prime}</math> by way of the intermediate symbols in <math>\mathfrak{Q}</math> according to the characterizations in <math>\mathfrak{K}.</math>  In sum:

: <math>\mathfrak{L} (\mathfrak{G}) \ = \ \langle \mathfrak{G} \rangle \ = \ \{ \, W \in \mathfrak{A}^* \, : \, ^{\backprime\backprime} S \, ^{\prime\prime} \, :\!*\!:> \, W \, \}.</math>

Finally, a string <math>W\!</math> is called a ''word'', a ''sentence'', or so on, of the language generated by <math>\mathfrak{G}\!</math> if and only if <math>W \in \mathfrak{L} (\mathfrak{G}).</math>
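
To see how the characterizations drive the generation of a language, the following sketch implements immediate derivation by substring rewriting and tests membership in <math>\mathfrak{L} (\mathfrak{G})</math> by a bounded breadth-first search.  The bound on the length of intermediate forms relies on the non-contracting property discussed earlier, and the concrete rule set is the single-character rendering of Grammar&nbsp;5 used above; both are assumptions of the sketch.

<pre>
# A sketch of immediate derivation, derivability, and membership in the
# generated language, assuming rules are (lhs, rhs) string pairs with
# non-empty left-hand sides.
from collections import deque

def immediate(rules, w):
    """All w' with w ::> w', obtained by rewriting one occurrence of a lhs."""
    for lhs, rhs in rules:
        i = w.find(lhs)
        while i != -1:
            yield w[:i] + rhs + w[i + len(lhs):]
            i = w.find(lhs, i + 1)

def derives(rules, w1, w2, limit=100000):
    """True if w1 :*:> w2; a breadth-first search bounded by the length of w2,
    which is enough for non-contracting grammars such as Grammar 5."""
    seen, queue = {w1}, deque([w1])
    while queue and len(seen) < limit:
        w = queue.popleft()
        if w == w2:
            return True
        for v in immediate(rules, w):
            if len(v) <= max(len(w2), 1) and v not in seen:
                seen.add(v)
                queue.append(v)
    return False

def generated(rules, start, alphabet, w):
    """Membership in L(G) = <G> = { W in A* : "S" :*:> W }."""
    return set(w) <= set(alphabet) and derives(rules, start, w)

RULES = [("S", ""), ("S", "E"), ("E", " "), ("E", "a"), ("E", "EE"),
         ("E", "()"), ("E", "(T)"), ("T", ","), ("T", "E"),
         ("T", "T,"), ("T", "T,E")]
print(generated(RULES, "S", "a (),", "(a,())"))   # True
print(generated(RULES, "S", "a (),", "(a"))       # False
</pre>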
     −
'''Reference'''

* Denning, P.J., Dennis, J.B., and Qualitz, J.E., ''Machines, Languages, and Computation'', Prentice-Hall, Englewood Cliffs, NJ, 1978.

===The Cactus Language : Stylistics===

{| align="center" cellpadding="0" cellspacing="0" width="90%"
|
<p>As a result, we can hardly conceive of how many possibilities there are for what we call objective reality.  Our sharp quills of knowledge are so narrow and so concentrated in particular directions that with science there are myriads of totally different real worlds, each one accessible from the next simply by slight alterations &mdash; shifts of gaze &mdash; of every particular discipline and subspecialty.
</p>
|-
| align="right" | &mdash; Herbert J. Bernstein, &ldquo;Idols of Modern Science&rdquo;, [HJB, 38]
|}
This Subsection highlights an issue of ''style'' that arises in describing a formal language. In broad terms, I use the word ''style'' to refer to a loosely specified class of formal systems, typically ones that have a set of distinctive features in common. For instance, a style of proof system usually dictates one or more rules of inference that are acknowledged as conforming to that style.  In the present context, the word ''style'' is a natural choice to characterize the varieties of formal grammars, or any other sorts of formal systems that can be contemplated for deriving the sentences of a formal language.
In looking at what seems like an incidental issue, the discussion arrives at a critical point.  The question is:  What decides the issue of style?  Taking a given language as the object of discussion, what factors enter into and determine the choice of a style for its presentation, that is, a particular way of arranging and selecting the materials that come to be involved in a description, a grammar, or a theory of the language?  To what degree is the determination accidental, empirical, pragmatic, rhetorical, or stylistic, and to what extent is the choice essential, logical, and necessary?  For that matter, what determines the order of signs in a word, a sentence, a text, or a discussion?  All of the corresponding parallel questions about the character of this choice can be posed with regard to the constituent part as well as with regard to the main constitution of the formal language.
In order to answer this sort of question, at any level of articulation, one has to inquire into the type of distinction that it invokes, between arrangements and orders that are essential, logical, and necessary and orders and arrangements that are accidental, rhetorical, and stylistic. As a rough guide to its comprehension, a ''logical order'', if it resides in the subject at all, can be approached by considering all of the ways of saying the same things, in all of the languages that are capable of saying roughly the same things about that subject.  Of course, the ''all'' that appears in this rule of thumb has to be interpreted as a fittingly qualified sort of universal.  For all practical purposes, it simply means ''all of the ways that a person can think of'' and ''all of the languages that a person can conceive of'', with all things being relative to the particular moment of investigation.  For all of these reasons, the rule must stand as little more than a rough idea of how to approach its object.
If it is demonstrated that a given formal language can be presented in any one of several styles of formal grammar, then the choice of a format is accidental, optional, and stylistic to the very extent that it is free.  But if it can be shown that a particular language cannot be successfully presented in a particular style of grammar, then the issue of style is no longer free and rhetorical, but becomes to that very degree essential, necessary, and obligatory, in other words, a question of the objective logical order that can be found to reside in the object language.
As a rough illustration of the difference between logical and rhetorical orders, consider the kinds of order that are expressed and exhibited in the following conjunction of implications:

: <math>X \Rightarrow Y\ \operatorname{and}\ Y \Rightarrow Z.</math>

Here, there is a happy conformity between the logical content and the rhetorical form, indeed, to such a degree that one hardly notices the difference between them. The rhetorical form is given by the order of sentences in the two implications and the order of implications in the conjunction.  The logical content is given by the order of propositions in the extended implicational sequence:

: <math>X\ \le\ Y\ \le\ Z.</math>

To see the difference between form and content, or manner and matter, it is enough to observe a few of the ways that the expression can be varied without changing its meaning, for example:

: <math>Z \Leftarrow Y\ \operatorname{and}\ Y \Leftarrow X.</math>

Any style of declarative programming, also called ''logic programming'', depends on a capacity, as embodied in a programming language or other formal system, to describe the relation between problems and solutions in logical terms.  A recurring problem in building this capacity is in bridging the gap between ostensibly non-logical orders and the logical orders that are used to describe and to represent them.  For instance, to mention just a couple of the most pressing cases, and the ones that are currently proving to be the most resistant to a complete analysis, one has the orders of dynamic evolution and rhetorical transition that manifest themselves in the process of inquiry and in the communication of its results.
This patch of the ongoing discussion is concerned with describing a particular variety of formal languages, whose typical representative is the painted cactus language <math>\mathfrak{L} = \mathfrak{C} (\mathfrak{P}).</math>  It is the intention of this work to interpret this language for propositional logic, and thus to use it as a sentential calculus, an order of reasoning that forms an active ingredient and a significant component of all logical reasoning.  To describe this language, the standard devices of formal grammars and formal language theory are more than adequate, but this only raises the next question:  What sorts of devices are exactly adequate, and fit the task to a "T"?  The ultimate desire is to turn the tables on the order of description, and so begins a process of eversion that evolves to the point of asking:  To what extent can the language capture the essential features and laws of its own grammar and describe the active principles of its own generation?  In other words:  How well can the language be described by using the language itself to do so?
In order to speak to these questions, I have to express what a grammar says about a language in terms of what a language can say on its own.  In effect, it is necessary to analyze the kinds of meaningful statements that grammars are capable of making about languages in general and to relate them to the kinds of meaningful statements that the syntactic ''sentences'' of the cactus language might be interpreted as making about the very same topics.  So far in the present discussion, the sentences of the cactus language do not make any meaningful statements at all, much less any meaningful statements about languages and their constitutions.  As of yet, these sentences subsist in the form of purely abstract, formal, and uninterpreted combinatorial constructions.
Before the capacity of a language to describe itself can be evaluated, the missing link to meaning has to be supplied for each of its strings.  This calls for a dimension of semantics and a notion of interpretation, topics that are taken up for the case of the cactus language <math>\mathfrak{C} (\mathfrak{P})</math> in Subsection 1.3.10.12.  Once a plausible semantics is prescribed for this language it will be possible to return to these questions and to address them in a meaningful way.
The prominent issue at this point is the distinct placements of formal languages and formal grammars with respect to the question of meaning.  The sentences of a formal language are merely the abstract strings of abstract signs that happen to belong to a certain set.  They do not by themselves make any meaningful statements at all, not without mounting a separate effort of interpretation, but the rules of a formal grammar make meaningful statements about a formal language, to the extent that they say what strings belong to it and what strings do not.  Thus, the formal grammar, a formalism that appears to be even more skeletal than the formal language, still has bits and pieces of meaning attached to it.  In a sense, the question of meaning is factored into two parts, structure and value, leaving the aspect of value reduced in complexity and subtlety to the simple question of belonging.  Whether this single bit of meaningful value is enough to encompass all of the dimensions of meaning that we require, and whether it can be compounded to cover the complexity that actually exists in the realm of meaning &mdash; these are questions for an extended future inquiry.
Perhaps I ought to comment on the differences between the present and the standard definition of a formal grammar, since I am attempting to strike a compromise with several alternative conventions of usage, and thus to leave certain options open for future exploration.  All of the changes are minor, in the sense that they are not intended to alter the classes of languages that are able to be generated, but only to clear up various ambiguities and sundry obscurities that affect their conception.
Primarily, the conventional scope of non-terminal symbols was expanded to encompass the sentence symbol, mainly on account of all the contexts where the initial and the intermediate symbols are naturally invoked in the same breath.  By way of compensating for the usual exclusion of the sentence symbol from the non-terminal class, an equivalent distinction was introduced in the fashion of a distinction between the initial and the intermediate symbols, and this serves its purpose in all of those contexts where the two kinds of symbols need to be treated separately.

At the present point, I remain a bit worried about the motivations and the justifications for introducing this distinction, under any name, in the first place.  It is purportedly designed to guarantee that the process of derivation at least gets started in a definite direction, while the real questions have to do with how it all ends.  The excuses of efficiency and expediency that I offered as plausible and sufficient reasons for distinguishing between empty and significant sentences are likely to be ephemeral, if not entirely illusory, since intermediate symbols are still permitted to characterize or to cover themselves, not to mention being allowed to cover the empty string, and so the very types of traps that one exerts oneself to avoid at the outset are always there to afflict the process at all of the intervening times.
If one reflects on the form of grammar that is being prescribed here, it looks as if one sought, rather futilely, to avoid the problems of recursion by proscribing the main program from calling itself, while allowing any subprogram to do so.  But any trouble that is avoidable in the part is also avoidable in the main, while any trouble that is inevitable in the part is also inevitable in the main.  Consequently, I am reserving the right to change my mind at a later stage, perhaps to permit the initial symbol to characterize, to cover, to regenerate, or to produce itself, if that turns out to be the best way in the end.
Before I leave this Subsection, I need to say a few things about the manner in which the abstract theory of formal languages and the pragmatic theory of sign relations interact with each other.
Formal language theory can seem like an awfully picky subject at times, treating every symbol as a thing in itself the way it does, sorting out the nominal types of symbols as objects in themselves, and singling out the passing tokens of symbols as distinct entities in their own rights.  It has to continue doing this, if not for any better reason than to aid in clarifying the kinds of languages that people are accustomed to use, to assist in writing computer programs that are capable of parsing real sentences, and to serve in designing programming languages that people would like to become accustomed to use.  As a matter of fact, the only time that formal language theory becomes too picky, or a bit too myopic in its focus, is when it leads one to think that one is dealing with the thing itself and not just with the sign of it, in other words, when the people who use the tools of formal language theory forget that they are dealing with the mere signs of more interesting objects and not with the objects of ultimate interest in and of themselves.
As a result, there are a number of deleterious effects that can arise from the extreme pickiness of formal language theory, arising, as is often the case, when formal theorists forget the practical context of theorization.  It frequently happens that the exacting task of defining the membership of a formal language leads one to think that this object and this object alone is the justifiable end of the whole exercise.  The distractions of this mediate objective render one liable to forget that one's penultimate interest lies always with various kinds of equivalence classes of signs, not entirely or exclusively with their more meticulous representatives.

When this happens, one typically goes on working oblivious to the fact that many details about what transpires in the meantime do not matter at all in the end, and one is likely to remain in blissful ignorance of the circumstance that many special details of language membership are bound, destined, and pre-determined to be glossed over with some measure of indifference, especially when it comes down to the final constitution of those equivalence classes of signs that are able to answer for the genuine objects of the whole enterprise of language.  When any form of theory, against its initial and its best intentions, leads to this kind of absence of mind that is no longer beneficial in all of its main effects, the situation calls for an antidotal form of theory, one that can restore the presence of mind that all forms of theory are meant to augment.
The pragmatic theory of sign relations is called for in settings where everything that can be named has many other names, that is to say, in the usual case.  Of course, one would like to replace this superfluous multiplicity of signs with an organized system of canonical signs, one for each object that needs to be denoted, but reducing the redundancy too far, beyond what is necessary to eliminate the factor of "noise" in the language, that is, to clear up its effectively useless distractions, can destroy the very utility of a typical language, which is intended to provide a ready means to express a present situation, clear or not, and to describe an ongoing condition of experience in just the way that it seems to present itself.  Within this fleshed out framework of language, moreover, the process of transforming the manifestations of a sign from its ordinary appearance to its canonical aspect is the whole problem of computation in a nutshell.
It is a well-known truth, but an often forgotten fact, that nobody computes with numbers, but solely with numerals in respect of numbers, and numerals themselves are symbols.  Among other things, this renders all discussion of numeric versus symbolic computation a bit beside the point, since it is only a question of what kinds of symbols are best for one's immediate application or for one's selection of ongoing objectives.  The numerals that everybody knows best are just the canonical symbols, the standard signs or the normal terms for numbers, and the process of computation is a matter of getting from the arbitrarily obscure signs that the data of a situation are capable of throwing one's way to the indications of its character that are clear enough to motivate action.
Having broached the distinction between propositions and sentences, one can see its similarity to the distinction between numbers and numerals.  What are the implications of the foregoing considerations for reasoning about propositions and for the realm of reckonings in sentential logic? If the purpose of a sentence is just to denote a proposition, then the proposition is just the object of whatever sign is taken for a sentence.  This means that the computational manifestation of a piece of reasoning about propositions amounts to a process that takes place entirely within a language of sentences, a procedure that can rationalize its account by referring to the denominations of these sentences among propositions.
The application of these considerations in the immediate setting is this:  Do not worry too much about what roles the empty string "" and the blank symbol "&nbsp;" are supposed to play in a given species of formal languages.  As it happens, it is far less important to wonder whether these types of formal tokens actually constitute genuine sentences than it is to decide what equivalence classes it makes sense to form over all of the sentences in the resulting language, and only then to bother about what equivalence classes these limiting cases of sentences are most conveniently taken to represent.
These concerns about boundary conditions betray a more general issue.  Already by this point in discussion the limits of the purely syntactic approach to a language are beginning to be visible.  It is not that one cannot go a whole lot further by this road in the analysis of a particular language and in the study of languages in general, but when it comes to the questions of understanding the purpose of a language, of extending its usage in a chosen direction, or of designing a language for a particular set of uses, what matters above all else are the ''pragmatic equivalence classes'' of signs that are demanded by the application and intended by the designer, and not so much the peculiar characters of the signs that represent these classes of practical meaning.
Any description of a language is bound to have alternative descriptions.  More precisely, a circumscribed description of a formal language, as any effectively finite description is bound to be, is certain to suggest the equally likely existence and the possible utility of other descriptions.  A single formal grammar describes but a single formal language, but any formal language is described by many different formal grammars, not all of which afford the same grasp of its structure, provide an equivalent comprehension of its character, or yield an interchangeable view of its aspects.  Consequently, even with respect to the same formal language, different formal grammars are typically better for different purposes.
With the distinctions that evolve among the different styles of grammar, and with the preferences that different observers display toward them, there naturally comes the question:  What is the root of this evolution?
One dimension of variation in the styles of formal grammars can be seen by treating the union of languages, and especially the disjoint union of languages, as a ''sum'', by treating the concatenation of languages as a ''product'', and then by distinguishing the styles of analysis that favor ''sums of products'' from those that favor ''products of sums'' as their canonical forms of description.  If one examines the relation between languages and grammars carefully enough to see the presence and the influence of these different styles, and when one comes to appreciate the ways that different styles of grammars can be used with different degrees of success for different purposes, then one begins to see the possibility that alternative styles of description can be based on altogether different linguistic and logical operations.
+
===Generalities About Formal Grammars===
   −
<pre>
+
It is fitting to wrap up the foregoing developments by summarizing the notion of a formal grammar that appeared to evolve in the present caseFor the sake of future reference and the chance of a wider application, it is also useful to try to extract the scheme of a formalization that potentially holds for any formal languageThe following presentation of the notion of a formal grammar is adapted, with minor modifications, from the treatment in (DDQ, 60&ndash;61).
It possible to trace this divergence of styles to an even more primitive
  −
division, one that distinguishes the "additive" or the "parallel" styles
  −
from the "multiplicative" or the "serial" stylesThe issue is somewhat
  −
confused by the fact that an "additive" analysis is typically expressed
  −
in the form of a "series", in other words, a disjoint union of sets or a
  −
linear sum of their independent effects.  But it is easy enough to sort
  −
this out if one observes the more telling connection between "parallel"
  −
and "independent"Another way to keep the right associations straight
  −
is to employ the term "sequential" in preference to the more misleading
  −
term "serial". Whatever one calls this broad division of styles, the
  −
scope and sweep of their dimensions of variation can be delineated in
  −
the following way:
     −
1.  The "additive" or "parallel" styles favor "sums of products" as
+
A ''formal grammar'' <math>\mathfrak{G}</math> is given by a four-tuple <math>\mathfrak{G} = ( \, ^{\backprime\backprime} S \, ^{\prime\prime}, \, \mathfrak{Q}, \, \mathfrak{A}, \, \mathfrak{K} \, )</math> that takes the following form of description:
    canonical forms of expression, pulling sums, unions, co-products,
  −
    and logical disjunctions to the outermost layers of analysis and
  −
    synthesis, while pushing products, intersections, concatenations,
  −
    and logical conjunctions to the innermost levels of articulation
  −
    and generation.  In propositional logic, this style leads to the
  −
    "disjunctive normal form" (DNF).
     −
2.  The "multiplicative" or "serial" styles favor "products of sums"
+
<ol style="list-style-type:decimal">
    as canonical forms of expression, pulling products, intersections,
  −
    concatenations, and logical conjunctions to the outermost layers of
  −
    analysis and synthesis, while pushing sums, unions, co-products,
  −
    and logical disjunctions to the innermost levels of articulation
  −
    and generation.  In propositional logic, this style leads to the
  −
    "conjunctive normal form" (CNF).
     −
There is a curious sort of diagnostic clue, a veritable shibboleth,
+
<li><math>^{\backprime\backprime} S \, ^{\prime\prime}</math> is the ''initial'', ''special'', ''start'', or ''sentence'' symbolSince the letter <math>^{\backprime\backprime} S \, ^{\prime\prime}</math> serves this function only in a special setting, its employment in this role need not create any confusion with its other typical uses as a string variable or as a sentence variable.</li>
that often serves to reveal the dominance of one mode or the other
  −
within an individual thinker's cognitive styleExamined on the
  −
question of what constitutes the "natural numbers", an "additive"
  −
thinker tends to start the sequence at 0, while a "multiplicative"
  −
thinker tends to regard it as beginning at 1.
     −
In any style of description, grammar, or theory of a language, it is
+
<li><math>\mathfrak{Q} = \{ q_1, \ldots, q_m \}</math> is a finite set of ''intermediate symbols'', all distinct from <math>^{\backprime\backprime} S \, ^{\prime\prime}.</math></li>
usually possible to tease out the influence of these contrasting traits,
  −
namely, the "additive" attitude versus the "mutiplicative" tendency that
  −
go to make up the particular style in question, and even to determine the
  −
dominant inclination or point of view that establishes its perspective on
  −
the target domain.
     −
In each style of formal grammar, the "multiplicative" aspect is present
+
<li><math>\mathfrak{A} = \{ a_1, \dots, a_n \}</math> is a finite set of ''terminal symbols'', also known as the ''alphabet'' of <math>\mathfrak{G},</math> all distinct from <math>^{\backprime\backprime} S \, ^{\prime\prime}</math> and disjoint from <math>\mathfrak{Q}.</math> Depending on the particular conception of the language <math>\mathfrak{L}</math> that is ''covered'', ''generated'', ''governed'', or ''ruled'' by the grammar <math>\mathfrak{G},</math> that is, whether <math>\mathfrak{L}</math> is conceived to be a set of words, sentences, paragraphs, or more extended structures of discourse, it is usual to describe <math>\mathfrak{A}</math> as the ''alphabet'', ''lexicon'', ''vocabulary'', ''liturgy'', or ''phrase book'' of both the grammar <math>\mathfrak{G}</math> and the language <math>\mathfrak{L}</math> that it regulates.</li>
in the sequential concatenation of signs, both in the augmented strings
  −
and in the terminal stringsIn settings where the non-terminal symbols
  −
classify types of strings, the concatenation of the non-terminal symbols
  −
signifies the cartesian product over the corresponding sets of strings.
     −
In the context-free style of formal grammar, the "additive" aspect is
+
<li><math>\mathfrak{K}</math> is a finite set of ''characterizations''Depending on how they come into play, these are variously described as ''covering rules'', ''formations'', ''productions'', ''rewrite rules'', ''subsumptions'', ''transformations'', or ''typing rules''.</li>
easy enough to spotIt is signaled by the parallel covering of many
  −
augmented strings or sentential forms by the same non-terminal symbol.
  −
Expressed in active terms, this calls for the independent rewriting
  −
of that non-terminal symbol by a number of different successors,
  −
as in the following scheme:
     −
| q    :>   W_1.
+
</ol>
|
  −
| ...  ...  ...
  −
|
  −
| q    :>    W_k.
     −
It is useful to examine the relationship between the grammatical covering
+
To describe the elements of <math>\mathfrak{K}</math> it helps to define some additional terms:
or production relation ":>" and the logical relation of implication "=>",
  −
with one eye to what they have in common and one eye to how they differ.
  −
The production "q :> W" says that the appearance of the symbol "q" in
  −
a sentential form implies the possibility of exchanging it for "W".
  −
Although this sounds like a "possible implication", to the extent
  −
that "q implies a possible W" or that "q possibly implies W", the
  −
qualifiers "possible" and "possibly" are the critical elements in
  −
these statements, and they are crucial to the meaning of what is
  −
actually being implied.  In effect, these qualifications reverse
  −
the direction of implication, yielding "q <= W" as the best
  −
analogue for the sense of the production.
     −
One way to sum this up is to say that non-terminal symbols have the
+
<ol style="list-style-type:lower-latin">
significance of hypotheses.  The terminal strings form the empirical
  −
matter of a language, while the non-terminal symbols mark the patterns
  −
or the types of substrings that can be noticed in the profusion of data.
  −
If one observes a portion of a terminal string that falls into the pattern
  −
of the sentential form W, then it is an admissable hypothesis, according to
  −
the theory of the language that is constituted by the formal grammar, that
  −
this piece not only fits the type q but even comes to be generated under
  −
the auspices of the non-terminal symbol "q".
     −
A moment's reflection on the issue of style, giving due consideration to the
+
<li>The symbols in <math>\{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup \mathfrak{Q} \cup \mathfrak{A}</math> form the ''augmented alphabet'' of <math>\mathfrak{G}.</math></li>
received array of stylistic choices, ought to inspire at least the question:
  −
"Are these the only choices there are?"  In the present setting, there are
  −
abundant indications that other options, more differentiated varieties of
  −
description and more integrated ways of approaching individual languages,
  −
are likely to be conceivable, feasible, and even more ultimately viable.
  −
If a suitably generic style, one that incorporates the full scope of
  −
logical combinations and operations, is broadly available, then it
  −
would no longer be necessary, or even apt, to argue in universal
  −
terms about "which style is best", but more useful to investigate
  −
how we might adapt the local styles to the local requirements.
  −
The medium of a generic style would yield a viable compromise
  −
between "additive" and "multiplicative" canons, and render the
  −
choice between "parallel" and "serial" a false alternative,
  −
at least, when expressed in the globally exclusive terms
  −
that are currently most commonly adopted to pose it.
     −
One set of indications comes from the study of machines, languages, and
+
<li>The symbols in <math>\{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup \mathfrak{Q}</math> are the ''non-terminal symbols'' of <math>\mathfrak{G}.</math></li>
computation, especially the theories of their structures and relations.
  −
The forms of composition and decomposition that are generally known as
  −
"parallel" and "serial" are merely the extreme special cases, in variant
  −
directions of specialization, of a more generic form, usually called the
  −
"cascade" form of combination.  This is a well-known fact in the theories
  −
that deal with automata and their associated formal languages, but its
  −
implications do not seem to be widely appreciated outside these fields.
  −
In particular, it dispells the need to choose one extreme or the other,
  −
since most of the natural cases are likely to exist somewhere in between.
     −
Another set of indications appears in algebra and category theory,
+
<li>The symbols in <math>\mathfrak{Q} \cup \mathfrak{A}</math> are the ''non-initial symbols'' of <math>\mathfrak{G}.</math></li>
where forms of composition and decomposition related to the cascade
  −
combination, namely, the "semi-direct product" and its special case,
  −
the "wreath product", are encountered at higher levels of generality
  −
than the cartesian products of sets or the direct products of spaces.
     −
In these domains of operation, one finds it necessary to consider also
+
<li>The strings in <math>( \{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup \mathfrak{Q} \cup \mathfrak{A} )^*</math> are the ''augmented strings'' for <math>\mathfrak{G}.</math></li>
the "co-product" of sets and spaces, a construction that artificially
  −
creates a disjoint union of sets, that is, a union of spaces that are
  −
being treated as independent. It does this, in effect, by "indexing",
  −
"coloring", or "preparing" the otherwise possibly overlapping domains
  −
that are being combined.  What renders this a "chimera" or a "hybrid"
  −
form of combination is the fact that this indexing is tantamount to a
  −
cartesian product of a singleton set, namely, the conventional "index",
  −
"color", or "affix" in question, with the individual domain that is
  −
entering as a factor, a term, or a participant in the final result.
     −
One of the insights that arises out of Peirce's logical work is that
+
<li>The strings in <math>\{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup (\mathfrak{Q} \cup \mathfrak{A})^*</math> are the ''sentential forms'' for <math>\mathfrak{G}.</math></li>
the set operations of complementation, intersection, and union, along
  −
with the logical operations of negation, conjunction, and disjunction
  −
that operate in isomorphic tandem with them, are not as fundamental as
  −
they first appear.  This is because all of them can be constructed from
  −
or derived from a smaller set of operations, in fact, taking the logical
  −
side of things, from either one of two "solely sufficient" operators,
  −
called "amphecks" by Peirce, "strokes" by those who re-discovered them
  −
later, and known in computer science as the NAND and the NNOR operators.
  −
For this reason, that is, by virtue of their precedence in the orders
  −
of construction and derivation, these operations have to be regarded
  −
as the simplest and the most primitive in principle, even if they are
  −
scarcely recognized as lying among the more familiar elements of logic.
     −
I am throwing together a wide variety of different operations into each
+
</ol>
of the bins labeled "additive" and "multiplicative", but it is easy to
  −
observe a natural organization and even some relations approaching
  −
isomorphisms among and between the members of each class.
     −
The relation between logical disjunction and set-theoretic union and the
+
Each characterization in <math>\mathfrak{K}</math> is an ordered pair of strings <math>(S_1, S_2)\!</math> that takes the following form:
relation between logical conjunction and set-theoretic intersection ought
  −
to be clear enough for the purposes of the immediately present context.
  −
In any case, all of these relations are scheduled to receive a thorough
  −
examination in a subsequent discussion (Subsection 1.3.10.13).  But the
  −
relation of a set-theoretic union to a category-theoretic co-product and
  −
the relation of a set-theoretic intersection to a syntactic concatenation
  −
deserve a closer look at this point.
     −
The effect of a co-product as a "disjointed union", in other words, that
+
{| align="center" cellpadding="8" width="90%"
creates an object tantamount to a disjoint union of sets in the resulting
+
| <math>S_1 \ = \ Q_1 \cdot q \cdot Q_2,</math>
co-product even if some of these sets intersect non-trivially and even if
+
|-
some of them are identical "in reality", can be achieved in several ways.
+
| <math>S_2 \ = \ Q_1 \cdot W \cdot Q_2.</math>
The most usual conception is that of making a "separate copy", for each
+
|}
part of the intended co-product, of the set that is intended to go there.
  −
Often one thinks of the set that is assigned to a particular part of the
  −
co-product as being distinguished by a particular "color", in other words,
  −
by the attachment of a distinct "index", "label", or "tag", being a marker
  −
that is inherited by and passed on to every element of the set in that part.
  −
A concrete image of this construction can be achieved by imagining that each
  −
set and each element of each set is placed in an ordered pair with the sign
  −
of its color, index, label, or tag.  One describes this as the "injection"
  −
of each set into the corresponding "part" of the co-product.
     −
For example, given the sets P and Q, overlapping or not, one can define
+
In this scheme, <math>S_1\!</math> and <math>S_2\!</math> are members of the augmented strings for <math>\mathfrak{G},</math> more precisely, <math>S_1\!</math> is a non-empty string and a sentential form over <math>\mathfrak{G},</math> while <math>S_2\!</math> is a possibly empty string and also a sentential form over <math>\mathfrak{G}.</math>
the "indexed" sets or the "marked" sets P_[1] and Q_[2], amounting to the
  −
copy of P into the first part of the co-product and the copy of Q into the
  −
second part of the co-product, in the following manner:
     −
P_[1]  =  <P, 1> {<x, 1> :  x in P},
+
Here also, <math>q\!</math> is a non-terminal symbol, that is, <math>q \in \{ \, ^{\backprime\backprime} S \, ^{\prime\prime} \, \} \cup \mathfrak{Q},</math> while <math>Q_1, Q_2,\!</math> and <math>W\!</math> are possibly empty strings of non-initial symbols, a fact that can be expressed in the form, <math>Q_1, Q_2, W \in (\mathfrak{Q} \cup \mathfrak{A})^*.</math>
   −
Q_[2]  =  <Q, 2> =  {<x, 2> :  x in Q}.
+
In practice, the couplets in <math>\mathfrak{K}</math> are used to ''derive'', to ''generate'', or to ''produce'' sentences of the corresponding language <math>\mathfrak{L} = \mathfrak{L} (\mathfrak{G}).</math> The language <math>\mathfrak{L}</math> is then said to be ''governed'', ''licensed'', or ''regulated'' by the grammar <math>\mathfrak{G},</math> a circumstance that is expressed in the form <math>\mathfrak{L} = \langle \mathfrak{G} \rangle.</math>  In order to facilitate this active employment of the grammar, it is conventional to write the abstract characterization <math>(S_1, S_2)\!</math> and the specific characterization <math>(Q_1 \cdot q \cdot Q_2, \ Q_1 \cdot W \cdot Q_2)</math> in the following forms, respectively:
   −
Using the sign "]_[" for this construction, the "sum", the "co-product",
+
{| align="center" cellpadding="8" width="90%"
or the "disjointed union" of P and Q in that order can be represented as
+
|
the ordinary disjoint union of P_[1] and Q_[2].
+
<math>\begin{array}{lll}
 +
S_1
 +
& :>
 +
& S_2
 +
\\
 +
Q_1 \cdot q \cdot Q_2
 +
& :>
 +
& Q_1 \cdot W \cdot Q_2
 +
\\
 +
\end{array}</math>
 +
|}
   −
P ]_[ Q  =  P_[1] |_| Q_[2].
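
The tagging construction can be checked directly on small finite sets.  The following Python fragment is a minimal sketch of the idea, added here only by way of illustration;  the names "tag" and "coproduct" are ad hoc and do not come from the text.

<pre>
# A minimal sketch, assuming finite sets represented as Python sets.
# The names tag() and coproduct() are illustrative, not from the text.

def tag(s, index):
    """Copy the set s into the part of the co-product marked by index."""
    return {(x, index) for x in s}

def coproduct(p, q):
    """Disjoint union of p and q, even if p and q overlap."""
    return tag(p, 1) | tag(q, 2)

P = {1, 2, 3}
Q = {2, 3, 4}

print(sorted(coproduct(P, Q)))
# [(1, 1), (2, 1), (2, 2), (3, 1), (3, 2), (4, 2)]
# Six tagged elements, although the ordinary union P | Q has only four.
</pre>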
+
In this usage, the characterization <math>S_1 :> S_2\!</math> is tantamount to a grammatical license to transform a string of the form <math>Q_1 \cdot q \cdot Q_2</math> into a string of the form <math>Q_1 \cdot W \cdot Q_2,</math> in effect, replacing the non-terminal symbol <math>q\!</math> with the non-initial string <math>W\!</math> in any selected, preserved, and closely adjoining context of the form <math>Q_1 \cdot \underline{\quad} \cdot Q_2.</math>  In this application, the notation <math>S_1 :> S_2\!</math> can be read to say that <math>S_1\!</math> ''produces'' <math>S_2\!</math> or that <math>S_1\!</math> ''transforms into'' <math>S_2.\!</math>
   −
The concatenation L_1 · L_2 of the formal languages L_1 and L_2 is
+
An ''immediate derivation'' in <math>\mathfrak{G}\!</math> is an ordered pair <math>(W, W^\prime)\!</math> of sentential forms in <math>\mathfrak{G}\!</math> such that:
just the cartesian product of sets L_1 x L_2 without the extra x's,
  −
but the relation of cartesian products to set-theoretic intersections
  −
and thus to logical conjunctions is far from being clear.  One way of
  −
seeing a type of relation is to focus on the information that is needed
  −
to specify each construction, and thus to reflect on the signs that are
  −
used to carry this information.  As a first approach to the topic of
  −
information, according to a strategy that seeks to be as elementary
  −
and as informal as possible, I introduce the following set of ideas,
  −
intended to be taken in a very provisional way.
     −
A "stricture" is a specification of a certain set in a certain place,
+
{| align="center" cellpadding="8" width="90%"
relative to a number of other sets, yet to be specified.  It is assumed
+
|
that one knows enough to tell if two strictures are equivalent as pieces
+
<math>\begin{array}{llll}
of information, but any more determinate indications, like names for the
+
W = Q_1 \cdot X \cdot Q_2,
places that are mentioned in the stricture, or bounds on the number of
+
& W' = Q_1 \cdot Y \cdot Q_2,
places that are involved, are regarded as being extraneous impositions,
+
& \text{and}
outside the proper concern of the definition, no matter how convenient
+
& (X, Y) \in \mathfrak{K}.
they are found to be for a particular discussion.  As a schematic form
+
\end{array}</math>
of illustration, a stricture can be pictured in the following shape:
+
|}
   −
"... x X x Q x X x ..."
+
As noted above, it is usual to express the condition <math>(X, Y) \in \mathfrak{K}</math> by writing <math>X :> Y \, \text{in} \, \mathfrak{G}.</math>
   −
A "strait" is the object that is specified by a stricture, in effect,
+
The immediate derivation relation is indicated by saying that <math>W\!</math> ''immediately derives'' <math>W',\!</math> by saying that <math>W'\!</math> is ''immediately derived'' from <math>W\!</math> in <math>\mathfrak{G},</math> and also by writing:
a certain set in a certain place of an otherwise yet to be specified
  −
relation.  Somewhat sketchily, the strait that corresponds to the
  −
stricture just given can be pictured in the following shape:
     −
... x X x Q x X x ...
+
{| align="center" cellpadding="8" width="90%"
 +
| <math>W ::> W'.\!</math>
 +
|}
   −
In this picture, Q is a certain set, and X is the universe of discourse
+
A ''derivation'' in <math>\mathfrak{G}</math> is a finite sequence <math>(W_1, \ldots, W_k)\!</math> of sentential forms over <math>\mathfrak{G}</math> such that each adjacent pair <math>(W_j, W_{j+1})\!</math> of sentential forms in the sequence is an immediate derivation in <math>\mathfrak{G},</math> in other words, such that:
that is relevant to a given discussion.  Since a stricture does not, by
  −
itself, contain a sufficient amount of information to specify the number
  −
of sets that it intends to set in place, or even to specify the absolute
  −
location of the set that it does set in place, it appears to place an
  −
unspecified number of unspecified sets in a vague and uncertain strait.
  −
Taken out of its interpretive context, the residual information that a
  −
stricture can convey makes all of the following potentially equivalent
  −
as strictures:
     −
"Q""XxQxX""XxXxQxXxX",   ...
+
{| align="center" cellpadding="8" width="90%"
 +
| <math>W_j ::> W_{j+1},\ \text{for all}\ j = 1\ \text{to}\ k - 1.</math>
 +
|}
   −
With respect to what these strictures specify, this
+
If there exists a derivation <math>(W_1, \ldots, W_k)\!</math> in <math>\mathfrak{G},</math> one says that <math>W_1\!</math> ''derives'' <math>W_k\!</math> in <math>\mathfrak{G}</math> or that <math>W_k\!</math> is ''derivable'' from <math>W_1\!</math> in <math>\mathfrak{G},</math> and one
leaves all of the following equivalent as straits:
+
typically summarizes the derivation by writing:
   −
Q  =  XxQxX  =  XxXxQxXxX  =  ...
+
{| align="center" cellpadding="8" width="90%"
 +
| <math>W_1 :\!*\!:> W_k.\!</math>
 +
|}
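
As a concrete illustration of how the relations of immediate derivation and derivability operate, the following Python sketch applies the characterizations of a small toy grammar to rewrite strings.  Everything in it is hypothetical, the grammar, the function names, and the crude bounded search alike, and it is meant only to exhibit the mechanics of replacing a non-terminal symbol within a preserved context.

<pre>
# A toy grammar, assumed only for illustration:  "S" is the initial symbol,
# "T" is an intermediate symbol, and the terminal alphabet is {"(", ")"}.
productions = [          # each pair (q, W) licenses the production q :> W
    ("S", "T"),
    ("T", "(T)"),
    ("T", ""),
]

def immediate_derivations(w):
    """All w2 with w ::> w2, i.e. one production applied at one occurrence."""
    results = set()
    for q, replacement in productions:
        start = w.find(q)
        while start != -1:
            results.add(w[:start] + replacement + w[start + len(q):])
            start = w.find(q, start + 1)
    return results

def derives(w1, w2, depth=6):
    """Crude bounded test of w1 :*:> w2 by breadth-first rewriting."""
    frontier = {w1}
    for _ in range(depth):
        if w2 in frontier:
            return True
        frontier = {v for w in frontier for v in immediate_derivations(w)}
    return w2 in frontier

print(derives("S", "(())"))   # True:  S ::> T ::> (T) ::> ((T)) ::> (())
</pre>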
   −
Within the framework of a particular discussion, it is customary to
+
The language <math>\mathfrak{L} = \mathfrak{L} (\mathfrak{G}) = \langle \mathfrak{G} \rangle</math> that is ''generated'' by the formal grammar <math>\mathfrak{G} = ( \, ^{\backprime\backprime} S \, ^{\prime\prime}, \, \mathfrak{Q}, \, \mathfrak{A}, \, \mathfrak{K} \, )</math> is the set of strings over the terminal alphabet <math>\mathfrak{A}</math> that are derivable from the initial symbol <math>^{\backprime\backprime} S \, ^{\prime\prime}</math> by way of the intermediate symbols in <math>\mathfrak{Q}</math> according to the characterizations in <math>\mathfrak{K}.</math> In sum:
set a bound on the number of places and to limit the variety of sets
  −
that are regarded as being under active consideration, and it is also
  −
convenient to index the places of the indicated relations, and of their
  −
encompassing cartesian products, in some fixed way.  But the whole idea
  −
of a stricture is to specify a strait that is capable of extending through
  −
and beyond any fixed frame of discussion.  In other words, a stricture is
  −
conceived to constrain a strait at a certain point, and then to leave it
  −
literally embedded, if tacitly expressed, in a yet to be fully specified
  −
relation, one that involves an unspecified number of unspecified domains.
     −
A quantity of information is a measure of constraint.  In this respect,
+
{| align="center" cellpadding="8" width="90%"
a set of comparable strictures is ordered on account of the information
+
| <math>\mathfrak{L} (\mathfrak{G}) \ = \ \langle \mathfrak{G} \rangle \ = \ \{ \, W \in \mathfrak{A}^* \, : \, ^{\backprime\backprime} S \, ^{\prime\prime} \, :\!*\!:> \, W \, \}.</math>
that each one conveys, and a system of comparable straits is ordered in
+
|}
accord with the amount of information that it takes to pin each one of
  −
them down.  Strictures that are more constraining and straits that are
  −
more constrained are placed at higher levels of information than those
  −
that are less so, and entities that involve more information are said
  −
to have a greater "complexity" in comparison with those entities that
  −
involve less information, that are said to have a greater "simplicity".
  −
 
  −
In order to create a concrete example, let me now institute a frame of
  −
discussion where the number of places in a relation is bounded at two,
  −
and where the variety of sets under active consideration is limited to
  −
the typical subsets P and Q of a universe X. Under these conditions,
  −
one can use the following sorts of expression as schematic strictures:
     −
|  "X" ,   "P" ,   "Q" ,
+
Finally, a string <math>W\!</math> is called a ''word'', a ''sentence'', or so on, of the language generated by <math>\mathfrak{G}</math> if and only if <math>W\!</math> is in <math>\mathfrak{L} (\mathfrak{G}).</math>
|
  −
| "XxX", "XxP",  "XxQ",
  −
|
  −
| "PxX",  "PxP",  "PxQ",
  −
|
  −
| "QxX",  "QxP",  "QxQ".
     −
These strictures and their corresponding straits are stratified according
+
===The Cactus Language : Stylistics===
to their amounts of information, or their levels of constraint, as follows:
     −
| High:  "PxP",  "PxQ",  "QxP",  "QxQ".
+
{| align="center" cellpadding="0" cellspacing="0" width="90%"
 
|
 
|
| Med:    "P" "XxP", "PxX".
+
<p>As a result, we can hardly conceive of how many possibilities there are for what we call objective reality. Our sharp quills of knowledge are so narrow and so concentrated in particular directions that with science there are myriads of totally different real worlds, each one accessible from the next simply by slight alterations &mdash; shifts of gaze &mdash; of every particular discipline and subspecialty.
|
+
</p>
| Med:    "Q" , "XxQ", "QxX".
+
|-
|
+
| align="right" | &mdash; Herbert J. Bernstein, "Idols of Modern Science", [HJB, 38]
| Low:    "X" ,  "XxX".
+
|}
   −
Within this framework, the more complex strait PxQ can be expressed
+
This Subsection highlights an issue of ''style'' that arises in describing a formal language.  In broad terms, I use the word ''style'' to refer to a loosely specified class of formal systems, typically ones that have a set of distinctive features in common.  For instance, a style of proof system usually dictates one or more rules of inference that are acknowledged as conforming to that styleIn the present context, the word ''style'' is a natural choice to characterize the varieties of formal grammars, or any other sorts of formal systems that can be contemplated for deriving the sentences of a formal language.
in terms of the simpler straits, PxX and XxQMore specifically,
  −
it lends itself to being analyzed as their intersection, in the
  −
following way:
     −
PxQ = PxX |^| XxQ.
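
For small finite sets, this decomposition can be verified directly, as in the following Python sketch, which is offered purely as an illustration of the identity PxQ = PxX |^| XxQ.

<pre>
# A direct check of PxQ = PxX |^| XxQ on small finite sets, illustration only.
from itertools import product

X = {0, 1, 2}
P = {0, 1}
Q = {1, 2}

PxQ = set(product(P, Q))
PxX = set(product(P, X))
XxQ = set(product(X, Q))

print(PxQ == PxX & XxQ)   # True:  the product equals the intersection
                          # of the simpler straits P x X and X x Q.
</pre>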
+
In looking at what seems like an incidental issue, the discussion arrives at a critical point. The question is: What decides the issue of style?  Taking a given language as the object of discussion, what factors enter into and determine the choice of a style for its presentation, that is, a particular way of arranging and selecting the materials that come to be involved in a description, a grammar, or a theory of the language?  To what degree is the determination accidental, empirical, pragmatic, rhetorical, or stylistic, and to what extent is the choice essential, logical, and necessary?  For that matter, what determines the order of signs in a word, a sentence, a text, or a discussion?  All of the corresponding parallel questions about the character of this choice can be posed with regard to the constituent part as well as with regard to the main constitution of the formal language.
   −
From here it is easy to see the relation of concatenation, by virtue of
+
In order to answer this sort of question, at any level of articulation, one has to inquire into the type of distinction that it invokes, between arrangements and orders that are essential, logical, and necessary and orders and arrangements that are accidental, rhetorical, and stylistic.  As a rough guide to its comprehension, a ''logical order'', if it resides in the subject at all, can be approached by considering all of the ways of saying the same things, in all of the languages that are capable of saying roughly the same things about that subject. Of course, the ''all'' that appears in this rule of thumb has to be interpreted as a fittingly qualified sort of universal.  For all practical purposes, it simply means ''all of the ways that a person can think of'' and ''all of the languages that a person can conceive of'', with all things being relative to the particular moment of investigation.  For all of these reasons, the rule must stand as little more than a rough idea of how to approach its object.
these types of intersection, to the logical conjunction of propositions.
  −
The cartesian product PxQ is described by a conjunction of propositions,
  −
namely, "P_<1> and Q_<2>", subject to the following interpretation:
     −
1"P_<1>" asserts that there is an element from
+
If it is demonstrated that a given formal language can be presented in any one of several styles of formal grammar, then the choice of a format is accidental, optional, and stylistic to the very extent that it is freeBut if it can be shown that a particular language cannot be successfully presented in a particular style of grammar, then the issue of style is no longer free and rhetorical, but becomes to that very degree essential, necessary, and obligatory, in other words, a question of the objective logical order that can be found to reside in the object language.
    the set P in the first place of the product.
     −
2.  "Q_<2>" asserts that there is an element from
+
As a rough illustration of the difference between logical and rhetorical orders, consider the kinds of order that are expressed and exhibited in the following conjunction of implications:
    the set Q in the second place of the product.
     −
The integration of these two pieces of information can be taken
+
{| align="center" cellpadding="8" width="90%"
in that measure to specify a yet to be fully determined relation.
+
| <math>X \Rightarrow Y\ \operatorname{and}\ Y \Rightarrow Z.</math>
 +
|}
   −
In a corresponding fashion at the level of the elements,
+
Here, there is a happy conformity between the logical content and the rhetorical form, indeed, to such a degree that one hardly notices the difference between them.  The rhetorical form is given by the order of sentences in the two implications and the order of implications in the conjunction.  The logical content is given by the order of propositions in the extended implicational sequence:
the ordered pair <p, q> is described by a conjunction
  −
of propositions, namely, "p_<1> and q_<2>", subject
  −
to the following interpretation:
     −
1. "p_<1>" says that p is in the first place
+
{| align="center" cellpadding="8" width="90%"
    of the product element under construction.
+
| <math>X\ \le\ Y\ \le\ Z.</math>
 +
|}
   −
2.  "q_<2>" says that q is in the second place
+
To see the difference between form and content, or manner and matter, it is enough to observe a few of the ways that the expression can be varied without changing its meaning, for example:
    of the product element under construction.
     −
Notice that, in construing the cartesian product of the sets P and Q or the
+
{| align="center" cellpadding="8" width="90%"
concatenation of the languages L_1 and L_2 in this way, one shifts the level
+
| <math>Z \Leftarrow Y\ \operatorname{and}\ Y \Leftarrow X.</math>
of the active construction from the tupling of the elements in P and Q or the
+
|}
concatenation of the strings that are internal to the languages L_1 and L_2 to
  −
the concatenation of the external signs that it takes to indicate these sets or
  −
these languages, in other words, passing to a conjunction of indexed propositions,
  −
"P_<1> and Q_<2>", or to a conjunction of assertions, "L_1_<1> and L_2_<2>", that
  −
marks the sets or the languages in question for insertion in the indicated places
  −
of a product set or a product language, respectively.  In effect, the subscripting
  −
by the indices "<1>" and "<2>" can be recognized as a special case of concatenation,
  −
albeit through the posting of editorial remarks from an external "mark-up" language.
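
The point that rhetorical rearrangements of this kind leave the logical content untouched can be checked mechanically.  The following Python sketch, offered only as an illustration, compares the two conjunctions of implications discussed above with the extended ordering <math>X\ \le\ Y\ \le\ Z</math> over all Boolean assignments.

<pre>
# A brute-force check, over all Boolean assignments, that the variant
# statements express one and the same logical content.
from itertools import product

def implies(a, b):
    return (not a) or b

forms_agree = all(
    (implies(x, y) and implies(y, z))       # X => Y  and  Y => Z
    == (implies(y, z) and implies(x, y))    # Z <== Y  and  Y <== X  (reversed statement)
    == (x <= y <= z)                        # the extended ordering X <= Y <= Z
    for x, y, z in product([False, True], repeat=3)
)
print(forms_agree)   # True
</pre>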
     −
In order to systematize the relations that strictures and straits placed
+
Any style of declarative programming, also called ''logic programming'', depends on a capacity, as embodied in a programming language or other formal system, to describe the relation between problems and solutions in logical terms.  A recurring problem in building this capacity is in bridging the gap between ostensibly non-logical orders and the logical orders that are used to describe and to represent them.  For instance, to mention just a couple of the most pressing cases, and the ones that are currently proving to be the most resistant to a complete analysis, one has the orders of dynamic evolution and rhetorical transition that manifest themselves in the process of inquiry and in the communication of its results.
at higher levels of complexity, constraint, information, and organization
  −
have with those that are placed at the associated lower levels, I introduce
  −
the following pair of definitions:
     −
The j^th "excerpt" of a stricture of the form "S_1 x ... x S_k", regarded
+
This patch of the ongoing discussion is concerned with describing a particular variety of formal languages, whose typical representative is the painted cactus language <math>\mathfrak{L} = \mathfrak{C} (\mathfrak{P}).\!</math>  It is the intention of this work to interpret this language for propositional logic, and thus to use it as a sentential calculus, an order of reasoning that forms an active ingredient and a significant component of all logical reasoning.  To describe this language, the standard devices of formal grammars and formal language theory are more than adequate, but this only raises the next question:  What sorts of devices are exactly adequate, and fit the task to a "T"? The ultimate desire is to turn the tables on the order of description, and so begins a process of eversion that evolves to the point of asking:  To what extent can the language capture the essential features and laws of its own grammar and describe the active principles of its own generation?  In other words:  How well can the language be described by using the language itself to do so?
within a frame of discussion where the number of places is limited to k,
  −
is the stricture of the form "X x ... x S_j x ... x X". In the proper
  −
context, this can be written more succinctly as the stricture "S_j_<j>",
  −
an assertion that places the j^th set in the j^th place of the product.
     −
The j^th "extract" of a strait of the form S_1 x ... x S_k, constrained
+
In order to speak to these questions, I have to express what a grammar says about a language in terms of what a language can say on its own. In effect, it is necessary to analyze the kinds of meaningful statements that grammars are capable of making about languages in general and to relate them to the kinds of meaningful statements that the syntactic ''sentences'' of the cactus language might be interpreted as making about the very same topicsSo far in the present discussion, the sentences of the cactus language do not make any meaningful statements at all, much less any meaningful statements about languages and their constitutions.  As of yet, these sentences subsist in the form of purely abstract, formal, and uninterpreted combinatorial constructions.
to a frame of discussion where the number of places is restricted to k,
  −
is the strait of the form X x ... x S_j x ... x XIn the appropriate
  −
context, this can be denoted more succinctly by the stricture "S_j_<j>",
  −
an assertion that places the j^th set in the j^th place of the product.
     −
In these terms, a stricture of the form "S_1 x ... x S_k"
+
Before the capacity of a language to describe itself can be evaluated, the missing link to meaning has to be supplied for each of its strings.  This calls for a dimension of semantics and a notion of interpretation, topics that are taken up for the case of the cactus language <math>\mathfrak{C} (\mathfrak{P})</math> in Subsection 1.3.10.12.  Once a plausible semantics is prescribed for this language it will be possible to return to these questions and to address them in a meaningful way.
can be expressed in terms of simpler strictures, to wit,
  −
as a conjunction of its k excerpts:
     −
"S_1 x ... x S_k"  =  "S_1_<1>" & ...  & "S_k_<k>".
+
The prominent issue at this point is the distinct placements of formal languages and formal grammars with respect to the question of meaning. The sentences of a formal language are merely the abstract strings of abstract signs that happen to belong to a certain set. They do not by themselves make any meaningful statements at all, not without mounting a separate effort of interpretation, but the rules of a formal grammar make meaningful statements about a formal language, to the extent that they say what strings belong to it and what strings do notThus, the formal grammar, a formalism that appears to be even more skeletal than the formal language, still has bits and pieces of meaning attached to it. In a sense, the question of meaning is factored into two parts, structure and value, leaving the aspect of value reduced in complexity and subtlety to the simple question of belongingWhether this single bit of meaningful value is enough to encompass all of the dimensions of meaning that we require, and whether it can be compounded to cover the complexity that actually exists in the realm of meaning &mdash; these are questions for an extended future inquiry.
   −
In a similar vein, a strait of the form S_1 x ... x S_k
+
Perhaps I ought to comment on the differences between the present and the standard definition of a formal grammar, since I am attempting to strike a compromise with several alternative conventions of usage, and thus to leave certain options open for future exploration.  All of the changes are minor, in the sense that they are not intended to alter the classes of languages that are able to be generated, but only to clear up various ambiguities and sundry obscurities that affect their conception.
can be expressed in terms of simpler straits, namely,
  −
as an intersection of its k extracts:
     −
  S_1 x ... x S_k    =    S_1_<1> |^| ... |^| S_k_<k>.
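
The identity between a product of k sets and the intersection of its k extracts can likewise be tested on small finite sets.  The following Python sketch is an illustration only;  the function name "extract" simply mirrors the terminology introduced above.

<pre>
# A sketch, for small finite sets, of the identity between a k-place product
# S_1 x ... x S_k and the intersection of its k extracts.  Names are ad hoc.
from itertools import product

def extract(sets, universe, j):
    """The j-th extract:  X x ... x S_j x ... x X."""
    factors = [universe] * len(sets)
    factors[j] = sets[j]
    return set(product(*factors))

X = {0, 1, 2}
S = [{0, 1}, {1, 2}, {2}]     # S_1, S_2, S_3 as subsets of X

whole_product = set(product(*S))
meet_of_extracts = set.intersection(*(extract(S, X, j) for j in range(len(S))))

print(whole_product == meet_of_extracts)   # True
</pre>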
+
Primarily, the conventional scope of non-terminal symbols was expanded to encompass the sentence symbol, mainly on account of all the contexts where the initial and the intermediate symbols are naturally invoked in the same breath. By way of compensating for the usual exclusion of the sentence symbol from the non-terminal class, an equivalent distinction was introduced in the fashion of a distinction between the initial and the intermediate symbols, and this serves its purpose in all of those contexts where the two kinds of symbols need to be treated separately.
   −
There is a measure of ambiguity that remains in this formulation,
+
At the present point, I remain a bit worried about the motivations and the justifications for introducing this distinction, under any name, in the first place.  It is purportedly designed to guarantee that the process of derivation at least gets started in a definite direction, while the real questions have to do with how it all ends.  The excuses of efficiency and expediency that I offered as plausible and sufficient reasons for distinguishing between empty and significant sentences are likely to be ephemeral, if not entirely illusory, since intermediate symbols are still permitted to characterize or to cover themselves, not to mention being allowed to cover the empty string, and so the very types of traps that one exerts oneself to avoid at the outset are always there to afflict the process at all of the intervening times.
but it is the best that I can do in the present informal context.
  −
</pre>
     −
==The Cactus Language : Mechanics==
+
If one reflects on the form of grammar that is being prescribed here, it looks as if one sought, rather futilely, to avoid the problems of recursion by proscribing the main program from calling itself, while allowing any subprogram to do so.  But any trouble that is avoidable in the part is also avoidable in the main, while any trouble that is inevitable in the part is also inevitable in the main.  Consequently, I am reserving the right to change my mind at a later stage, perhaps to permit the initial symbol to characterize, to cover, to regenerate, or to produce itself, if that turns out to be the best way in the end.
   −
{| align="center" cellpadding="0" cellspacing="0" width="90%"
+
Before I leave this Subsection, I need to say a few things about the manner in which the abstract theory of formal languages and the pragmatic theory of sign relations interact with each other.
|
  −
<p>We are only now beginning to see how this works.  Clearly one of the mechanisms for picking a reality is the sociohistorical sense of what is important &mdash; which research program, with all its particularity of knowledge, seems most fundamental, most productive, most penetrating.  The very judgments which make us push narrowly forward simultaneously make us forget how little we know.  And when we look back at history, where the lesson is plain to find, we often fail to imagine ourselves in a parallel situation.  We ascribe the differences in world view to error, rather than to unexamined but consistent and internally justified choice.</p>
  −
|-
  −
| align="right" | &mdash; Herbert J. Bernstein, "Idols of Modern Science", [HJB, 38]
  −
|}
     −
In this Subsection, I discuss the ''mechanics'' of parsing the cactus language into the corresponding class of computational data structuresThis provides each sentence of the language with a translation into a computational form that articulates its syntactic structure and prepares it for automated modes of processing and evaluation.  For this purpose, it is necessary to describe the target data structures at a fairly high level of abstraction only, ignoring the details of address pointers and record structures and leaving the more operational aspects of implementation to the imagination of prospective programmers.  In this way, I can put off to another stage of elaboration and refinement the description of the program that constructs these pointers and operates on these graph-theoretic data structures.
+
Formal language theory can seem like an awfully picky subject at times, treating every symbol as a thing in itself the way it does, sorting out the nominal types of symbols as objects in themselves, and singling out the passing tokens of symbols as distinct entities in their own rights.  It has to continue doing this, if not for any better reason than to aid in clarifying the kinds of languages that people are accustomed to use, to assist in writing computer programs that are capable of parsing real sentences, and to serve in designing programming languages that people would like to become accustomed to useAs a matter of fact, the only time that formal language theory becomes too picky, or a bit too myopic in its focus, is when it leads one to think that one is dealing with the thing itself and not just with the sign of it, in other words, when the people who use the tools of formal language theory forget that they are dealing with the mere signs of more interesting objects and not with the objects of ultimate interest in and of themselves.
   −
The structure of a ''painted cactus'', insofar as it presents itself to the visual imagination, can be described as follows.  The overall structure, as given by its underlying graph, falls within the species of graph that is commonly known as a ''rooted cactus'', and the only novel feature that it adds to this is that each of its nodes can be ''painted'' with a finite sequence of ''paints'', chosen from a ''palette'' that is given by the parametric set <math>\{ \, ^{\backprime\backprime} \operatorname{~} ^{\prime\prime} \, \} \cup \mathfrak{P} = \{ m_1 \} \cup \{ p_1, \ldots, p_k \}.</math>
+
As a result, there are a number of deleterious effects that can arise from the extreme pickiness of formal language theory, as often happens when formal theorists forget the practical context of theorization.  It frequently happens that the exacting task of defining the membership of a formal language leads one to think that this object and this object alone is the justifiable end of the whole exercise.  The distractions of this mediate objective render one liable to forget that one's penultimate interest lies always with various kinds of equivalence classes of signs, not entirely or exclusively with their more meticulous representatives.
   −
It is conceivable, from a purely graph-theoretical point of view, to have a class of cacti that are painted but not rooted, and so it is frequently necessary, for the sake of precision, to more exactly pinpoint the target species of graphical structure as a ''painted and rooted cactus'' (PARC).
+
When this happens, one typically goes on working oblivious to the fact that many details about what transpires in the meantime do not matter at all in the end, and one is likely to remain in blissful ignorance of the circumstance that many special details of language membership are bound, destined, and pre-determined to be glossed over with some measure of indifference, especially when it comes down to the final constitution of those equivalence classes of signs that are able to answer for the genuine objects of the whole enterprise of language.  When any form of theory, against its initial and its best intentions, leads to this kind of absence of mind that is no longer beneficial in all of its main effects, the situation calls for an antidotal form of theory, one that can restore the presence of mind that all forms of theory are meant to augment.
   −
A painted cactus, as a rooted graph, has a distinguished node that is called its ''root''By starting from the root and working recursively, the rest of its structure can be described in the following fashion.
+
The pragmatic theory of sign relations is called for in settings where everything that can be named has many other names, that is to say, in the usual caseOf course, one would like to replace this superfluous multiplicity of signs with an organized system of canonical signs, one for each object that needs to be denoted, but reducing the redundancy too far, beyond what is necessary to eliminate the factor of "noise" in the language, that is, to clear up its effectively useless distractions, can destroy the very utility of a typical language, which is intended to provide a ready means to express a present situation, clear or not, and to describe an ongoing condition of experience in just the way that it seems to present itself.  Within this fleshed out framework of language, moreover, the process of transforming the manifestations of a sign from its ordinary appearance to its canonical aspect is the whole problem of computation in a nutshell.
   −
Each ''node'' of a PARC consists of a graphical ''point'' or ''vertex'' plus a finite sequence of ''attachments'', described in relative terms as the attachments ''at'' or ''to'' that node.  An empty sequence of attachments defines the ''empty node''.  Otherwise, each attachment is one of three kinds:  a blank, a paint, or a type of PARC that is called a ''lobe''.
+
It is a well-known truth, but an often forgotten fact, that nobody computes with numbers, but solely with numerals in respect of numbers, and numerals themselves are symbols.  Among other things, this renders all discussion of numeric versus symbolic computation a bit beside the point, since it is only a question of what kinds of symbols are best for one's immediate application or for one's selection of ongoing objectives.  The numerals that everybody knows best are just the canonical symbols, the standard signs or the normal terms for numbers, and the process of computation is a matter of getting from the arbitrarily obscure signs that the data of a situation are capable of throwing one's way to the indications of its character that are clear enough to motivate action.
   −
Each ''lobe'' of a PARC consists of a directed graphical ''cycle'' plus a finite sequence of ''accoutrements'', described in relative terms as the accoutrements ''of'' or ''on'' that lobeRecalling the circumstance that every lobe that comes under consideration comes already attached to a particular node, exactly one vertex of the corresponding cycle is the vertex that comes from that very node. The remaining vertices of the cycle have their definitions filled out according to the accoutrements of the lobe in question.  An empty sequence of accoutrements is taken to be tantamount to a sequence that contains a single empty node as its unique accoutrement, and either one of these ways of approaching it can be regarded as defining a graphical structure that is called a ''needle'' or a ''terminal edge''.  Otherwise, each accoutrement of a lobe is itself an arbitrary PARC.
+
Having broached the distinction between propositions and sentences, one can see its similarity to the distinction between numbers and numeralsWhat are the implications of the foregoing considerations for reasoning about propositions and for the realm of reckonings in sentential logic? If the purpose of a sentence is just to denote a proposition, then the proposition is just the object of whatever sign is taken for a sentence.  This means that the computational manifestation of a piece of reasoning about propositions amounts to a process that takes place entirely within a language of sentences, a procedure that can rationalize its account by referring to the denominations of these sentences among propositions.
   −
Although this definition of a lobe in terms of its intrinsic structural components is logically sufficient, it is also useful to characterize the structure of a lobe in comparative terms, that is, to view the structure that typifies a lobe in relation to the structures of other PARC's and to mark the inclusion of this special type within the general run of PARC'sThis approach to the question of types results in a form of description that appears to be a bit more analytic, at least, in mnemonic or prima facie terms, if not ultimately more revealingWorking in this vein, a ''lobe'' can be characterized as a special type of PARC that is called an ''unpainted root plant'' (UR-plant).
+
The application of these considerations in the immediate setting is this:  Do not worry too much about what roles the empty string <math>\varepsilon \, = \, ^{\backprime\backprime\prime\prime}</math> and the blank symbol <math>m_1 \, = \, ^{\backprime\backprime} \operatorname{~} ^{\prime\prime}</math> are supposed to play in a given species of formal languages.  As it happens, it is far less important to wonder whether these types of formal tokens actually constitute genuine sentences than it is to decide what equivalence classes it makes sense to form over all of the sentences in the resulting language, and only then to bother about what equivalence classes these limiting cases of sentences are most conveniently taken to represent.
 +
 
 +
These concerns about boundary conditions betray a more general issue.  Already by this point in the discussion, the limits of the purely syntactic approach to a language are beginning to be visible.  It is not that one cannot go a whole lot further by this road in the analysis of a particular language and in the study of languages in general, but when it comes to the questions of understanding the purpose of a language, of extending its usage in a chosen direction, or of designing a language for a particular set of uses, what matters above all else are the ''pragmatic equivalence classes'' of signs that are demanded by the application and intended by the designer, and not so much the peculiar characters of the signs that represent these classes of practical meaning.
 +
 
 +
Any description of a language is bound to have alternative descriptions.  More precisely, a circumscribed description of a formal language, as any effectively finite description is bound to be, is certain to suggest the equally likely existence and the possible utility of other descriptions.  A single formal grammar describes but a single formal language, but any formal language is described by many different formal grammars, not all of which afford the same grasp of its structure, provide an equivalent comprehension of its character, or yield an interchangeable view of its aspectsConsequently, even with respect to the same formal language, different formal grammars are typically better for different purposes.
 +
 
 +
With the distinctions that evolve among the different styles of grammar, and with the preferences that different observers display toward them, there naturally comes the question:  What is the root of this evolution?
 +
 
 +
One dimension of variation in the styles of formal grammars can be seen by treating the union of languages, and especially the disjoint union of languages, as a ''sum'', by treating the concatenation of languages as a ''product'', and then by distinguishing the styles of analysis that favor ''sums of products'' from those that favor ''products of sums'' as their canonical forms of description.  If one examines the relation between languages and grammars carefully enough to see the presence and the influence of these different styles, and when one comes to appreciate the ways that different styles of grammars can be used with different degrees of success for different purposes, then one begins to see the possibility that alternative styles of description can be based on altogether different linguistic and logical operations.
 +
 
 +
It is possible to trace this divergence of styles to an even more primitive division, one that distinguishes the ''additive'' or the ''parallel'' styles from the ''multiplicative'' or the ''serial'' styles.  The issue is somewhat confused by the fact that an ''additive'' analysis is typically expressed in the form of a ''series'', in other words, a disjoint union of sets or a
 +
linear sum of their independent effects.  But it is easy enough to sort this out if one observes the more telling connection between ''parallel'' and ''independent''Another way to keep the right associations straight is to employ the term ''sequential'' in preference to the more misleading term ''serial''.  Whatever one calls this broad division of styles, the scope and sweep of their dimensions of variation can be delineated in the following way:
 +
 
 +
# The ''additive'' or ''parallel'' styles favor ''sums of products'' <math>(\textstyle\sum\prod)</math> as canonical forms of expression, pulling sums, unions, co-products, and logical disjunctions to the outermost layers of analysis and synthesis, while pushing products, intersections, concatenations, and logical conjunctions to the innermost levels of articulation and generation.  In propositional logic, this style leads to the ''disjunctive normal form'' (DNF).
 +
# The ''multiplicative'' or ''serial'' styles favor ''products of sums'' <math>(\textstyle\prod\sum)</math> as canonical forms of expression, pulling products, intersections, concatenations, and logical conjunctions to the outermost layers of analysis and synthesis, while pushing sums, unions, co-products, and logical disjunctions to the innermost levels of articulation and generation.  In propositional logic, this style leads to the ''conjunctive normal form'' (CNF).
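
Both canonical forms can be constructed mechanically from a truth table, the DNF from the satisfying rows and the CNF from the falsifying rows.  The following Python sketch is offered only as an illustration of the two styles of expression;  the variable names and the output notation are ad hoc.

<pre>
# A rough sketch:  read the DNF off the satisfying rows of a truth table and
# the CNF off the falsifying rows.  Variable names and notation are ad hoc.
from itertools import product

def normal_forms(f, names):
    """Return (DNF, CNF) strings for the Boolean function f over names."""
    rows = list(product([False, True], repeat=len(names)))
    lit = lambda n, v: n if v else "~" + n
    dnf = " | ".join(
        "(" + " & ".join(lit(n, v) for n, v in zip(names, row)) + ")"
        for row in rows if f(*row))
    cnf = " & ".join(
        "(" + " | ".join(lit(n, not v) for n, v in zip(names, row)) + ")"
        for row in rows if not f(*row))
    return dnf, cnf

dnf, cnf = normal_forms(lambda x, y: (not x) or y, ["x", "y"])   # f = x => y
print("DNF:", dnf)   # (~x & ~y) | (~x & y) | (x & y)
print("CNF:", cnf)   # (~x | y)
</pre>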
   −
An ''UR-plant'' is a PARC of a simpler sort, at least, with respect to the recursive ordering of structures that is being followed hereAs a type, it is defined by the presence of two properties, that of being ''planted'' and that of having an ''unpainted root''. These are defined as follows:
+
There is a curious sort of diagnostic clue that often serves to reveal the dominance of one mode or the other within an individual thinker's cognitive styleExamined on the question of what constitutes the ''natural numbers'', an ''additive'' thinker tends to start the sequence at 0, while a ''multiplicative'' thinker tends to regard it as beginning at 1.
   −
# A PARC is ''planted'' if its list of attachments has just one PARC.
+
In any style of description, grammar, or theory of a language, it is usually possible to tease out the influence of these contrasting traits, namely, the ''additive'' attitude versus the ''multiplicative'' tendency that go to make up the particular style in question, and even to determine the dominant inclination or point of view that establishes its perspective on the target domain.
# A PARC is ''UR'' if its list of attachments has no blanks or paints.
     −
In short, an UR-planted PARC has a single PARC as its only attachment, and since this attachment is prevented from being a blank or a paint, the single attachment at its root has to be another sort of structure, that which we call a ''lobe''.
+
In each style of formal grammar, the ''multiplicative'' aspect is present in the sequential concatenation of signs, both in the augmented strings and in the terminal strings.  In settings where the non-terminal symbols classify types of strings, the concatenation of the non-terminal symbols signifies the cartesian product over the corresponding sets of strings.
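
For finite languages the concatenation can be computed directly, as in the following Python sketch, which also shows why concatenation is only product-like:  distinct pairs in the cartesian product can collapse onto the same concatenated string.

<pre>
# Concatenation of finite languages, given as Python sets of strings.
L1 = {"a", "ab"}
L2 = {"c", "bc"}

L1L2 = {u + v for u in L1 for v in L2}
print(sorted(L1L2))   # ['abbc', 'abc', 'ac']
# The cartesian product L1 x L2 has four pairs, but only three distinct
# strings survive, since ("a", "bc") and ("ab", "c") concatenate alike.
</pre>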
   −
To express the description of a PARC in terms of its nodes, each node can be specified in the fashion of a functional expression, letting a citation of the generic function name "<math>\operatorname{Node}</math>" be followed by a list of arguments that enumerates the attachments of the node in question, and letting a citation of the generic function name "<math>\operatorname{Lobe}</math>" be followed by a list of arguments that details the accoutrements of the lobe in question.  Thus, one can write expressions of the following forms:
+
In the context-free style of formal grammar, the ''additive'' aspect is easy enough to spot.  It is signaled by the parallel covering of many augmented strings or sentential forms by the same non-terminal symbol.  Expressed in active terms, this calls for the independent rewriting of that non-terminal symbol by a number of different successors, as in the following scheme:
   −
{| align="center" cellpadding="4" width="90%"
+
{| align="center" cellpadding="8" width="90%"
| <math>1.\!</math>
+
|
| <math>\operatorname{Node}^0</math>
+
<math>\begin{matrix}
| <math>=\!</math>
+
q & :> & W_1 \\
| <math>\operatorname{Node}()</math>
+
\\
|-
+
\cdots & \cdots & \cdots \\
| &nbsp;
+
\\
| &nbsp;
+
q & :> & W_k \\
| <math>=\!</math>
+
\end{matrix}</math>
| a node with no attachments.
+
|}
|-
+
 
 +
It is useful to examine the relationship between the grammatical covering or production relation <math>(:>\!)</math> and the logical relation of implication <math>(\Rightarrow),</math> with one eye to what they have in common and one eye to how they differ.  The production <math>q :> W\!</math> says that the appearance of the symbol <math>q\!</math> in a sentential form implies the possibility of exchanging it for <math>W.\!</math> Although this sounds like a ''possible implication'', to the extent that ''<math>q\!</math> implies a possible <math>W\!</math>'' or that ''<math>q\!</math> possibly implies <math>W,\!</math>'' the qualifiers ''possible'' and ''possibly'' are the critical elements in these statements, and they are crucial to the meaning of what is actually being implied.  In effect, these qualifications reverse the direction of implication, yielding <math>^{\backprime\backprime} \, q \Leftarrow W \, ^{\prime\prime}</math> as the best analogue for the sense of the production.
 +
 
 +
One way to sum this up is to say that non-terminal symbols have the significance of hypotheses.  The terminal strings form the empirical matter of a language, while the non-terminal symbols mark the patterns or the types of substrings that can be noticed in the profusion of data.  If one observes a portion of a terminal string that falls into the pattern of the sentential form <math>W,\!</math> then it is an admissible hypothesis, according to the theory of the language that is constituted by the formal grammar, that this piece not only fits the type <math>q\!</math> but even comes to be generated under the auspices of the non-terminal symbol <math>^{\backprime\backprime} q ^{\prime\prime}.</math>
 +
 
 +
A moment's reflection on the issue of style, giving due consideration to the received array of stylistic choices, ought to inspire at least the question:  "Are these the only choices there are?"  In the present setting, there are abundant indications that other options, more differentiated varieties of description and more integrated ways of approaching individual languages, are likely to be conceivable, feasible, and even more ultimately viable.  If a suitably generic style, one that incorporates the full scope of logical combinations and operations, is broadly available, then it would no longer be necessary, or even apt, to argue in universal terms about which style is best, but more useful to investigate how we might adapt the local styles to the local requirements.  The medium of a generic style would yield a viable compromise between additive and multiplicative canons, and render the choice between parallel and serial a false alternative, at least, when expressed in the globally exclusive terms that are currently most commonly adopted to pose it.
 +
 
 +
One set of indications comes from the study of machines, languages, and computation, especially the theories of their structures and relations.  The forms of composition and decomposition that are generally known as ''parallel'' and ''serial'' are merely the extreme special cases, in variant directions of specialization, of a more generic form, usually called the ''cascade'' form of combination.  This is a well-known fact in the theories that deal with automata and their associated formal languages, but its implications do not seem to be widely appreciated outside these fields.  In particular, it dispels the need to choose one extreme or the other, since most of the natural cases are likely to exist somewhere in between.
 +
 
 +
Another set of indications appears in algebra and category theory, where forms of composition and decomposition related to the cascade combination, namely, the ''semi-direct product'' and its special case, the ''wreath product'', are encountered at higher levels of generality than the cartesian products of sets or the direct products of spaces.
 +
 
 +
In these domains of operation, one finds it necessary to consider also the ''co-product'' of sets and spaces, a construction that artificially creates a disjoint union of sets, that is, a union of spaces that are being treated as independent.  It does this, in effect, by ''indexing'',
 +
''coloring'', or ''preparing'' the otherwise possibly overlapping domains that are being combined.  What renders this a ''chimera'' or a ''hybrid'' form of combination is the fact that this indexing is tantamount to a cartesian product of a singleton set, namely, the conventional ''index'', ''color'', or ''affix'' in question, with the individual domain that is entering as a factor, a term, or a participant in the final result.
 +
 
 +
One of the insights that arises out of Peirce's logical work is that the set operations of complementation, intersection, and union, along with the logical operations of negation, conjunction, and disjunction that operate in isomorphic tandem with them, are not as fundamental as they first appear.  This is because all of them can be constructed from or derived from a smaller set of operations, in fact, taking the logical side of things, from either one of two ''sole sufficient'' operators, called ''amphecks'' by Peirce, ''strokes'' by those who re-discovered them later, and known in computer science as the NAND and the NNOR operators.  For this reason, that is, by virtue of their precedence in the orders of construction and derivation, these operations have to be regarded as the simplest and the most primitive in principle, even if they are scarcely recognized as lying among the more familiar elements of logic.
 +
 
 +
I am throwing together a wide variety of different operations into each of the bins labeled ''additive'' and ''multiplicative'', but it is easy to observe a natural organization and even some relations approaching isomorphisms among and between the members of each class.
 +
 
 +
The relation between logical disjunction and set-theoretic union and the relation between logical conjunction and set-theoretic intersection ought to be clear enough for the purposes of the immediately present context.  In any case, all of these relations are scheduled to receive a thorough examination in a subsequent discussion (Subsection 1.3.10.13).  But the relation of a set-theoretic union to a category-theoretic co-product and the relation of a set-theoretic intersection to a syntactic concatenation deserve a closer look at this point.
 +
 
 +
The effect of a co-product as a ''disjointed union'', in other words, that creates an object tantamount to a disjoint union of sets in the resulting co-product even if some of these sets intersect non-trivially and even if some of them are identical ''in reality'', can be achieved in several ways.  The most usual conception is that of making a ''separate copy'', for each part of the intended co-product, of the set that is intended to go there.  Often one thinks of the set that is assigned to a particular part of the co-product as being distinguished by a particular ''color'', in other words, by the attachment of a distinct ''index'', ''label'', or ''tag'', being a marker that is inherited by and passed on to every element of the set in that part.  A concrete image of this construction can be achieved by imagining that each set and each element of each set is placed in an ordered pair with the sign of its color, index, label, or tag.  One describes this as the ''injection'' of each set into the corresponding ''part'' of the co-product.
 +
 
 +
For example, given the sets <math>P\!</math> and <math>Q,\!</math> overlapping or not, one can define the ''indexed'' or ''marked'' sets <math>P_{[1]}\!</math> and <math>Q_{[2]},\!</math> amounting to the copy of <math>P\!</math> into the first part of the co-product and the copy of <math>Q\!</math> into the second part of the co-product, in the following manner:
 +
 
 +
{| align="center" cellpsadding="8" width="90%"
 +
|
 +
<math>\begin{array}{lllll}
 +
P_{[1]} & = & (P, 1) & = & \{ (x, 1) : x \in P \}, \\
 +
Q_{[2]} & = & (Q, 2) & = & \{ (x, 2) : x \in Q \}. \\
 +
\end{array}</math>
 +
|}
 +
 
 +
Using the coproduct operator (<math>\textstyle\coprod</math>) for this construction, the ''sum'', the ''coproduct'', or the ''disjointed union'' of <math>P\!</math> and <math>Q\!</math> in that order can be represented as the ordinary union of <math>P_{[1]}\!</math> and <math>Q_{[2]}.\!</math>
 +
 
 +
{| align="center" cellpsadding="8" width="90%"
 +
|
 +
<math>\begin{array}{lll}
 +
P \coprod Q & = & P_{[1]} \cup Q_{[2]}. \\
 +
\end{array}</math>
 +
|}
 +
 
 +
The concatenation <math>\mathfrak{L}_1 \cdot \mathfrak{L}_2</math> of the formal languages <math>\mathfrak{L}_1\!</math> and <math>\mathfrak{L}_2\!</math> is just the cartesian product of sets <math>\mathfrak{L}_1 \times \mathfrak{L}_2</math> without the extra <math>\times\!</math>'s, but the relation of cartesian products to set-theoretic intersections and thus to logical conjunctions is far from being clear.  One way of seeing a type of relation is to focus on the information that is needed to specify each construction, and thus to reflect on the signs that are used to carry this information.  As a first approach to the topic of information, according to a strategy that seeks to be as elementary and as informal as possible, I introduce the following set of ideas, intended to be taken in a very provisional way.
 +
 
 +
A ''stricture'' is a specification of a certain set in a certain place, relative to a number of other sets, yet to be specified.  It is assumed that one knows enough to tell if two strictures are equivalent as pieces of information, but any more determinate indications, like names for the places that are mentioned in the stricture, or bounds on the number of places that are involved, are regarded as being extraneous impositions, outside the proper concern of the definition, no matter how convenient they are found to be for a particular discussion.  As a schematic form of illustration, a stricture can be pictured in the following shape:
 +
 
 +
:{| cellpadding="8"
 +
| <math>^{\backprime\backprime}</math>
 +
| <math>\ldots \times X \times Q \times X \times \ldots</math>
 +
| <math>^{\prime\prime}</math>
 +
|}
 +
 
 +
A ''strait'' is the object that is specified by a stricture, in effect, a certain set in a certain place of an otherwise yet to be specified relation. Somewhat sketchily, the strait that corresponds to the stricture just given can be pictured in the following shape:
 +
 
 +
:{| cellpadding="8"
 +
| &nbsp;
 +
| <math>\ldots \times X \times Q \times X \times \ldots</math>
 
| &nbsp;
 
| &nbsp;
| <math>\operatorname{Node}_{j=1}^k C_j</math>
  −
| <math>=\!</math>
  −
| <math>\operatorname{Node} (C_1, \ldots, C_k)</math>
  −
|-
  −
| &nbsp;
  −
| &nbsp;
  −
| <math>=\!</math>
  −
| a node with the attachments <math>C_1, \ldots, C_k.</math>
  −
|-
  −
| <math>2.\!</math>
  −
| <math>\operatorname{Lobe}^0</math>
  −
| <math>=\!</math>
  −
| <math>\operatorname{Lobe}()</math>
  −
|-
  −
| &nbsp;
  −
| &nbsp;
  −
| <math>=\!</math>
  −
| a lobe with no accoutrements.
  −
|-
  −
| &nbsp;
  −
| <math>\operatorname{Lobe}_{j=1}^k C_j</math>
  −
| <math>=\!</math>
  −
| <math>\operatorname{Lobe} (C_1, \ldots, C_k)</math>
  −
|-
  −
| &nbsp;
  −
| &nbsp;
  −
| <math>=\!</math>
  −
| a lobe with the accoutrements <math>C_1, \ldots, C_k.</math>
   
|}
 
|}
   −
<pre>
+
In this picture, <math>Q\!</math> is a certain set and <math>X\!</math> is the universe of discourse that is relevant to a given discussion.  Since a stricture does not, by itself, contain a sufficient amount of information to specify the number of sets that it intends to set in place, or even to specify the absolute location of the set that it does set in place, it appears to place an unspecified number of unspecified sets in a vague and uncertain strait.  Taken out of its interpretive context, the residual information that a stricture can convey makes all of the following potentially equivalent as strictures:
Working from a structural description of the cactus language,
  −
or any suitable formal grammar for !C!(!P!), it is possible to
  −
give a recursive definition of the function called "Parse" that
  −
maps each sentence in PARCE(!P!) to the corresponding graph in
  −
PARC(!P!)One way to do this proceeds as follows:
     −
1.  The parse of the concatenation Conc^k of the k sentences S_j,
+
{| align="center" cellpadding="8" width="90%"
    for j = 1 to k, is defined recursively as follows:
+
|
 
+
<math>\begin{array}{ccccccc}
    a.  Parse(Conc^0)        =  Node^0.
+
^{\backprime\backprime} Q ^{\prime\prime}
 +
& , &
 +
^{\backprime\backprime} X \times Q \times X ^{\prime\prime}
 +
& , &
 +
^{\backprime\backprime} X \times X \times Q \times X \times X ^{\prime\prime}
 +
& , &
 +
\ldots
 +
\\
 +
\end{array}</math>
 +
|}
   −
    b.  For k > 0,
+
With respect to what these strictures specify, this leaves all of the following equivalent as straits:
   −
        Parse(Conc^k_j S_j)  = Node^k_j Parse(S_j).
+
{| align="center" cellpadding="8" width="90%"
 +
|
 +
<math>\begin{array}{ccccccc}
 +
Q
 +
& = &
 +
X \times Q \times X
 +
& = &
 +
X \times X \times Q \times X \times X
 +
& = &
 +
\ldots
 +
\\
 +
\end{array}</math>
 +
|}
   −
Within the framework of a particular discussion, it is customary to set a bound on the number of places and to limit the variety of sets that are regarded as being under active consideration, and it is also convenient to index the places of the indicated relations, and of their encompassing cartesian products, in some fixed way.  But the whole idea of a stricture is to specify a strait that is capable of extending through and beyond any fixed frame of discussion.  In other words, a stricture is conceived to constrain a strait at a certain point, and then to leave it literally embedded, if tacitly expressed, in a yet to be fully specified relation, one that involves an unspecified number of unspecified domains.
     −
A quantity of information is a measure of constraint.  In this respect, a set of comparable strictures is ordered on account of the information that each one conveys, and a system of comparable straits is ordered in accord with the amount of information that it takes to pin each one of them down.  Strictures that are more constraining and straits that are more constrained are placed at higher levels of information than those that are less so, and entities that involve more information are said to have a greater ''complexity'' than those entities that involve less information, which are said to have a greater ''simplicity''.

In order to create a concrete example, let me now institute a frame of discussion where the number of places in a relation is bounded at two, and where the variety of sets under active consideration is limited to the typical subsets <math>P\!</math> and <math>Q\!</math> of a universe <math>X.\!</math>  Under these conditions, one can use the following sorts of expression as schematic strictures:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{matrix}
^{\backprime\backprime} X ^{\prime\prime} &
^{\backprime\backprime} P ^{\prime\prime} &
^{\backprime\backprime} Q ^{\prime\prime} \\
\\
^{\backprime\backprime} X \times X ^{\prime\prime} &
^{\backprime\backprime} X \times P ^{\prime\prime} &
^{\backprime\backprime} X \times Q ^{\prime\prime} \\
\\
^{\backprime\backprime} P \times X ^{\prime\prime} &
^{\backprime\backprime} P \times P ^{\prime\prime} &
^{\backprime\backprime} P \times Q ^{\prime\prime} \\
\\
^{\backprime\backprime} Q \times X ^{\prime\prime} &
^{\backprime\backprime} Q \times P ^{\prime\prime} &
^{\backprime\backprime} Q \times Q ^{\prime\prime} \\
\end{matrix}</math>
|}
   −
These strictures and their corresponding straits are stratified according to their amounts of information, or their levels of constraint, as follows:
   −
{| align="center" cellpadding="8" width="90%"
|
<math>\begin{array}{lcccc}
\text{High:}
& ^{\backprime\backprime} P \times P ^{\prime\prime}
& ^{\backprime\backprime} P \times Q ^{\prime\prime}
& ^{\backprime\backprime} Q \times P ^{\prime\prime}
& ^{\backprime\backprime} Q \times Q ^{\prime\prime}
\\
\\
\text{Med:}
& ^{\backprime\backprime} P ^{\prime\prime}
& ^{\backprime\backprime} X \times P ^{\prime\prime}
& ^{\backprime\backprime} P \times X ^{\prime\prime}
\\
\\
\text{Med:}
& ^{\backprime\backprime} Q ^{\prime\prime}
& ^{\backprime\backprime} X \times Q ^{\prime\prime}
& ^{\backprime\backprime} Q \times X ^{\prime\prime}
\\
\\
\text{Low:}
& ^{\backprime\backprime} X ^{\prime\prime}
& ^{\backprime\backprime} X \times X ^{\prime\prime}
\\
\end{array}</math>
|}
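A rough computational gloss on this stratification, offered only as an illustration and not as part of the formal development, is to count how many places of a schematic stricture are constrained to a proper subset (<math>P\!</math> or <math>Q\!</math>) rather than left at the universe <math>X.\!</math>  The short Python sketch below does just that; the function name and the string encoding of strictures are mine, not the text's.

<pre>
# Illustrative sketch:  reproduce the High / Med / Low stratification by
# counting how many places of a schematic stricture are constrained to a
# proper subset ("P" or "Q") rather than left at the whole universe "X".

def constraint_level(stricture):
    """Number of places constrained to something smaller than X."""
    return sum(1 for factor in stricture.split(" x ") if factor != "X")

strictures = ["X", "P", "Q",
              "X x X", "X x P", "X x Q",
              "P x X", "P x P", "P x Q",
              "Q x X", "Q x P", "Q x Q"]

for s in sorted(strictures, key=constraint_level, reverse=True):
    print(constraint_level(s), s)

# Level 2 :  "P x P", "P x Q", "Q x P", "Q x Q"              (High)
# Level 1 :  "P", "X x P", "P x X", "Q", "X x Q", "Q x X"    (Med)
# Level 0 :  "X", "X x X"                                    (Low)
</pre>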
   −
A "substructure" of a PARC is defined recursively as follows.  Starting
+
Within this framework, the more complex strait <math>P \times Q</math> can be expressed
at the root node of the cactus C, any attachment is a substructure of C.
+
in terms of the simpler straits, <math>P \times X</math> and <math>X \times Q.</math>  More specifically, it lends itself to being analyzed as their intersection, in the following way:
If a substructure is a blank or a paint, then it constitutes a minimal
  −
substructure, meaning that no further substructures of C arise from it.
  −
If a substructure is a lobe, then each one of its accoutrements is also
  −
a substructure of C, and has to be examined for further substructures.
     −
The concept of substructure can be used to define varieties of deletion
+
{| align="center" cellpadding="8" width="90%"
and erasure operations that respect the structure of the abstract graph.
+
|
For the purposes of this depiction, a blank symbol " " is treated as
+
<math>\begin{array}{lllll}
a "primer", in other words, as a "clear paint", a "neutral tint", or
+
P \times Q & = & P \times X & \cap & X \times Q. \\
a "white wash"In effect, one is letting m_1 = p_0.  In this frame
+
\end{array}</math>
of discussion, it is useful to make the following distinction:
+
|}
 
+
 
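As a quick check on this identity, the following Python sketch uses arbitrarily chosen finite sets to stand in for <math>X,\!</math> <math>P,\!</math> and <math>Q,\!</math> and verifies that the product coincides with the intersection of the two partially constrained products:

<pre>
from itertools import product

# Arbitrary small sets standing in for the universe X and the subsets P and Q.
X = {1, 2, 3, 4}
P = {1, 2}
Q = {3, 4}

PxQ = set(product(P, Q))        # the strait P x Q
PxX = set(product(P, X))        # constrains the first place only
XxQ = set(product(X, Q))        # constrains the second place only

assert PxQ == PxX & XxQ         # P x Q  =  (P x X)  intersect  (X x Q)
print(sorted(PxQ))
</pre>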
1. To "delete" a substructure is to replace it with an empty node,
+
From here it is easy to see the relation of concatenation, by virtue of these types of intersection, to the logical conjunction of propositionsThe cartesian product <math>P \times Q</math> is described by a conjunction of propositions, namely, <math>P_{[1]} \land Q_{[2]},</math> subject to the following interpretation:
    in effect, to reduce the whole structure to a trivial point.
+
 
 +
# <math>P_{[1]}\!</math> asserts that there is an element from the set <math>P\!</math> in the first place of the product.
 +
# <math>Q_{[2]}\!</math> asserts that there is an element from the set <math>Q\!</math> in the second place of the product.
   −
2.  To "erase" a substructure is to replace it with a blank symbol,
+
The integration of these two pieces of information can be taken in that measure to specify a yet to be fully determined relation.
    in effect, to paint it out of the picture or to overwrite it.
     −
A "bare" PARC, loosely referred to as a "bare cactus", is a PARC on the
+
In a corresponding fashion at the level of the elements, the ordered pair <math>(p, q)\!</math> is described by a conjunction of propositions, namely, <math>p_{[1]} \land q_{[2]},</math> subject to the following interpretation:
empty palette !P! = {}.  In other veins, a bare cactus can be described
  −
in several different ways, depending on how the form arises in practice.
     −
1.  Leaning on the definition of a bare PARCE, a bare PARC can be
+
# <math>p_{[1]}\!</math> says that <math>p\!</math> is in the first place of the product element under construction.
    described as the kind of a parse graph that results from parsing
+
# <math>q_{[2]}\!</math> says that <math>q\!</math> is in the second place of the product element under construction.
    a bare cactus expression, in other words, as the kind of a graph
  −
    that issues from the requirements of processing a sentence of
  −
    the bare cactus language !C!^0 = PARCE^0.
     −
2.  To express it more in its own terms, a bare PARC can be defined
+
Notice that, in construing the cartesian product of the sets <math>P\!</math> and <math>Q\!</math> or the concatenation of the languages <math>\mathfrak{L}_1\!</math> and <math>\mathfrak{L}_2\!</math> in this way, one shifts the level of the active construction from the tupling of the elements in <math>P\!</math> and <math>Q\!</math> or the concatenation of the strings that are internal to the languages <math>\mathfrak{L}_1\!</math> and <math>\mathfrak{L}_2\!</math> to the concatenation of the external signs that it takes to indicate these sets or these languages, in other words, passing to a conjunction of indexed propositions, <math>P_{[1]}\!</math> and <math>Q_{[2]},\!</math> or to a conjunction of assertions, <math>(\mathfrak{L}_1)_{[1]}</math> and <math>(\mathfrak{L}_2)_{[2]},</math> that marks the sets or the languages in question for insertion in the indicated places of a product set or a product language, respectively.  In effect, the subscripting by the indices <math>^{\backprime\backprime} [1] ^{\prime\prime}</math> and <math>^{\backprime\backprime} [2] ^{\prime\prime}</math> can be recognized as a special case of concatenation, albeit through the posting of editorial remarks from an external ''mark-up'' language.
    by tracing the recursive definition of a generic PARC, but then
  −
    by detaching an independent form of description from the source
  −
    of that analogy. The method is sufficiently sketched as follows:
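The same point can be put in the idiom of propositions.  In the sketch below, again only an illustration, the indexed propositions <math>P_{[1]}\!</math> and <math>Q_{[2]}\!</math> become predicates on pairs, and their conjunction carves <math>P \times Q</math> out of <math>X \times X.</math>

<pre>
from itertools import product

X = {1, 2, 3, 4}
P = {1, 2}
Q = {3, 4}

# Indexed propositions read as predicates on pairs drawn from X x X.
P_1 = lambda pair: pair[0] in P    # "an element of P stands in the first place"
Q_2 = lambda pair: pair[1] in Q    # "an element of Q stands in the second place"

conjunction = {pair for pair in product(X, X) if P_1(pair) and Q_2(pair)}
assert conjunction == set(product(P, Q))
</pre>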
     −
    a.  A "bare PARC" is a PARC whose attachments
+
In order to systematize the relations that strictures and straits placed at higher levels of complexity, constraint, information, and organization have with those that are placed at the associated lower levels, I introduce the following pair of definitions:
        are limited to blanks and "bare lobes".
     −
    b.  A "bare lobe" is a lobe whose accoutrements
+
The <math>j^\text{th}\!</math> ''excerpt'' of a stricture of the form <math>^{\backprime\backprime} \, S_1 \times \ldots \times S_k \, ^{\prime\prime},</math> regarded within a frame of discussion where the number of places is limited to <math>k,\!</math> is the stricture of the form <math>^{\backprime\backprime} \, X \times \ldots \times S_j \times \ldots \times X \, ^{\prime\prime}.</math>  In the proper context, this can be written more succinctly as the stricture <math>^{\backprime\backprime} \, (S_j)_{[j]} \, ^{\prime\prime},</math> an assertion that places the <math>j^\text{th}\!</math> set in the <math>j^\text{th}\!</math> place of the product.
        are limited to bare PARC's.
     −
3.  In practice, a bare cactus is usually encountered in the process
+
The <math>j^\text{th}\!</math> ''extract'' of a strait of the form <math>S_1 \times \ldots \times S_k,\!</math> constrained to a frame of discussion where the number of places is restricted to <math>k,\!</math> is the strait of the form <math>X \times \ldots \times S_j \times \ldots \times X.</math>  In the appropriate context, this can be denoted more succinctly by the stricture <math>^{\backprime\backprime} \, (S_j)_{[j]} \, ^{\prime\prime},</math> an assertion that places the <math>j^\text{th}\!</math> set in the <math>j^\text{th}\!</math> place of the product.
    of analyzing or handling an arbitrary PARC, the circumstances of
  −
    which frequently call for deleting or erasing all of its paints.
  −
    In particular, this generally makes it easier to observe the
  −
    various properties of its underlying graphical structure.
  −
In these terms, a stricture of the form <math>^{\backprime\backprime} \, S_1 \times \ldots \times S_k \, ^{\prime\prime}</math> can be expressed in terms of simpler strictures, to wit, as a conjunction of its <math>k\!</math> excerpts:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{array}{lll}
^{\backprime\backprime} \, S_1 \times \ldots \times S_k \, ^{\prime\prime}
& = &
^{\backprime\backprime} \, (S_1)_{[1]} \, ^{\prime\prime}
\, \land \, \ldots \, \land \,
^{\backprime\backprime} \, (S_k)_{[k]} \, ^{\prime\prime}.
\end{array}</math>
|}

In a similar vein, a strait of the form <math>S_1 \times \ldots \times S_k\!</math> can be expressed in terms of simpler straits, namely, as an intersection of its <math>k\!</math> extracts:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{array}{lll}
S_1 \times \ldots \times S_k & = & (S_1)_{[1]} \, \cap \, \ldots \, \cap \, (S_k)_{[k]}.
\end{array}</math>
|}

There is a measure of ambiguity that remains in this formulation, but it is the best that I can do in the present informal context.
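Before passing to the mechanics of parsing, the following Python sketch, an illustration under the same arbitrary finite-set assumptions as before, spells out the general pattern: each extract constrains a single place of a <math>k\!</math>-place product, and the intersection of all <math>k\!</math> extracts recovers the strait <math>S_1 \times \ldots \times S_k.</math>

<pre>
from itertools import product

def extract(j, S_j, k, X):
    """The j-th extract:  X x ... x S_j (in place j) x ... x X, as a set of k-tuples."""
    factors = [S_j if i == j else X for i in range(1, k + 1)]
    return set(product(*factors))

X = {0, 1, 2}
S = [{0, 1}, {1, 2}, {2}]               # S_1, S_2, S_3
k = len(S)

meet = set(product(*([X] * k)))         # start from the full product X x X x X
for j, S_j in enumerate(S, start=1):
    meet &= extract(j, S_j, k, X)

assert meet == set(product(*S))         # S_1 x ... x S_k  =  intersection of extracts
</pre>
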
===The Cactus Language : Mechanics===

{| align="center" cellpadding="0" cellspacing="0" width="90%"
|
<p>We are only now beginning to see how this works.  Clearly one of the mechanisms for picking a reality is the sociohistorical sense of what is important &mdash; which research program, with all its particularity of knowledge, seems most fundamental, most productive, most penetrating.  The very judgments which make us push narrowly forward simultaneously make us forget how little we know.  And when we look back at history, where the lesson is plain to find, we often fail to imagine ourselves in a parallel situation.  We ascribe the differences in world view to error, rather than to unexamined but consistent and internally justified choice.</p>
|-
| align="right" | &mdash; Herbert J. Bernstein, &ldquo;Idols of Modern Science&rdquo;, [HJB, 38]
|}

In this Subsection, I discuss the ''mechanics'' of parsing the cactus language into the corresponding class of computational data structures.  This provides each sentence of the language with a translation into a computational form that articulates its syntactic structure and prepares it for automated modes of processing and evaluation.  For this purpose, it is necessary to describe the target data structures at a fairly high level of abstraction only, ignoring the details of address pointers and record structures and leaving the more operational aspects of implementation to the imagination of prospective programmers.  In this way, I can put off to another stage of elaboration and refinement the description of the program that constructs these pointers and operates on these graph-theoretic data structures.

The structure of a ''painted cactus'', insofar as it presents itself to the visual imagination, can be described as follows.  The overall structure, as given by its underlying graph, falls within the species of graph that is commonly known as a ''rooted cactus'', and the only novel feature that it adds to this is that each of its nodes can be ''painted'' with a finite sequence of ''paints'', chosen from a ''palette'' that is given by the parametric set <math>\{ \, ^{\backprime\backprime} \operatorname{~} ^{\prime\prime} \, \} \cup \mathfrak{P} = \{ m_1 \} \cup \{ p_1, \ldots, p_k \}.</math>

It is conceivable, from a purely graph-theoretical point of view, to have a class of cacti that are painted but not rooted, and so it is frequently necessary, for the sake of precision, to more exactly pinpoint the target species of graphical structure as a ''painted and rooted cactus'' (PARC).

A painted cactus, as a rooted graph, has a distinguished node that is called its ''root''.  By starting from the root and working recursively, the rest of its structure can be described in the following fashion.

Each ''node'' of a PARC consists of a graphical ''point'' or ''vertex'' plus a finite sequence of ''attachments'', described in relative terms as the attachments ''at'' or ''to'' that node.  An empty sequence of attachments defines the ''empty node''.  Otherwise, each attachment is one of three kinds:  a blank, a paint, or a type of PARC that is called a ''lobe''.

Each ''lobe'' of a PARC consists of a directed graphical ''cycle'' plus a finite sequence of ''accoutrements'', described in relative terms as the accoutrements ''of'' or ''on'' that lobe.  Recalling the circumstance that every lobe that comes under consideration comes already attached to a particular node, exactly one vertex of the corresponding cycle is the vertex that comes from that very node.  The remaining vertices of the cycle have their definitions filled out according to the accoutrements of the lobe in question.  An empty sequence of accoutrements is taken to be tantamount to a sequence that contains a single empty node as its unique accoutrement, and either one of these ways of approaching it can be regarded as defining a graphical structure that is called a ''needle'' or a ''terminal edge''.  Otherwise, each accoutrement of a lobe is itself an arbitrary PARC.

Although this definition of a lobe in terms of its intrinsic structural components is logically sufficient, it is also useful to characterize the structure of a lobe in comparative terms, that is, to view the structure that typifies a lobe in relation to the structures of other PARC's and to mark the inclusion of this special type within the general run of PARC's.  This approach to the question of types results in a form of description that appears to be a bit more analytic, at least, in mnemonic or prima facie terms, if not ultimately more revealing.  Working in this vein, a ''lobe'' can be characterized as a special type of PARC that is called an ''unpainted root plant'' (UR-plant).

An ''UR-plant'' is a PARC of a simpler sort, at least, with respect to the recursive ordering of structures that is being followed here.  As a type, it is defined by the presence of two properties, that of being ''planted'' and that of having an ''unpainted root''.  These are defined as follows:

# A PARC is ''planted'' if its list of attachments has just one PARC.
# A PARC is ''UR'' if its list of attachments has no blanks or paints.

In short, an UR-planted PARC has a single PARC as its only attachment, and since this attachment is prevented from being a blank or a paint, the single attachment at its root has to be another sort of structure, that which we call a ''lobe''.

To express the description of a PARC in terms of its nodes, each node can be specified in the fashion of a functional expression, letting a citation of the generic function name "<math>\operatorname{Node}</math>" be followed by a list of arguments that enumerates the attachments of the node in question, and letting a citation of the generic function name "<math>\operatorname{Lobe}</math>" be followed by a list of arguments that details the accoutrements of the lobe in question.  Thus, one can write expressions of the following forms:

{| align="center" cellpadding="4" width="90%"
| <math>1.\!</math>
| <math>\operatorname{Node}^0</math>
| <math>=\!</math>
| <math>\operatorname{Node}()</math>
|-
| &nbsp;
| &nbsp;
| <math>=\!</math>
| a node with no attachments.
|-
| &nbsp;
| <math>\operatorname{Node}_{j=1}^k C_j</math>
| <math>=\!</math>
| <math>\operatorname{Node} (C_1, \ldots, C_k)</math>
|-
| &nbsp;
| &nbsp;
| <math>=\!</math>
| a node with the attachments <math>C_1, \ldots, C_k.</math>
|-
| <math>2.\!</math>
| <math>\operatorname{Lobe}^0</math>
| <math>=\!</math>
| <math>\operatorname{Lobe}()</math>
|-
| &nbsp;
| &nbsp;
| <math>=\!</math>
| a lobe with no accoutrements.
|-
| &nbsp;
| <math>\operatorname{Lobe}_{j=1}^k C_j</math>
| <math>=\!</math>
| <math>\operatorname{Lobe} (C_1, \ldots, C_k)</math>
|-
| &nbsp;
| &nbsp;
| <math>=\!</math>
| a lobe with the accoutrements <math>C_1, \ldots, C_k.</math>
|}
Working from a structural description of the cactus language, or any suitable formal grammar for <math>\mathfrak{C} (\mathfrak{P}),\!</math> it is possible to give a recursive definition of the function called <math>\operatorname{Parse}</math> that maps each sentence in <math>\operatorname{PARCE} (\mathfrak{P})\!</math> to the corresponding graph in <math>\operatorname{PARC} (\mathfrak{P}).\!</math>  One way to do this proceeds as follows:

<ol style="list-style-type:decimal">

<li>The parse of the concatenation <math>\operatorname{Conc}_{j=1}^k</math> of the <math>k\!</math> sentences <math>(s_j)_{j=1}^k</math> is defined recursively as follows:</li>

<ol style="list-style-type:lower-alpha">

<li><math>\operatorname{Parse} (\operatorname{Conc}^0) ~=~ \operatorname{Node}^0.</math></li>

<li>
<p>For <math>k > 0,\!</math></p>

<p><math>\operatorname{Parse} (\operatorname{Conc}_{j=1}^k s_j) ~=~ \operatorname{Node}_{j=1}^k \operatorname{Parse} (s_j).</math></p></li>

</ol>

<li>The parse of the surcatenation <math>\operatorname{Surc}_{j=1}^k</math> of the <math>k\!</math> sentences <math>(s_j)_{j=1}^k</math> is defined recursively as follows:</li>

<ol style="list-style-type:lower-alpha">

<li><math>\operatorname{Parse} (\operatorname{Surc}^0) ~=~ \operatorname{Lobe}^0.</math></li>

<li>
<p>For <math>k > 0,\!</math></p>

<p><math>\operatorname{Parse} (\operatorname{Surc}_{j=1}^k s_j) ~=~ \operatorname{Lobe}_{j=1}^k \operatorname{Parse} (s_j).</math></p></li>

</ol></ol>
For ease of reference, Table&nbsp;13 summarizes the mechanics of these parsing rules.

<br>

{| align="center" border="1" cellpadding="8" cellspacing="0" style="text-align:center; width:80%"
|+ style="height:30px" | <math>\text{Table 13.} ~~ \text{Algorithmic Translation Rules}\!</math>
|- style="height:40px; background:ghostwhite"
|
{| align="center" border="0" cellpadding="8" cellspacing="0" style="background:ghostwhite; text-align:center; width:100%"
| width="33%" | <math>\text{Sentence in PARCE}\!</math>
| width="33%" | <math>\xrightarrow{\mathrm{Parse}}\!</math>
| width="33%" | <math>\text{Graph in PARC}\!</math>
|}
|-
|
{| align="center" border="0" cellpadding="8" cellspacing="0" style="text-align:center; width:100%"
| width="33%" | <math>\mathrm{Conc}^0\!</math>
| width="33%" | <math>\xrightarrow{\mathrm{Parse}}\!</math>
| width="33%" | <math>\mathrm{Node}^0\!</math>
|-
| width="33%" | <math>\mathrm{Conc}_{j=1}^k s_j\!</math>
| width="33%" | <math>\xrightarrow{\mathrm{Parse}}\!</math>
| width="33%" | <math>\mathrm{Node}_{j=1}^k \mathrm{Parse} (s_j)\!</math>
|}
|-
|
{| align="center" border="0" cellpadding="8" cellspacing="0" style="text-align:center; width:100%"
| width="33%" | <math>\mathrm{Surc}^0\!</math>
| width="33%" | <math>\xrightarrow{\mathrm{Parse}}\!</math>
| width="33%" | <math>\mathrm{Lobe}^0\!</math>
|-
| width="33%" | <math>\mathrm{Surc}_{j=1}^k s_j\!</math>
| width="33%" | <math>\xrightarrow{\mathrm{Parse}}\!</math>
| width="33%" | <math>\mathrm{Lobe}_{j=1}^k \mathrm{Parse} (s_j)\!</math>
|}
|}

<br>
A ''substructure'' of a PARC is defined recursively as follows.  Starting at the root node of the cactus <math>C,\!</math> any attachment is a substructure of <math>C.\!</math>  If a substructure is a blank or a paint, then it constitutes a minimal substructure, meaning that no further substructures of <math>C\!</math> arise from it.  If a substructure is a lobe, then each one of its accoutrements is also a substructure of <math>C,\!</math> and has to be examined for further substructures.

The concept of substructure can be used to define varieties of deletion and erasure operations that respect the structure of the abstract graph.  For the purposes of this depiction, a blank symbol <math>^{\backprime\backprime} ~ ^{\prime\prime}</math> is treated as a ''primer'', in other words, as a ''clear paint'' or a ''neutral tint''.  In effect, one is letting <math>m_1 = p_0.\!</math>  In this frame of discussion, it is useful to make the following distinction:

# To ''delete'' a substructure is to replace it with an empty node, in effect, to reduce the whole structure to a trivial point.
# To ''erase'' a substructure is to replace it with a blank symbol, in effect, to paint it out of the picture or to overwrite it.
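The same illustrative classes can express the two operations just distinguished.  In the sketch below, which is my own gloss rather than anything in the text, erasing every paint overwrites it with a blank, which is also how the bare cactus described next typically arises in practice, while deleting a substructure puts an empty node in its place.

<pre>
def erase_paints(node: Node) -> Node:
    """Erase every paint in the cactus, overwriting each one with a blank."""
    bare = Node()
    for a in node.attachments:
        if isinstance(a, Paint):
            bare.attachments.append(Blank())
        elif isinstance(a, Lobe):
            bare.attachments.append(Lobe([erase_paints(n) for n in a.accoutrements]))
        else:
            bare.attachments.append(a)
    return bare

def delete_accoutrement(lobe: Lobe, index: int) -> None:
    """Delete the substructure at the given position, reducing it to a trivial point."""
    lobe.accoutrements[index] = Node()
</pre>
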
A ''bare PARC'', loosely referred to as a ''bare cactus'', is a PARC on the empty palette <math>\mathfrak{P} = \varnothing.</math> In other veins, a bare cactus can be described in several different ways, depending on how the form arises in practice.
<ol style="list-style-type:decimal">

<li>Leaning on the definition of a bare PARCE, a bare PARC can be described as the kind of a parse graph that results from parsing a bare cactus expression, in other words, as the kind of a graph that issues from the requirements of processing a sentence of the bare cactus language <math>\mathfrak{C}^0 = \operatorname{PARCE}^0.</math></li>

<li>To express it more in its own terms, a bare PARC can be defined by tracing the recursive definition of a generic PARC, but then by detaching an independent form of description from the source of that analogy.  The method is sufficiently sketched as follows:</li>

<ol style="list-style-type:lower-latin">

<li>A ''bare PARC'' is a PARC whose attachments are limited to blanks and ''bare lobes''.</li>

<li>A ''bare lobe'' is a lobe whose accoutrements are limited to bare PARC's.</li>

</ol>

<li>In practice, a bare cactus is usually encountered in the process of analyzing or handling an arbitrary PARC, the circumstances of which frequently call for deleting or erasing all of its paints.  In particular, this generally makes it easier to observe the various properties of its underlying graphical structure.</li>

</ol>

===The Cactus Language : Semantics===

{| align="center" cellpadding="0" cellspacing="0" width="90%"
|
<p>Alas, and yet what ''are'' you, my written and painted thoughts!  It is not long ago that you were still so many-coloured, young and malicious, so full of thorns and hidden spices you made me sneeze and laugh &mdash; and now?  You have already taken off your novelty and some of you, I fear, are on the point of becoming truths:  they already look so immortal, so pathetically righteous, so boring!</p>
|-
| align="right" | &mdash; Nietzsche, ''Beyond Good and Evil'', [Nie-2, ¶ 296]
|}

In this Subsection, I describe a particular semantics for the painted cactus language, telling what meanings I aim to attach to its bare syntactic forms.  This supplies an ''interpretation'' for this parametric family of formal languages, but it is good to remember that it forms just one of many such interpretations that are conceivable and even viable.  Indeed, the distinction between the object domain and the sign domain can be observed in the fact that many languages can be deployed to depict the same set of objects and that any language worth its salt is bound to give rise to many different forms of interpretive saliency.

In formal settings, it is common to speak of interpretation as if it created a direct connection between the signs of a formal language and the objects of the intended domain, in other words, as if it determined the denotative component of a sign relation.  But closer attention to what goes on reveals that the process of interpretation is more indirect, that what it does is to provide each sign of a prospectively meaningful source language with a translation into an already established target language, where ''already established'' means that its relationship to pragmatic objects is taken for granted at the moment in question.

With this in mind, it is clear that interpretation is an affair of signs that at best respects the objects of all of the signs that enter into it, and so it is the connotative aspect of semiotics that is at stake here.  There is nothing wrong with my saying that I interpret a sentence of a formal language as a sign that refers to a function or to a proposition, so long as you understand that this reference is likely to be achieved by way of more familiar and perhaps less formal signs that you already take to denote those objects.

On entering a context where a logical interpretation is intended for the sentences of a formal language, there are a few conventions that make it easier to make the translation from abstract syntactic forms to their intended semantic senses.  Although these conventions are expressed in unnecessarily colorful terms, from a purely abstract point of view, they do provide a useful array of connotations that help to negotiate what is otherwise a difficult transition.  This terminology is introduced as the need for it arises in the process of interpreting the cactus language.

The task of this Subsection is to specify a ''semantic function'' for the sentences of the cactus language <math>\mathfrak{L} = \mathfrak{C}(\mathfrak{P}),</math> in other words, to define a mapping that "interprets" each sentence of <math>\mathfrak{C}(\mathfrak{P})</math> as a sentence that says something, as a sentence that bears a meaning, in short, as a sentence that denotes a proposition, and thus as a sign of an indicator function.  When the syntactic sentences of a formal language are given a referent significance in logical terms, for example, as denoting propositions or indicator functions, then each form of syntactic combination takes on a corresponding form of logical significance.

By way of providing a logical interpretation for the cactus language, I introduce a family of operators on indicator functions that are called ''propositional connectives'', and I distinguish these from the associated family of syntactic combinations that are called ''sentential connectives'', where the relationship between these two realms of connection is exactly that between objects and their signs.  A propositional connective, as an entity of a well-defined functional and operational type, can be treated in every way as a logical or a mathematical object, and thus as the type of object that can be denoted by the corresponding form of syntactic entity, namely, the sentential connective that is appropriate to the case in question.

There are two basic types of connectives, called the ''blank connectives'' and the ''bound connectives'', respectively, with one connective of each type for each natural number <math>k = 0, 1, 2, 3, \ldots.</math>

<ol style="list-style-type:decimal">

<li>
<p>The ''blank connective'' of <math>k\!</math> places is signified by the concatenation of the <math>k\!</math> sentences that fill those places.</p>
<p>For the special case of <math>k = 0,\!</math> the blank connective is taken to be an empty string or a blank symbol &mdash; it does not matter which, since both are assigned the same denotation among propositions.</p>
between the logical values {false, true} and the algebraic values {0, 1}.
  −
This makes it legitimate to write a sentence directly into the right side
  −
of the set-builder expression, for instance, weaving the sentence S or the
  −
sentence "x is not equal to y" into the context "{<x, y> in %B%^2 : ... }",
  −
thereby obtaining the corresponding expressions listed above, while the
  −
proposition F(x, y) can also be asserted more directly without equating
  −
it to %1%, since it already has a value in {false, true}, and thus can
  −
be taken as tantamount to an actual sentence.
     −
If the appropriate safeguards can be kept in mind, avoiding all danger of
+
<p>For the generic case of <math>k > 0,\!</math> the blank connective takes the form <math>s_1 \cdot \ldots \cdot s_k.</math>  In the type of data that is called a ''text'', the use of the center dot <math>(\cdot)</math> is generally supplanted by whatever number of spaces and line breaks serve to improve the readability of the resulting text.</p></li>
confusing propositions with sentences and sentences with assertions, then
  −
the marks of these distinctions need not be forced to clutter the account
  −
of the more substantive indications, that is, the ones that really matter.
  −
If this level of understanding can be achieved, then it may be possible
  −
to relax these restrictions, along with the absolute dichotomy between
  −
algebraic and logical values, which tends to inhibit the flexibility
  −
of interpretation.
     −
This covers the properties of the connection F(x, y) = -(x, y)-,
+
<li>
treated as a proposition about things in the universe X = %B%^2.
+
<p>The ''bound connective'' of <math>k\!</math> places is signified by the surcatenation of the <math>k\!</math> sentences that fill those places.</p>
Staying with this same connection, it is time to demonstrate how
  −
it can be "stretched" into an operator on arbitrary propositions.
     −
To continue the exercise, let p and q be arbitrary propositions about
+
<p>For the special case of <math>k = 0,\!</math> the bound connective is taken to be an empty closure &mdash; an expression enjoying one of the forms <math>\underline{(} \underline{)}, \, \underline{(} ~ \underline{)}, \, \underline{(} ~~ \underline{)}, \, \ldots</math> with any number of blank symbols between the parentheses &mdash; all of which are assigned the same logical denotation among propositions.</p>
things in the universe X, that is, maps of the form p, q : X -> %B%,
  −
and suppose that p, q are indicator functions of the sets P, Q c X,
  −
respectively.  In other words, one has the following set of data:
     −
p     =        -{P}-        :  X -> %B%
+
<p>For the generic case of <math>k > 0,\!</math> the bound connective takes the form <math>\underline{(} s_1, \ldots, s_k \underline{)}.</math></p></li>
|
  −
|  q    =        -{Q}-        :  X -> %B%
  −
|
  −
| <p, q> < -{P}- , -{Q}- > :  (X -> %B%)^2
     −
Then one has an operator F^$, the stretch of the connection F over X,
+
</ol>
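
For readers who find it helpful to see the syntactic conventions in executable form, the following minimal Python sketch builds the two kinds of connective expressions from a list of component sentences.  It is only an illustration of the definitions above:  the function names <code>blank_connective</code> and <code>bound_connective</code> are conveniences of the sketch, not notation used elsewhere in this text, and ordinary parentheses stand in for the underlined parentheses of the formal syntax.

<pre>
def blank_connective(sentences):
    """Concatenation: join the k sentences with blanks (the k-place blank connective).

    For k = 0 this returns the empty string, matching the convention that the
    0-place blank connective is an empty string or a blank symbol.
    """
    return " ".join(sentences)

def bound_connective(sentences):
    """Surcatenation: wrap the k sentences as '(s_1, ..., s_k)' (the k-place bound connective).

    For k = 0 this returns the empty closure '()'.
    """
    return "(" + ", ".join(sentences) + ")"

if __name__ == "__main__":
    print(blank_connective(["p", "q", "r"]))   # p q r
    print(bound_connective(["p", "q", "r"]))   # (p, q, r)
    print(bound_connective([]))                # ()
</pre>
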
At this point, there are actually two different dialects, scripts, or modes of presentation for the cactus language that need to be interpreted, in other words, that need to have a semantic function defined on their domains.

<ol style="list-style-type:lower-alpha">

<li>There is the literal formal language of strings in <math>\operatorname{PARCE} (\mathfrak{P}),</math> the ''painted and rooted cactus expressions'' that constitute the language <math>\mathfrak{L} = \mathfrak{C} (\mathfrak{P}) \subseteq \mathfrak{A}^* = (\mathfrak{M} \cup \mathfrak{P})^*.</math></li>

<li>There is the figurative formal language of graphs in <math>\operatorname{PARC} (\mathfrak{P}),</math> the ''painted and rooted cacti'' themselves, a parametric family of graphs or a species of computational data structures that is graphically analogous to the language of literal strings.</li>

</ol>

Of course, these two modalities of formal language, like written and spoken natural languages, are meant to have compatible interpretations, and so it is usually sufficient to give just the meanings of either one.  All that remains is to provide a ''codomain'' or a ''target space'' for the intended semantic function, in other words, to supply a suitable range of logical meanings for the memberships of these languages to map into.  Out of the many interpretations that are formally possible to arrange, one way of doing this proceeds by making the following definitions:

<ol style="list-style-type:decimal">

<li>
<p>The ''conjunction'' <math>\operatorname{Conj}_j^J q_j</math> of a set of propositions, <math>\{ q_j : j \in J \},</math> is a proposition that is true if and only if every one of the <math>q_j\!</math> is true.</p>

<p><math>\operatorname{Conj}_j^J q_j</math> is true &nbsp;<math>\Leftrightarrow</math>&nbsp; <math>q_j\!</math> is true for every <math>j \in J.</math></p></li>

<li>
<p>The ''surjunction'' <math>\operatorname{Surj}_j^J q_j</math> of a set of propositions, <math>\{ q_j : j \in J \},</math> is a proposition that is true if and only if exactly one of the <math>q_j\!</math> is untrue.</p>

<p><math>\operatorname{Surj}_j^J q_j</math> is true &nbsp;<math>\Leftrightarrow</math>&nbsp; <math>q_j\!</math> is untrue for a unique <math>j \in J.</math></p></li>

</ol>

If the number of propositions that are being joined together is finite, then the conjunction and the surjunction can be represented by means of sentential connectives, incorporating the sentences that represent these propositions into finite strings of symbols.

If <math>J\!</math> is finite, for instance, if <math>J\!</math> consists of the integers in the interval <math>j = 1 ~\text{to}~ k,</math> and if each proposition <math>q_j\!</math> is represented by a sentence <math>s_j,\!</math> then the following strategies of expression are open:

<ol style="list-style-type:decimal">

<li>
<p>The conjunction <math>\operatorname{Conj}_j^J q_j</math> can be represented by a sentence that is constructed by concatenating the <math>s_j\!</math> in the following fashion:</p>

<p><math>\operatorname{Conj}_j^J q_j ~\leftrightsquigarrow~ s_1 s_2 \ldots s_k.</math></p></li>

<li>
<p>The surjunction <math>\operatorname{Surj}_j^J q_j</math> can be represented by a sentence that is constructed by surcatenating the <math>s_j\!</math> in the following fashion:</p>

<p><math>\operatorname{Surj}_j^J q_j ~\leftrightsquigarrow~ \underline{(} s_1, s_2, \ldots, s_k \underline{)}.</math></p></li>
</ol>

If one opts for a mode of interpretation that moves more directly from the parse graph of a sentence to the potential logical meaning of both the PARC and the PARCE, then the following specifications are in order:

A cactus rooted at a particular node is taken to represent what that node denotes, its logical denotation or its logical interpretation.

# The logical denotation of a node is the logical conjunction of that node's arguments, which are defined as the logical denotations of that node's attachments.  The logical denotation of either a blank symbol or an empty node is the boolean value <math>\underline{1} = \operatorname{true}.</math>  The logical denotation of the paint <math>\mathfrak{p}_j\!</math> is the proposition <math>p_j,\!</math> a proposition that is regarded as ''primitive'', at least, with respect to the level of analysis that is represented in the current instance of <math>\mathfrak{C} (\mathfrak{P}).</math>
# The logical denotation of a lobe is the logical surjunction of that lobe's arguments, which are defined as the logical denotations of that lobe's accoutrements.  As a corollary, the logical denotation of the parse graph of <math>\underline{(} \underline{)},</math> otherwise called a ''needle'', is the boolean value <math>\underline{0} = \operatorname{false}.</math>
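
The two specifications above translate directly into a recursive procedure.  The following Python sketch, offered only as an illustration and not as part of the formal account, parses a painted and rooted cactus expression into a rudimentary tree and computes its logical denotation under an assignment of boolean values to the paints.  The grammar it accepts is a simplification assumed for the sketch:  paints are single letters, ordinary parentheses stand in for the underlined parentheses, and the tree format is an ad hoc choice.

<pre>
def parse(text):
    """Parse a painted and rooted cactus expression (PARCE) into a simple tree.

    Tree format (an assumption of this sketch): ('node', args) for a
    concatenation of attachments, ('lobe', args) for a surcatenation of
    accoutrements, and a bare letter for a paint.
    """
    pos = 0

    def parse_node(stoppers):
        nonlocal pos
        args = []
        while pos < len(text) and text[pos] not in stoppers:
            ch = text[pos]
            if ch.isspace():
                pos += 1                       # blank connective: concatenation
            elif ch == '(':
                pos += 1                       # consume '('
                lobe_args = [parse_node(',)')]
                while pos < len(text) and text[pos] == ',':
                    pos += 1                   # consume ','
                    lobe_args.append(parse_node(',)'))
                pos += 1                       # consume ')'
                args.append(('lobe', lobe_args))
            else:
                args.append(ch)                # a paint such as 'x' or 'y'
                pos += 1
        return ('node', args)

    return parse_node('')

def denotation(tree, assignment):
    """Logical denotation of a parse tree, following the two rules above."""
    if isinstance(tree, str):                  # a paint denotes its assigned value
        return assignment[tree]
    kind, args = tree
    values = [denotation(arg, assignment) for arg in args]
    if kind == 'node':                         # node: conjunction of attachments
        return all(values)                     # an empty node denotes true
    return sum(1 for v in values if not v) == 1   # lobe: surjunction of accoutrements

if __name__ == "__main__":
    for x in (True, False):
        for y in (True, False):
            value = denotation(parse("(x (y))"), {'x': x, 'y': y})
            print(x, y, value)                 # reproduces the truth table of "x implies y"
</pre>

Note that the empty closure <code>()</code> comes out of this parser as a lobe over a single blank node, which nevertheless receives the value false, in agreement with the denotation assigned to the needle.
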
   −
If one takes the point of view that PARCs and PARCEs amount to a pair of intertranslatable languages for the same domain of objects, then denotation brackets of the form <math>\downharpoonleft \ldots \downharpoonright</math> can be used to indicate the logical denotation <math>\downharpoonleft C_j \downharpoonright</math> of a cactus <math>C_j\!</math> or the logical denotation <math>\downharpoonleft s_j \downharpoonright</math> of a sentence <math>s_j.\!</math>

Tables&nbsp;14 and 15 summarize the relations that serve to connect the formal language of sentences with the logical language of propositions.  Between these two realms of expression there is a family of graphical data structures that arise in parsing the sentences and that serve to facilitate the performance of computations on the indicator functions.  The graphical language supplies an intermediate form of representation between the formal sentences and the indicator functions, and the form of mediation that it provides is very useful in rendering the possible connections between the other two languages conceivable in fact, not to mention in carrying out the necessary translations on a practical basis.  These Tables include this intermediate domain in their Central Columns.  Between their First and Middle Columns they illustrate the mechanics of parsing the abstract sentences of the cactus language into the graphical data structures of the corresponding species.  Between their Middle and Final Columns they summarize the semantics of interpreting the graphical forms of representation for the purposes of reasoning with propositions.

<br>

{| align="center" border="1" cellpadding="8" cellspacing="0" style="text-align:center; width:80%"
 +
|+ style="height:30px" | <math>\text{Table 14.} ~~ \text{Semantic Translation : Functional Form}\!</math>
 +
|- style="height:40px; background:ghostwhite"
 +
|
 +
{| align="center" border="0" cellpadding="8" cellspacing="0" style="background:ghostwhite; width:100%"
 +
| width="20%" | <math>\mathrm{Sentence}\!</math>
 +
| width="20%" | <math>\xrightarrow[\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}]{\mathrm{Parse}}\!</math>
 +
| width="20%" | <math>\mathrm{Graph}\!</math>
 +
| width="20%" | <math>\xrightarrow[\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}]{\mathrm{Denotation}}\!</math>
 +
| width="20%" | <math>\mathrm{Proposition}\!</math>
 +
|}
 +
|-
 +
|
 +
{| align="center" border="0" cellpadding="8" cellspacing="0" width="100%"
 +
| width="20%" | <math>s_j\!</math>
 +
| width="20%" | <math>\xrightarrow{\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}}\!</math>
 +
| width="20%" | <math>C_j\!</math>
 +
| width="20%" | <math>\xrightarrow{\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}}\!</math>
 +
| width="20%" | <math>q_j\!</math>
 +
|}
 +
|-
 +
|
 +
{| align="center" border="0" cellpadding="8" cellspacing="0" width="100%"
 +
| width="20%" | <math>\mathrm{Conc}^0\!</math>
 +
| width="20%" | <math>\xrightarrow{\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}}\!</math>
 +
| width="20%" | <math>\mathrm{Node}^0\!</math>
 +
| width="20%" | <math>\xrightarrow{\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}}\!</math>
 +
| width="20%" | <math>\underline{1}\!</math>
 +
|-
 +
| width="20%" | <math>\mathrm{Conc}^k_j s_j\!</math>
 +
| width="20%" | <math>\xrightarrow{\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}}\!</math>
 +
| width="20%" | <math>\mathrm{Node}^k_j C_j\!</math>
 +
| width="20%" | <math>\xrightarrow{\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}}\!</math>
 +
| width="20%" | <math>\mathrm{Conj}^k_j q_j\!</math>
 +
|}
 +
|-
 +
|
 +
{| align="center" border="0" cellpadding="8" cellspacing="0" width="100%"
 +
| width="20%" | <math>\mathrm{Surc}^0\!</math>
 +
| width="20%" | <math>\xrightarrow{\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}}\!</math>
 +
| width="20%" | <math>\mathrm{Lobe}^0\!</math>
 +
| width="20%" | <math>\xrightarrow{\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}}\!</math>
 +
| width="20%" | <math>\underline{0}\!</math>
 +
|-
 +
| width="20%" | <math>\mathrm{Surc}^k_j s_j~\!</math>
 +
| width="20%" | <math>\xrightarrow{\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}}\!</math>
 +
| width="20%" | <math>\mathrm{Lobe}^k_j C_j\!</math>
 +
| width="20%" | <math>\xrightarrow{\mathrm{20:44, 2 August 2017 (UTC)20:44, 2 August 2017 (UTC)}}\!</math>
 +
| width="20%" | <math>\mathrm{Surj}^k_j q_j\!</math>
 +
|}
 +
|}
   −
<br>

{| align="center" border="1" cellpadding="8" cellspacing="0" style="text-align:center; width:80%"
|+ style="height:30px" | <math>\text{Table 15.} ~~ \text{Semantic Translation : Equational Form}\!</math>
|- style="height:40px; background:ghostwhite"
|
{| align="center" border="0" cellpadding="8" cellspacing="0" style="background:ghostwhite; width:100%"
| width="20%" | <math>\downharpoonleft \mathrm{Sentence} \downharpoonright\!</math>
| width="20%" | <math>\stackrel{\mathrm{Parse}}{=}\!</math>
| width="20%" | <math>\downharpoonleft \mathrm{Graph} \downharpoonright\!</math>
| width="20%" | <math>\stackrel{\mathrm{Denotation}}{=}\!</math>
| width="20%" | <math>\mathrm{Proposition}\!</math>
|}
|-
|
{| align="center" border="0" cellpadding="8" cellspacing="0" width="100%"
| width="20%" | <math>\downharpoonleft s_j \downharpoonright\!</math>
| width="20%" | <math>=\!</math>
| width="20%" | <math>\downharpoonleft C_j \downharpoonright\!</math>
| width="20%" | <math>=\!</math>
| width="20%" | <math>q_j\!</math>
|}
|-
|
{| align="center" border="0" cellpadding="8" cellspacing="0" width="100%"
| width="20%" | <math>\downharpoonleft \mathrm{Conc}^0 \downharpoonright\!</math>
| width="20%" | <math>=\!</math>
| width="20%" | <math>\downharpoonleft \mathrm{Node}^0 \downharpoonright\!</math>
| width="20%" | <math>=\!</math>
| width="20%" | <math>\underline{1}\!</math>
|-
| width="20%" | <math>\downharpoonleft \mathrm{Conc}^k_j s_j \downharpoonright\!</math>
| width="20%" | <math>=\!</math>
| width="20%" | <math>\downharpoonleft \mathrm{Node}^k_j C_j \downharpoonright\!</math>
| width="20%" | <math>=\!</math>
| width="20%" | <math>\mathrm{Conj}^k_j q_j\!</math>
|}
|-
|
{| align="center" border="0" cellpadding="8" cellspacing="0" width="100%"
| width="20%" | <math>\downharpoonleft \mathrm{Surc}^0 \downharpoonright\!</math>
| width="20%" | <math>=\!</math>
| width="20%" | <math>\downharpoonleft \mathrm{Lobe}^0 \downharpoonright\!</math>
| width="20%" | <math>=\!</math>
| width="20%" | <math>\underline{0}\!</math>
|-
| width="20%" | <math>\downharpoonleft \mathrm{Surc}^k_j s_j \downharpoonright\!</math>
| width="20%" | <math>=\!</math>
| width="20%" | <math>\downharpoonleft \mathrm{Lobe}^k_j C_j \downharpoonright\!</math>
| width="20%" | <math>=\!</math>
| width="20%" | <math>\mathrm{Surj}^k_j q_j\!</math>
|}
|}
   −
<br>

Aside from their common topic, the two Tables present slightly different ways of conceptualizing the operations that go to establish their maps.  Table&nbsp;14 records the functional associations that connect each domain with the next, taking the triplings of a sentence <math>s_j,\!</math> a cactus <math>C_j,\!</math> and a proposition <math>q_j\!</math> as basic data, and fixing the rest by recursion on these.  Table&nbsp;15 records these associations in the form of equations, treating sentences and graphs as alternative kinds of signs, and generalizing the denotation bracket operator to indicate the proposition that either denotes.  It should be clear at this point that either scheme of translation puts the sentences, the graphs, and the propositions that it associates with each other roughly in the roles of the signs, the interpretants, and the objects, respectively, whose triples define an appropriate sign relation.  Indeed, the "roughly" can be made "exactly" as soon as the domains of a suitable sign relation are specified precisely.

A good way to illustrate the action of the conjunction and surjunction operators is to demonstrate how they can be used to construct the boolean functions on any finite number of variables.  Let us begin by doing this for the first three cases, <math>k = 0, 1, 2.\!</math>

A boolean function <math>F^{(0)}\!</math> on <math>0\!</math> variables is just an element of the boolean domain <math>\underline\mathbb{B} = \{ \underline{0}, \underline{1} \}.</math>  Table&nbsp;16 shows several different ways of referring to these elements, just for the sake of consistency using the same format that will be used in subsequent Tables, no matter how degenerate it tends to appear in the initial case.

<br>
   −
{| align="center" border="1" cellpadding="8" cellspacing="0" style="text-align:center; width:80%"
|+ style="height:30px" | <math>\text{Table 16.} ~~ \text{Boolean Functions on Zero Variables}\!</math>
|- style="height:40px; background:ghostwhite"
| width="14%" | <math>F\!</math>
| width="14%" | <math>F\!</math>
| width="48%" | <math>F()\!</math>
| width="24%" | <math>F\!</math>
|-
| <math>\underline{0}\!</math>
| <math>F_0^{(0)}\!</math>
| <math>\underline{0}\!</math>
| <math>\texttt{(~)}\!</math>
|-
| <math>\underline{1}\!</math>
| <math>F_1^{(0)}\!</math>
| <math>\underline{1}\!</math>
| <math>\texttt{((~))}\!</math>
|}
   −
<br>

Column&nbsp;1 lists each boolean element or boolean function under its ordinary constant name or under a succinct nickname, respectively.

Column&nbsp;2 lists each boolean function in a style of function name <math>F_j^{(k)}\!</math> that is constructed as follows:  The superscript <math>(k)\!</math> gives the dimension of the functional domain, that is, the number of its functional variables, and the subscript <math>j\!</math> is a binary string that recapitulates the functional values, using the obvious translation of boolean values into binary values.

Column&nbsp;3 lists the functional values for each boolean function, or possibly a boolean element appearing in the guise of a function, for each combination of its domain values.

Column&nbsp;4 shows the usual expressions of these elements in the cactus language, conforming to the practice of omitting the underlines in display formats.  Here I illustrate also the convention of using the expression <math>{}^{\backprime\backprime} ((~)) ^{\prime\prime}</math> as a visible stand-in for the expression of the logical value <math>\operatorname{true},</math> a value that is minimally represented by a blank expression that tends to elude our giving it much notice in the context of more demonstrative texts.
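
A small Python sketch, again only by way of illustration, spells out the naming scheme just described.  It enumerates the boolean functions on <math>k\!</math> variables, pairing each decimal index <math>j\!</math> with the binary subscript that recapitulates the functional values; the convention of listing the argument tuples in descending order, from <math>(1, \ldots, 1)\!</math> down to <math>(0, \ldots, 0),\!</math> is taken from the Tables, and the function name <code>boolean_functions</code> is an invention of the sketch.

<pre>
from itertools import product

def boolean_functions(k):
    """Yield (decimal index, bit string, truth table) for all boolean functions on k variables.

    The bit string lists the value of F on each argument tuple, taken in the
    descending order F(1,...,1), ..., F(0,...,0) used by the Tables, so that
    the subscript read as a binary numeral is just the decimal index.
    """
    args = sorted(product((0, 1), repeat=k), reverse=True)   # (1,...,1) first, (0,...,0) last
    for j in range(2 ** (2 ** k)):
        bits = format(j, '0{}b'.format(2 ** k))              # e.g. '0110' for j = 6, k = 2
        table = dict(zip(args, (int(b) for b in bits)))
        yield j, bits, table

if __name__ == "__main__":
    for j, bits, table in boolean_functions(1):
        print("F_{}^(1) = F_{}^(1):".format(j, bits), table)
    # Two of the four rows are constant functions; F_01 is negation and F_10 is identity.
</pre>
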
Table&nbsp;17 presents the boolean functions on one variable, <math>F^{(1)} : \underline\mathbb{B} \to \underline\mathbb{B},</math> of which there are precisely four.

<br>

{| align="center" border="1" cellpadding="6" cellspacing="0" style="text-align:center; width:80%"
 +
|+ style="height:30px" | <math>\text{Table 17.} ~~ \text{Boolean Functions on One Variable}\!</math>
 +
|- style="height:40px; background:ghostwhite"
 +
| width="14%" | <math>F\!</math>
 +
| width="14%" | <math>F\!</math>
 +
| colspan="2" | <math>F(x)\!</math>
 +
| width="24%" | <math>F\!</math>
 +
|- style="height:40px; background:ghostwhite"
 +
| width="14%" | &nbsp;
 +
| width="14%" | &nbsp;
 +
| width="24%" | <math>F(\underline{1})</math>
 +
| width="24%" | <math>F(\underline{0})</math>
 +
| width="24%" | &nbsp;
 +
|-
 +
| <math>F_0^{(1)}\!</math>
 +
| <math>F_{00}^{(1)}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\texttt{(~)}\!</math>
 +
|-
 +
| <math>F_1^{(1)}\!</math>
 +
| <math>F_{01}^{(1)}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\texttt{(} x \texttt{)}\!</math>
 +
|-
 +
| <math>F_2^{(1)}\!</math>
 +
| <math>F_{10}^{(1)}~\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>x\!</math>
 +
|-
 +
| <math>F_3^{(1)}\!</math>
 +
| <math>F_{11}^{(1)}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\texttt{((~))}\!</math>
 +
|}
   −
<br>

Here, Column&nbsp;1 codes the contents of Column&nbsp;2 in a more concise form, compressing the lists of boolean values, recorded as bits in the subscript string, into their decimal equivalents.  Naturally, the boolean constants reprise themselves in this new setting as constant functions on one variable.  Thus, one has the synonymous expressions for constant functions that are expressed in the next two chains of equations:
   −
{| align="center" cellpadding="8" width="90%"
|
<math>\begin{matrix}
F_0^{(1)}
& = &
F_{00}^{(1)}
& = &
\underline{0} ~:~ \underline\mathbb{B} \to \underline\mathbb{B}
\\
\\
F_3^{(1)}
& = &
F_{11}^{(1)}
& = &
\underline{1} ~:~ \underline\mathbb{B} \to \underline\mathbb{B}
\end{matrix}</math>
|}

As for the rest, the other two functions are easily recognized as corresponding to the one-place logical connectives, or the monadic operators on <math>\underline\mathbb{B}.</math>  Thus, the function <math>F_1^{(1)} = F_{01}^{(1)}</math> is recognizable as the negation operation, and the function <math>F_2^{(1)} = F_{10}^{(1)}</math> is obviously the identity operation.
   −
Table&nbsp;18 presents the boolean functions on two variables, <math>F^{(2)} : \underline\mathbb{B}^2 \to \underline\mathbb{B},</math> of which there are precisely sixteen.

<br>

{| align="center" border="1" cellpadding="4" cellspacing="0" style="text-align:center; width:80%"
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
|+ style="height:30px" | <math>\text{Table 18.} ~~ \text{Boolean Functions on Two Variables}\!</math>
 
+
|- style="height:40px; background:ghostwhite"
Note 1
+
| width="14%" | <math>F\!</math>
 
+
| width="14%" | <math>F\!</math>
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
| colspan="4" | <math>F(x, y)\!</math>
 +
| width="24%" | <math>F\!</math>
 +
|- style="height:40px; background:ghostwhite"
 +
| width="14%" | &nbsp;
 +
| width="14%" | &nbsp;
 +
| width="12%" | <math>F(\underline{1}, \underline{1})</math>
 +
| width="12%" | <math>F(\underline{1}, \underline{0})</math>
 +
| width="12%" | <math>F(\underline{0}, \underline{1})</math>
 +
| width="12%" | <math>F(\underline{0}, \underline{0})</math>
 +
| width="24%" | &nbsp;
 +
|-
 +
| <math>F_{0}^{(2)}\!</math>
 +
| <math>F_{0000}^{(2)}~\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\texttt{(~)}\!</math>
 +
|-
 +
| <math>F_{1}^{(2)}\!</math>
 +
| <math>F_{0001}^{(2)}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\texttt{(} x \texttt{)(} y \texttt{)}\!</math>
 +
|-
 +
| <math>F_{2}^{(2)}\!</math>
 +
| <math>F_{0010}^{(2)}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\texttt{(} x \texttt{)} y\!</math>
 +
|-
 +
| <math>F_{3}^{(2)}\!</math>
 +
| <math>F_{0011}^{(2)}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\texttt{(} x \texttt{)}\!</math>
 +
|-
 +
| <math>F_{4}^{(2)}\!</math>
 +
| <math>F_{0100}^{(2)}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>x \texttt{(} y \texttt{)}\!</math>
 +
|-
 +
| <math>F_{5}^{(2)}\!</math>
 +
| <math>F_{0101}^{(2)}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\texttt{(} y \texttt{)}\!</math>
 +
|-
 +
| <math>F_{6}^{(2)}\!</math>
 +
| <math>F_{0110}^{(2)}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\texttt{(} x \texttt{,} y \texttt{)}\!</math>
 +
|-
 +
| <math>F_{7}^{(2)}\!</math>
 +
| <math>F_{0111}^{(2)}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\texttt{(} x y \texttt{)}\!</math>
 +
|-
 +
| <math>F_{8}^{(2)}\!</math>
 +
| <math>F_{1000}^{(2)}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>x y\!</math>
 +
|-
 +
| <math>F_{9}^{(2)}\!</math>
 +
| <math>F_{1001}^{(2)}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\texttt{((} x \texttt{,} y \texttt{))}\!</math>
 +
|-
 +
| <math>F_{10}^{(2)}\!</math>
 +
| <math>F_{1010}^{(2)}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>y\!</math>
 +
|-
 +
| <math>F_{11}^{(2)}\!</math>
 +
| <math>F_{1011}^{(2)}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\texttt{(} x \texttt{(} y \texttt{))}\!</math>
 +
|-
 +
| <math>F_{12}^{(2)}\!</math>
 +
| <math>F_{1100}^{(2)}~\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>x\!</math>
 +
|-
 +
| <math>F_{13}^{(2)}\!</math>
 +
| <math>F_{1101}^{(2)}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\texttt{((} x \texttt{)} y \texttt{)}\!</math>
 +
|-
 +
| <math>F_{14}^{(2)}\!</math>
 +
| <math>F_{1110}^{(2)}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{0}\!</math>
 +
| <math>\texttt{((} x \texttt{)(} y \texttt{))}\!</math>
 +
|-
 +
| <math>F_{15}^{(2)}\!</math>
 +
| <math>F_{1111}^{(2)}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\underline{1}\!</math>
 +
| <math>\texttt{((~))}\!</math>
 +
|}
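
The sixteen connections in Table&nbsp;18 can be generated mechanically from their indices.  The following Python sketch is not part of the original text; it builds each <math>F_{i}^{(2)}</math> from the four binary digits of <math>i\!</math> under the row convention of the Table and spot-checks a few of the named functions.  The helper name <code>make_connection</code> is ad hoc.

<pre>
# A minimal sketch (assumed code): build each connection F_i on two variables
# from the four binary digits of i, read as F(1,1), F(1,0), F(0,1), F(0,0)
# in the row convention of Table 18, then spot-check some named functions.

def make_connection(i):
    bits = [(i >> 3) & 1, (i >> 2) & 1, (i >> 1) & 1, i & 1]
    table = {(1, 1): bits[0], (1, 0): bits[1], (0, 1): bits[2], (0, 0): bits[3]}
    return lambda x, y: table[(x, y)]

F = {i: make_connection(i) for i in range(16)}

for x in (0, 1):
    for y in (0, 1):
        assert F[8](x, y) == x & y            # F_8  : conjunction  x y
        assert F[14](x, y) == x | y           # F_14 : disjunction  ((x)(y))
        assert F[6](x, y) == x ^ y            # F_6  : exclusive disjunction (x, y)
        assert F[9](x, y) == int(x == y)      # F_9  : equivalence ((x, y))
</pre>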
   −
One of the first things that you can do, once you
+
<br>
have a really decent calculus for boolean functions
  −
or propositional logic, whatever you want to call it,
  −
is to compute the differentials of these functions or
  −
propositions.
     −
Now there are many ways to dance around this idea,
+
As before, all of the boolean functions of fewer variables are subsumed in this Table, though under a set of alternative names and possibly different interpretations.  Just to acknowledge a few of the more notable pseudonyms:
and I feel like I have tried them all, before one
  −
gets down to acting on it, and there many issues
  −
of interpretation and justification that we will
  −
have to clear up after the fact, that is, before
  −
we can be sure that it all really makes any sense,
  −
but I think this time I'll just jump in, and show
  −
you the form in which this idea first came to me.
     −
Start with a proposition of the form x & y, which
+
: The constant function <math>\underline{0} ~:~ \underline\mathbb{B}^2 \to \underline\mathbb{B}</math> appears under the name <math>F_{0}^{(2)}.</math>
I graph as two labels attached to a root node, so:
     −
o---------------------------------------o
+
: The constant function <math>\underline{1} ~:~ \underline\mathbb{B}^2 \to \underline\mathbb{B}</math> appears under the name <math>F_{15}^{(2)}.</math>
|                                      |
  −
|                  x y                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
|                x and y                |
  −
o---------------------------------------o
     −
Written as a string, this is just the concatenation "x y".
+
: The negation and identity of the first variable are <math>F_{3}^{(2)}</math> and <math>F_{12}^{(2)},</math> respectively.
   −
The proposition xy may be taken as a boolean function f(x, y)
+
: The negation and identity of the second variable are <math>F_{5}^{(2)}</math> and <math>F_{10}^{(2)},</math> respectively.
having the abstract type f : B x B -> B, where B = {0, 1} is
  −
read in such a way that 0 means "false" and 1 means "true".
     −
In this style of graphical representation,
+
: The logical conjunction is given by the function <math>F_{8}^{(2)} (x, y) = x \cdot y.</math>
the value "true" looks like a blank label
  −
and the value "false" looks like an edge.
     −
o---------------------------------------o
+
: The logical disjunction is given by the function <math>F_{14}^{(2)} (x, y) = \underline{((} ~x~ \underline{)(} ~y~ \underline{))}.</math>
|                                      |
  −
|                                      |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
|                true                  |
  −
o---------------------------------------o
     −
o---------------------------------------o
+
Functions expressing the ''conditionals'', ''implications'', or ''if-then'' statements are given in the following ways:
|                                      |
  −
|                  o                  |
  −
|                  |                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
|                false                |
  −
o---------------------------------------o
     −
Back to the proposition xy. Imagine yourself standing
+
: <math>[x \Rightarrow y] = F_{11}^{(2)} (x, y) = \underline{(} ~x~ \underline{(} ~y~ \underline{))} = [\operatorname{not}~ x ~\operatorname{without}~ y].</math>
in a fixed cell of the corresponding venn diagram, say,
  −
the cell where the proposition xy is true, as pictured:
     −
o---------------------------------------o
+
: <math>[x \Leftarrow y] = F_{13}^{(2)} (x, y) = \underline{((} ~x~ \underline{)} ~y~ \underline{)} = [\operatorname{not}~ y ~\operatorname{without}~ x].</math>
|                                      |
  −
|                o    o                |
  −
|              / \   / \              |
  −
|              /  \ /  \              |
  −
|            /    ·    \            |
  −
|            /    /%\    \            |
  −
|          /    /%%%\    \          |
  −
|          /    /%%%%%\    \          |
  −
|        /    /%%%%%%%\    \        |
  −
|        /    /%%%%%%%%%\    \        |
  −
|      o  x o%%%%%%%%%%%o  y o      |
  −
|        \     \%%%%%%%%%/    /        |
  −
|        \     \%%%%%%%/    /        |
  −
|          \    \%%%%%/    /          |
  −
|          \    \%%%/    /          |
  −
|            \    \%/    /            |
  −
|            \    ·    /            |
  −
|              \  / \  /              |
  −
|              \ /  \ /               |
  −
|                o    o                |
  −
|                                      |
  −
o---------------------------------------o
     −
Now ask yourself:  What is the value of the
+
The function that corresponds to the ''biconditional'', the ''equivalence'', or the ''if and only'' statement is exhibited in the following fashion:
proposition xy at a distance of dx and dy
  −
from the cell xy where you are standing?
     −
Don't think about it -- just compute:
+
: <math>[x \Leftrightarrow y] = [x = y] = F_{9}^{(2)} (x, y) = \underline{((} ~x~,~y~ \underline{))}.</math>
   −
o---------------------------------------o
+
Finally, there is a boolean function that is logically associated with the ''exclusive disjunction'', ''inequivalence'', or ''not equals'' statement, algebraically associated with the ''binary sum'' operation, and geometrically associated with the ''symmetric difference'' of sets.  This function is given by:
|                                      |
  −
|              dx o  o dy              |
  −
|                / \ / \                |
  −
|            x o---@---o y            |
  −
|                                      |
  −
o---------------------------------------o
  −
|        (x + dx) and (y + dy)        |
  −
o---------------------------------------o
     −
To make future graphs easier to draw in Ascii land,
+
: <math>[x \neq y] = [x + y] = F_{6}^{(2)} (x, y) = \underline{(} ~x~,~y~ \underline{)}.</math>
I will use devices like @=@=@ and o=o=o to identify
  −
several nodes into one, as in this next redrawing:
     −
o---------------------------------------o
+
Let me now address one last question that may have occurred to some.  What has happened, in this suggested scheme of functional reasoning, to the distinction that is quite pointedly made by careful logicians between (1) the connectives called ''conditionals'', symbolized by the signs <math>(\rightarrow)</math> and <math>(\leftarrow),</math> and (2) the assertions called ''implications'', symbolized by the signs <math>(\Rightarrow)</math> and <math>(\Leftarrow)</math>?  And, as a related question, what has happened to the distinction that is equally insistently made between (3) the connective called the ''biconditional'', signified by the sign <math>(\leftrightarrow),</math> and (4) the assertion called an ''equivalence'', signified by the sign <math>(\Leftrightarrow)</math>?  My answer is this:  For my part, I am deliberately avoiding making these distinctions at the level of syntax, preferring to treat them instead as distinctions in the use of boolean functions.  The difference turns on whether the function is mentioned directly and used to compute values on arguments, or whether its inverse is invoked to indicate the fibers of truth or untruth under the propositional function in question.
|                                      |
  −
|              x dx y  dy              |
  −
|              o---o o---o              |
  −
|              \ | |  /               |
  −
|                \ | | /               |
  −
|                \| |/                 |
  −
|                  @=@                  |
  −
|                                      |
  −
o---------------------------------------o
  −
|        (x + dx) and (y + dy)         |
  −
o---------------------------------------o
     −
However you draw it, these expressions follow because the
+
===Stretching Exercises===
expression x + dx, where the plus sign indicates (mod 2)
  −
addition in B, and thus corresponds to an exclusive-or
  −
in logic, parses to a graph of the following form:
     −
o---------------------------------------o
+
The arrays of boolean connections described above, namely, the boolean functions <math>F^{(k)} : \underline\mathbb{B}^k \to \underline\mathbb{B},</math> for <math>k\!</math> in <math>\{ 0, 1, 2 \},\!</math> supply enough material to demonstrate the use of the stretch operation in a variety of concrete cases.
|                                      |
  −
|                x    dx                |
  −
|                o---o                |
  −
|                  \ /                 |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
|                x + dx                |
  −
o---------------------------------------o
     −
Next question: What is the difference between
+
For example, suppose that <math>F\!</math> is a connection of the form <math>F : \underline\mathbb{B}^2 \to \underline\mathbb{B},</math> that is, any one of the sixteen possibilities in Table&nbsp;18, while <math>p\!</math> and <math>q\!</math> are propositions of the form <math>p, q : X \to \underline\mathbb{B},</math> that is, propositions about things in the universe <math>X,\!</math> or else the indicators of sets contained in <math>X.\!</math>
the value of the proposition xy "over there" and
  −
the value of the proposition xy where you are, all
  −
expressed as general formula, of course?  Here 'tis:
     −
o---------------------------------------o
+
Then one has the imagination <math>\underline{f} = (f_1, f_2) = (p, q) : (X \to \underline\mathbb{B})^2,</math> and the stretch of the connection <math>F\!</math> to <math>\underline{f}\!</math> on <math>X\!</math> amounts to a proposition <math>F^\$ (p, q) : X \to \underline\mathbb{B}</math> that may be read as the ''stretch of <math>F\!</math> to <math>p\!</math> and <math>q.\!</math>''  If one is concerned with many different propositions about things in <math>X,\!</math> or if one is abstractly indifferent to the particular choices for <math>p\!</math> and <math>q,\!</math> then one may detach the operator <math>F^\$ : (X \to \underline\mathbb{B})^2 \to (X \to \underline\mathbb{B}),</math> called the ''stretch of <math>F\!</math> over <math>X,\!</math>'' and consider it in isolation from any concrete application.
|                                      |
  −
|        x  dx y  dy                    |
  −
|        o---o o---o                    |
  −
|        \ | |  /                     |
  −
|          \ | | /                     |
  −
|          \| |/         x y          |
  −
|            o=o-----------o            |
  −
|            \           /             |
  −
|              \         /             |
  −
|              \       /               |
  −
|                \     /               |
  −
|                \   /                 |
  −
|                  \ /                 |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
|      ((x + dx) & (y + dy)) - xy      |
  −
o---------------------------------------o
     −
Oh, I forgot to mention:  Computed over B,
+
When the cactus notation is used to represent boolean functions, a single <math>\$</math> sign at the end of the expression is enough to remind the reader that the connections are meant to be stretched to several propositions on a universe <math>X.\!</math>
plus and minus are the very same operation.
  −
This will make the relationship between the
  −
differential and the integral parts of the
  −
resulting calculus slightly stranger than
  −
usual, but never mind that now.
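
A one-line check of this point, as a minimal Python sketch (not part of the original notes):

<pre>
# A minimal sketch (assumed code): over B = GF(2), addition and subtraction
# are the same operation, and both are realized by exclusive-or.

for a in (0, 1):
    for b in (0, 1):
        assert (a + b) % 2 == (a - b) % 2 == a ^ b
</pre>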
     −
Last question, for now: What is the value of this expression
+
For example, take the connection <math>F : \underline\mathbb{B}^2 \to \underline\mathbb{B}</math> such that:
from your current standpoint, that is, evaluated at the point
  −
where xy is true?  Well, substituting 1 for x and 1 for y in
  −
the graph amounts to the same thing as erasing those labels:
     −
o---------------------------------------o
+
: <math>F(x, y) ~=~ F_{6}^{(2)} (x, y) ~=~ \underline{(}~x~,~y~\underline{)}\!</math>
|                                      |
  −
|          dx    dy                    |
  −
|        o---o o---o                    |
  −
|        \  | |  /                    |
  −
|          \ | | /                      |
  −
|          \| |/                      |
  −
|            o=o-----------o            |
  −
|            \          /            |
  −
|              \        /              |
  −
|              \      /              |
  −
|                \    /                |
  −
|                \  /                |
  −
|                  \ /                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
|      ((1 + dx) & (1 + dy)) - 1·1      |
  −
o---------------------------------------o
     −
And this is equivalent to the following graph:
+
The connection in question is a boolean function on the variables <math>x, y\!</math> that returns a value of <math>\underline{1}</math> just when just one of the pair <math>x, y\!</math> is not equal to <math>\underline{1},</math> or what amounts to the same thing, just when just one of the pair <math>x, y\!</math> is equal to <math>\underline{1}.</math>  There is clearly an isomorphism between this connection, viewed as an operation on the boolean domain <math>\underline\mathbb{B} = \{ \underline{0}, \underline{1} \},</math> and the dyadic operation on binary values <math>x, y \in \mathbb{B} = \operatorname{GF}(2)\!</math> that is otherwise known as <math>x + y.\!</math>
   −
o---------------------------------------o
+
The same connection <math>F : \underline\mathbb{B}^2 \to \underline\mathbb{B}</math> can also be read as a proposition about things in the universe <math>X = \underline\mathbb{B}^2.</math>  If <math>s\!</math> is a sentence that denotes the proposition <math>F,\!</math> then the corresponding assertion says exactly what one states in uttering the sentence <math>^{\backprime\backprime} \, x ~\operatorname{is~not~equal~to}~ y \, ^{\prime\prime}.</math>  In such a case, one has <math>\downharpoonleft s \downharpoonright \, = F,</math> and all of the following expressions are ordinarily taken as equivalent descriptions of the same set:
|                                      |
  −
|                dx  dy                |
  −
|                o  o                |
  −
|                  \ /                 |
  −
|                  o                  |
  −
|                  |                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
|              dx or dy                |
  −
o---------------------------------------o
     −
Have to break here -- will explain later.
+
{| align="center" cellpadding="8" width="90%"
 +
|
 +
<math>\begin{array}{lll}
 +
[| \downharpoonleft s \downharpoonright |]
 +
& = & [| F |]
 +
\\[6pt]
 +
& = & F^{-1} (\underline{1})
 +
\\[6pt]
 +
& = & \{~ (x, y) \in \underline\mathbb{B}^2 ~:~ s ~\}
 +
\\[6pt]
 +
& = & \{~ (x, y) \in \underline\mathbb{B}^2 ~:~ F(x, y) = \underline{1} ~\}
 +
\\[6pt]
 +
& = & \{~ (x, y) \in \underline\mathbb{B}^2 ~:~ F(x, y) ~\}
 +
\\[6pt]
 +
& = & \{~ (x, y) \in \underline\mathbb{B}^2 ~:~ \underline{(}~x~,~y~\underline{)} = \underline{1} ~\}
 +
\\[6pt]
 +
& = & \{~ (x, y) \in \underline\mathbb{B}^2 ~:~ \underline{(}~x~,~y~\underline{)} ~\}
 +
\\[6pt]
 +
& = & \{~ (x, y) \in \underline\mathbb{B}^2 ~:~ x ~\operatorname{exclusive~or}~ y ~\}
 +
\\[6pt]
 +
& = & \{~ (x, y) \in \underline\mathbb{B}^2 ~:~ \operatorname{just~one~true~of}~ x, y ~\}
 +
\\[6pt]
 +
& = & \{~ (x, y) \in \underline\mathbb{B}^2 ~:~ x ~\operatorname{not~equal~to}~ y ~\}
 +
\\[6pt]
 +
& = & \{~ (x, y) \in \underline\mathbb{B}^2 ~:~ x \nLeftrightarrow y ~\}
 +
\\[6pt]
 +
& = & \{~ (x, y) \in \underline\mathbb{B}^2 ~:~ x \neq y ~\}
 +
\\[6pt]
 +
& = & \{~ (x, y) \in \underline\mathbb{B}^2 ~:~ x + y ~\}.
 +
\end{array}</math>
 +
|}
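
As a quick check on the set just described, here is a minimal Python sketch (assumed code, with ad hoc names) that recovers the fiber <math>F^{-1} (\underline{1})</math> for the exclusive disjunction and confirms that it coincides with the set of pairs on which <math>x\!</math> and <math>y\!</math> differ.

<pre>
# A minimal sketch (assumed code): the fiber of truth of the connection
# F(x, y) = (x, y), i.e. exclusive disjunction, is the set of pairs in B^2
# on which x and y differ.

def F(x, y):
    return x ^ y                      # the connection F_6

B2 = [(x, y) for x in (0, 1) for y in (0, 1)]
fiber = {(x, y) for (x, y) in B2 if F(x, y) == 1}

assert fiber == {(x, y) for (x, y) in B2 if x != y}
assert fiber == {(x, y) for (x, y) in B2 if (x + y) % 2 == 1}
print(sorted(fiber))                  # [(0, 1), (1, 0)]
</pre>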
   −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
Notice the distinction that I continue to maintain at this point between the logical values <math>\{ \operatorname{falsehood}, \operatorname{truth} \}</math> and the algebraic values <math>\{ 0, 1 \}.\!</math>  This makes it legitimate to write a sentence directly into the righthand side of a set-builder expression, for instance, weaving the sentence <math>s\!</math> or the sentence <math>^{\backprime\backprime} \, x ~\operatorname{is~not~equal~to}~ y \, ^{\prime\prime}</math> into the context <math>^{\backprime\backprime} \, \{ (x, y) \in \underline\mathbb{B}^2 : \ldots \} \, ^{\prime\prime},</math> thereby obtaining the corresponding expressions listed above.  It also allows us to assert the proposition <math>F(x, y)\!</math> in a more direct way, without detouring through the equation <math>F(x, y) = \underline{1},</math> since it already has a value in <math>\{ \operatorname{falsehood}, \operatorname{truth} \},</math> and thus can be taken as tantamount to an actual sentence.
   −
Note 2
+
If the appropriate safeguards can be kept in mind, avoiding all danger of confusing propositions with sentences and sentences with assertions, then the marks of these distinctions need not be forced to clutter the account of the more substantive indications, that is, the ones that really matter.  If this level of understanding can be achieved, then it may be possible to relax these restrictions, along with the absolute dichotomy between algebraic and logical values, which tends to inhibit the flexibility of interpretation.
   −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
This covers the properties of the connection <math>F(x, y) = \underline{(}~x~,~y~\underline{)},</math> treated as a proposition about things in the universe <math>X = \underline\mathbb{B}^2.</math>  Staying with this same connection, it is time to demonstrate how it can be "stretched" to form an operator on arbitrary propositions.
   −
We have just met with the fact that
+
To continue the exercise, let <math>p\!</math> and <math>q\!</math> be arbitrary propositions about things in the universe <math>X,\!</math> that is, maps of the form <math>p, q : X \to \underline\mathbb{B},</math> and suppose that <math>p, q\!</math> are indicator functions of the sets <math>P, Q \subseteq X,</math> respectively. In other words, we have the following data:
the differential of the "and" is
  −
the "or" of the differentials.
     −
x and y  --Diff--> dx or dy.
+
{| align="center" cellpadding="8" width="90%"
 +
|
 +
<math>\begin{matrix}
 +
p
 +
& = &
 +
\upharpoonleft P \upharpoonright
 +
& : &
 +
X \to \underline\mathbb{B}
 +
\\
 +
\\
 +
q
 +
& = &
 +
\upharpoonleft Q \upharpoonright
 +
& : &
 +
X \to \underline\mathbb{B}
 +
\\
 +
\\
 +
(p, q)
 +
& = &
 +
(\upharpoonleft P \upharpoonright, \upharpoonleft Q \upharpoonright)
 +
& : &
 +
(X \to \underline\mathbb{B})^2
 +
\\
 +
\end{matrix}</math>
 +
|}
   −
o---------------------------------------o
+
Then one has an operator <math>F^\$,</math> the stretch of the connection <math>F\!</math> over <math>X,\!</math> and a proposition <math>F^\$ (p, q),</math> the stretch of <math>F\!</math> to <math>(p, q)\!</math> on <math>X,\!</math> with the following properties:
|                                      |
  −
|                            dx  dy  |
  −
|                              o  o    |
  −
|                              \ /     |
  −
|                                o      |
  −
|      x y                      |      |
  −
|      @      --Diff-->       @      |
  −
|                                      |
  −
o---------------------------------------o
  −
|      x y      --Diff-->   ((dx)(dy))  |
  −
o---------------------------------------o
     −
It will be necessary to develop a more refined analysis of
+
{| align="center" cellpadding="8" width="90%"
this statement directly, but that is roughly the nub of it.
+
|
 +
<math>\begin{array}{ccccl}
 +
F^\$
 +
& = &
 +
\underline{(} \ldots, \ldots \underline{)}^\$
 +
& : &
 +
(X \to \underline\mathbb{B})^2 \to (X \to \underline\mathbb{B})
 +
\\
 +
\\
 +
F^\$ (p, q)
 +
& = &
 +
\underline{(}~p~,~q~\underline{)}^\$
 +
& : &
 +
X \to \underline\mathbb{B}
 +
\\
 +
\end{array}</math>
 +
|}
   −
If the form of the above statement reminds you of DeMorgan's rule,
+
As a result, the application of the proposition <math>F^\$ (p, q)</math> to each <math>x \in X</math> returns a logical value in <math>\underline\mathbb{B},</math> all in accord with the following equations:
it is no accident, as differentiation and negation turn out to be
  −
closely related operations.  Indeed, one can find discussions of
  −
logical difference calculus in the Boole-DeMorgan correspondence
  −
and Peirce also made use of differential operators in a logical
  −
context, but the exploration of these ideas has been hampered
  −
by a number of factors, not the least of which being a syntax
  −
adequate to handle the complexity of expressions that evolve.
     −
For my part, it was definitely a case of the calculus being smarter
+
{| align="center" cellpadding="8" width="90%"
than the calculator thereof.  The graphical pictures were catalytic
+
|
in their power over my thinking process, leading me so quickly past
+
<math>\begin{matrix}
so many obstructions that I did not have time to think about all of
+
F^\$ (p, q)(x) & = & \underline{(}~p~,~q~\underline{)}^\$ (x) & \in & \underline\mathbb{B}
the difficulties that would otherwise have inhibited the derivation.
+
\\
It did eventually become necessary to write all this up in a linear
+
\\
script, and to deal with the various problems of interpretation and
+
\Updownarrow  &  & \Updownarrow
justification that I could imagine, but that took another 120 pages,
+
\\
and so, if you don't like this intuitive approach, then let that be
+
\\
your sufficient notice.
+
F(p(x), q(x))  & = & \underline{(}~p(x)~,~q(x)~\underline{)}  & \in & \underline\mathbb{B}
 +
\\
 +
\end{matrix}</math>
 +
|}
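
A minimal Python sketch of the stretch operation follows, under the assumption that propositions are modeled as functions from a finite universe to <math>\{ 0, 1 \};</math> the names <code>stretch</code>, <code>p</code>, and <code>q</code> are illustrative and not part of the text.

<pre>
# A minimal sketch (assumed code): the stretch of a connection F : B^2 -> B
# to a pair of propositions p, q : X -> B acts pointwise on the universe X,
# so that F$(p, q)(x) = F(p(x), q(x)).

def stretch(F):
    def stretched(p, q):
        return lambda x: F(p(x), q(x))
    return stretched

X = range(6)                          # a small illustrative universe
p = lambda x: int(x % 2 == 0)         # indicator of the even elements of X
q = lambda x: int(x < 3)              # indicator of {0, 1, 2}

h = stretch(lambda u, v: u ^ v)(p, q) # F = exclusive disjunction, h = F$(p, q)
print([h(x) for x in X])              # [0, 1, 0, 0, 1, 0]
</pre>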
   −
Let us run through the initial example again, this time attempting
+
For each choice of propositions <math>p\!</math> and <math>q\!</math> about things in <math>X,\!</math> the stretch of <math>F\!</math> to <math>p\!</math> and <math>q\!</math> on <math>X\!</math> is just another proposition about things in <math>X,\!</math> a simple proposition in its own right, no matter how complex its current expression or its present construction as <math>F^\$ (p, q) = \underline{(}~p~,~q~\underline{)}^\$</math> makes it appear in relation to <math>p\!</math> and <math>q.\!</math>  Like any other proposition about things in <math>X,\!</math> it indicates a subset of <math>X,\!</math> namely, the fiber that is variously described in the following ways:
to interpret the formulas that develop at each stage along the way.
     −
We begin with a proposition or a boolean function f(x, y) = xy.
+
{| align="center" cellpadding="8" width="90%"
 +
|
 +
<math>\begin{array}{lll}
 +
[| F^\$ (p, q) |]
 +
& = & [| \underline{(}~p~,~q~\underline{)}^\$ |]
 +
\\[6pt]
 +
& = & (F^\$ (p, q))^{-1} (\underline{1})
 +
\\[6pt]
 +
& = & \{~ x \in X ~:~ F^\$ (p, q)(x) ~\}
 +
\\[6pt]
 +
& = & \{~ x \in X ~:~ \underline{(}~p~,~q~\underline{)}^\$ (x) ~\}
 +
\\[6pt]
 +
& = & \{~ x \in X ~:~ \underline{(}~p(x)~,~q(x)~\underline{)} ~\}
 +
\\[6pt]
 +
& = & \{~ x \in X ~:~ p(x) + q(x) ~\}
 +
\\[6pt]
 +
& = & \{~ x \in X ~:~ p(x) \neq q(x) ~\}
 +
\\[6pt]
 +
& = & \{~ x \in X ~:~ \upharpoonleft P \upharpoonright (x) ~\neq~ \upharpoonleft Q \upharpoonright (x) ~\}
 +
\\[6pt]
 +
& = & \{~ x \in X ~:~ x \in P ~\nLeftrightarrow~ x \in Q ~\}
 +
\\[6pt]
 +
& = & \{~ x \in X ~:~ x \in P\!-\!Q ~\operatorname{or}~ x \in Q\!-\!P ~\}
 +
\\[6pt]
 +
& = & \{~ x \in X ~:~ x \in P\!-\!Q ~\cup~ Q\!-\!P ~\}
 +
\\[6pt]
 +
& = & \{~ x \in X ~:~ x \in P + Q ~\}
 +
\\[6pt]
 +
& = & P + Q ~\subseteq~ X
 +
\\[6pt]
 +
& = & [|p|] + [|q|] ~\subseteq~ X
 +
\end{array}</math>
 +
|}
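
The last few lines of the display can be verified directly.  The following Python sketch, with assumed example data, checks that the fiber of the stretched proposition is the symmetric difference <math>P + Q.\!</math>

<pre>
# A minimal sketch (assumed example data): the fiber of the stretched
# proposition F$(p, q), with F the exclusive disjunction, is the symmetric
# difference P + Q of the sets indicated by p and q.

X = set(range(10))
P = {0, 1, 2, 3}
Q = {2, 3, 4, 5}

p = lambda x: int(x in P)             # p = indicator of P
q = lambda x: int(x in Q)             # q = indicator of Q
h = lambda x: p(x) ^ q(x)             # h = F$(p, q), computed pointwise

fiber = {x for x in X if h(x) == 1}
assert fiber == P.symmetric_difference(Q)
print(sorted(fiber))                  # [0, 1, 4, 5]
</pre>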
   −
o---------------------------------------o
+
==References==
|                                      |
  −
|                o    o                |
  −
|              / \  / \              |
  −
|              /  \ /  \              |
  −
|            /    ·    \            |
  −
|            /    /`\    \            |
  −
|          /    /```\    \          |
  −
|          /    /`````\    \          |
  −
|        /    /```````\    \        |
  −
|        /    /`````````\    \        |
  −
|      o  x  o`````f`````o  y  o      |
  −
|        \    \`````````/    /        |
  −
|        \    \```````/    /        |
  −
|          \    \`````/    /          |
  −
|          \    \```/    /          |
  −
|            \    \`/    /            |
  −
|            \    ·    /            |
  −
|              \  / \  /              |
  −
|              \ /  \ /              |
  −
|                o    o                |
  −
|                                      |
  −
o---------------------------------------o
  −
|                                      |
  −
|                  x y                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| f =             x y                  |
  −
o---------------------------------------o
     −
A function like this has an abstract type and a concrete type.
+
* Bernstein, Herbert J. (1987), "Idols of Modern Science and The Reconstruction of Knowledge", pp. 37&ndash;68 in Marcus G. Raskin and Herbert J. Bernstein, ''New Ways of Knowing : The Sciences, Society, and Reconstructive Knowledge'', Rowman and Littlefield, Totowa, NJ, 1987.
The abstract type is what we invoke when we write things like
  −
f : B x B -> B or f : B^2 -> B.  The concrete type takes into
  −
account the qualitative dimensions or the "units" of the case,
  −
which can be explained as follows.
     −
1. Let X be the set of values {(x), x} = {not x, x}.
+
* Denning, P.J., Dennis, J.B., and Qualitz, J.E. (1978), ''Machines, Languages, and Computation'', Prentice-Hall, Englewood Cliffs, NJ.
   −
2. Let Y be the set of values {(y), y} = {not y, y}.
+
* Nietzsche, Friedrich, ''Beyond Good and Evil : Prelude to a Philosophy of the Future'', R.J. Hollingdale (trans.), Michael Tanner (intro.), Penguin Books, London, UK, 1973, 1990.
   −
Then interpret the usual propositions about x, y
+
* Raskin, Marcus G., and Bernstein, Herbert J. (1987, eds.), ''New Ways of Knowing : The Sciences, Society, and Reconstructive Knowledge'', Rowman and Littlefield, Totowa, NJ.
as functions of the concrete type f : X x Y -> B.
     −
We are going to consider various "operators" on these functions.
+
==Document History==
Here, an operator F is a function that takes one function f into
  −
another function Ff.
     −
The first couple of operators that we need to consider are logical analogues
+
===The Cactus Patch===
of those that occur in the classical "finite difference calculus", namely:
     −
1. The "difference" operator [capital Delta], written here as D.
+
<pre>
 
+
| Subject: Inquiry Driven Systems : An Inquiry Into Inquiry
2.  The "enlargement" operator [capital Epsilon], written here as E.
+
| Contact:  Jon Awbrey
 
+
| Version:  Draft 8.70
These days, E is more often called the "shift" operator.
+
| Created: 23 Jun 1996
 
+
| Revised:  06 Jan 2002
In order to describe the universe in which these operators operate,
+
| Advisor:  M.A. Zohdy
it will be necessary to enlarge our original universe of discourse.
+
| Setting:  Oakland University, Rochester, Michigan, USA
We mount up from the space U = X x Y to its "differential extension",
+
| Excerpt:  Section 1.3.10 (Recurring Themes)
EU = U x dU = X x Y x dX x dY, with dX = {(dx), dx} and dY = {(dy), dy}.
+
| Excerpt:  Subsections 1.3.10.8 - 1.3.10.13
The interpretations of these new symbols can be diverse, but the easiest
+
</pre>
for now is just to say that dx means "change x" and dy means "change y".
  −
To draw the differential extension EU of our present universe U = X x Y
  −
as a venn diagram, it would take us four logical dimensions X, Y, dX, dY,
  −
but we can project a suggestion of what it's about on the universe X x Y
  −
by drawing arrows that cross designated borders, labeling the arrows as
  −
dx when crossing the border between x and (x) and as dy when crossing
  −
the border between y and (y), in either direction, in either case.
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|                o    o                |
  −
|              / \  / \              |
  −
|              /  \ /  \              |
  −
|            /    ·    \            |
  −
|            / dy  /`\  dx \            |
  −
|          /  ^ /```\ ^  \          |
  −
|          /    \`````/    \          |
  −
|        /    /`\```/`\    \        |
  −
|        /    /```\`/```\    \        |
  −
|      o  x  o`````o`````o  y  o      |
  −
|        \    \`````````/    /        |
  −
|        \    \```````/    /        |
  −
|          \    \`````/    /          |
  −
|          \    \```/    /          |
  −
|            \    \`/    /            |
  −
|            \    ·    /            |
  −
|              \  / \  /              |
  −
|              \ /  \ /              |
  −
|                o    o                |
  −
|                                      |
  −
o---------------------------------------o
  −
 
  −
We can form propositions from these differential variables in the same way
  −
that we would any other logical variables, for instance, interpreting the
  −
proposition (dx (dy)) to say "dx => dy", in other words, however you wish
  −
to take it, whether indicatively or injunctively, as saying something to
  −
the effect that there is "no change in x without a change in y".
  −
 
  −
Given the proposition f(x, y) in U = X x Y,
  −
the (first order) 'enlargement' of f is the
  −
proposition Ef in EU that is defined by the
  −
formula Ef(x, y, dx, dy) = f(x + dx, y + dy).
  −
 
  −
In the example f(x, y) = xy, we obtain:
  −
 
  −
Ef(x, y, dx, dy)  =  (x + dx)(y + dy).
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|              x  dx y  dy              |
  −
|              o---o o---o              |
  −
|              \  | |  /              |
  −
|                \ | | /                |
  −
|                \| |/                |
  −
|                  @=@                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Ef =      (x, dx) (y, dy)            |
  −
o---------------------------------------o
  −
 
  −
Given the proposition f(x, y) in U = X x Y,
  −
the (first order) 'difference' of f is the
  −
proposition Df in EU that is defined by the
  −
formula Df = Ef - f, or, written out in full,
  −
Df(x, y, dx, dy) = f(x + dx, y + dy) - f(x, y).
  −
 
  −
In the example f(x, y) = xy, the result is:
  −
 
  −
Df(x, y, dx, dy)  =  (x + dx)(y + dy) - xy.
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|        x  dx y  dy                    |
  −
|        o---o o---o                    |
  −
|        \  | |  /                    |
  −
|          \ | | /                      |
  −
|          \| |/        x y          |
  −
|            o=o-----------o            |
  −
|            \          /            |
  −
|              \        /              |
  −
|              \      /              |
  −
|                \    /                |
  −
|                \  /                |
  −
|                  \ /                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Df =      ((x, dx)(y, dy), xy)      |
  −
o---------------------------------------o
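
A minimal Python sketch, not part of the original notes, tabulates Df over the extended universe EU, treating addition and subtraction in B as exclusive-or:

<pre>
# A minimal sketch (assumed code): tabulate the difference proposition Df for
# f(x, y) = x y over the extended universe EU, where both addition and
# subtraction in B are exclusive-or.

def f(x, y):
    return x & y

def Ef(x, y, dx, dy):
    return f(x ^ dx, y ^ dy)

def Df(x, y, dx, dy):
    return Ef(x, y, dx, dy) ^ f(x, y)

for x in (0, 1):
    for y in (0, 1):
        row = [Df(x, y, dx, dy) for dx in (0, 1) for dy in (0, 1)]
        print((x, y), row)            # values at (dx, dy) = 00, 01, 10, 11
</pre>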
  −
 
  −
We did not yet go through the trouble to interpret this (first order)
  −
"difference of conjunction" fully, but were happy simply to evaluate
  −
it with respect to a single location in the universe of discourse,
  −
namely, at the point picked out by the singular proposition xy,
  −
in as much as if to say, at the place where x = 1 and y = 1.
  −
This evaluation is written in the form Df|xy or Df|<1, 1>,
  −
and we arrived at the locally applicable law that states
  −
that f = xy = x & y  =>  Df|xy = ((dx)(dy)) = dx or dy.
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|                dx dy                |
  −
|                  ^                  |
  −
|                o  |  o                |
  −
|              / \ | / \              |
  −
|              /  \|/  \              |
  −
|            /dy  |  dx\            |
  −
|            /(dx) /|\ (dy)\            |
  −
|          /  ^ /`|`\ ^  \          |
  −
|          /    \``|``/    \          |
  −
|        /    /`\`|`/`\    \        |
  −
|        /    /```\|/```\    \        |
  −
|      o  x  o`````o`````o  y  o      |
  −
|        \    \`````````/    /        |
  −
|        \    \```````/    /        |
  −
|          \    \`````/    /          |
  −
|          \    \```/    /          |
  −
|            \    \`/    /            |
  −
|            \    ·    /            |
  −
|              \  / \  /              |
  −
|              \ /  \ /              |
  −
|                o    o                |
  −
|                                      |
  −
o---------------------------------------o
  −
|                                      |
  −
|                dx  dy                |
  −
|                o  o                |
  −
|                  \ /                  |
  −
|                  o                  |
  −
|                  |                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Df|xy =      ((dx)(dy))              |
  −
o---------------------------------------o
  −
 
  −
The picture illustrates the analysis of the inclusive disjunction ((dx)(dy))
  −
into the exclusive disjunction:  dx(dy) + dy(dx) + dx dy, a proposition that
  −
may be interpreted to say "change x or change y or both".  And this can be
  −
recognized as just what you need to do if you happen to find yourself in
  −
the center cell and desire a detailed description of ways to depart it.
  −
 
  −
Jon Awbrey --
  −
 
  −
Formerly Of:
  −
Center Cell,
  −
Chateau Dif.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Note 3
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Last time we computed what will variously be called
  −
the "difference map", the "difference proposition",
  −
or the "local proposition" Df_p for the proposition
  −
f(x, y) = xy at the point p where x = 1 and y = 1.
  −
 
  −
In the universe U = X x Y, the four propositions
  −
xy, x(y), (x)y, (x)(y) that indicate the "cells",
  −
or the smallest regions of the venn diagram, are
  −
called "singular propositions".  These serve as
  −
an alternative notation for naming the points
  −
<1, 1>, <1, 0>, <0, 1>, <0, 0>, respectively.
  −
 
  −
Thus, we can write Df_p = Df|p = Df|<1, 1> = Df|xy,
  −
so long as we know the frame of reference in force.
  −
 
  −
Sticking with the example f(x, y) = xy, let us compute the
  −
value of the difference proposition Df at all of the points.
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|        x  dx y  dy                    |
  −
|        o---o o---o                    |
  −
|        \  | |  /                    |
  −
|          \ | | /                      |
  −
|          \| |/        x y          |
  −
|            o=o-----------o            |
  −
|            \          /            |
  −
|              \        /              |
  −
|              \      /              |
  −
|                \    /                |
  −
|                \  /                |
  −
|                  \ /                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Df =      ((x, dx)(y, dy), xy)        |
  −
o---------------------------------------o
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|          dx    dy                    |
  −
|        o---o o---o                    |
  −
|        \  | |  /                    |
  −
|          \ | | /                      |
  −
|          \| |/                      |
  −
|            o=o-----------o            |
  −
|            \          /            |
  −
|              \        /              |
  −
|              \      /              |
  −
|                \    /                |
  −
|                \  /                |
  −
|                  \ /                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Df|xy =      ((dx)(dy))              |
  −
o---------------------------------------o
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|              o                        |
  −
|          dx |  dy                    |
  −
|        o---o o---o                    |
  −
|        \  | |  /                    |
  −
|          \ | | /        o            |
  −
|          \| |/          |            |
  −
|            o=o-----------o            |
  −
|            \          /            |
  −
|              \        /              |
  −
|              \      /              |
  −
|                \    /                |
  −
|                \  /                |
  −
|                  \ /                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Df|x(y) =      (dx) dy                |
  −
o---------------------------------------o
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|        o                              |
  −
|        |  dx    dy                    |
  −
|        o---o o---o                    |
  −
|        \  | |  /                    |
  −
|          \ | | /        o            |
  −
|          \| |/          |            |
  −
|            o=o-----------o            |
  −
|            \          /            |
  −
|              \        /              |
  −
|              \      /              |
  −
|                \    /                |
  −
|                \  /                |
  −
|                  \ /                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Df|(x)y =      dx (dy)                |
  −
o---------------------------------------o
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|        o    o                        |
  −
|        |  dx |  dy                    |
  −
|        o---o o---o                    |
  −
|        \  | |  /                    |
  −
|          \ | | /      o  o          |
  −
|          \| |/        \ /          |
  −
|            o=o-----------o            |
  −
|            \          /            |
  −
|              \        /              |
  −
|              \      /              |
  −
|                \    /                |
  −
|                \  /                |
  −
|                  \ /                  |
  −
|                  @                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Df|(x)(y) =    dx dy                |
  −
o---------------------------------------o
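
The four local propositions just pictured can be checked mechanically.  A minimal Python sketch (assumed code) restricts Df to each cell and compares with the forms read off above:

<pre>
# A minimal sketch (assumed code): restrict Df for f(x, y) = x y to each cell
# of the universe and confirm the four local propositions pictured above.

def Df(x, y, dx, dy):
    return ((x ^ dx) & (y ^ dy)) ^ (x & y)

pairs = [(dx, dy) for dx in (0, 1) for dy in (0, 1)]

assert all(Df(1, 1, dx, dy) == (dx | dy)       for dx, dy in pairs)  # ((dx)(dy))
assert all(Df(1, 0, dx, dy) == (1 ^ dx) & dy   for dx, dy in pairs)  # (dx) dy
assert all(Df(0, 1, dx, dy) == dx & (1 ^ dy)   for dx, dy in pairs)  #  dx (dy)
assert all(Df(0, 0, dx, dy) == dx & dy         for dx, dy in pairs)  #  dx  dy
</pre>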
  −
 
  −
The easy way to visualize the values of these graphical
  −
expressions is just to notice the following equivalents:
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|  x                                    |
  −
|  o-o-o-...-o-o-o                      |
  −
|  \          /                      |
  −
|    \        /                        |
  −
|    \      /                        |
  −
|      \    /                x        |
  −
|      \  /                o        |
  −
|        \ /                  |        |
  −
|        @        =        @        |
  −
|                                      |
  −
o---------------------------------------o
  −
|  (x, , ... , , )  =        (x)        |
  −
o---------------------------------------o
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|                o                      |
  −
| x_1 x_2  x_k  |                      |
  −
|  o---o-...-o---o                      |
  −
|  \          /                      |
  −
|    \        /                        |
  −
|    \      /                        |
  −
|      \    /                          |
  −
|      \  /                          |
  −
|        \ /            x_1 ... x_k    |
  −
|        @        =        @        |
  −
|                                      |
  −
o---------------------------------------o
  −
| (x_1, ..., x_k, ()) = x_1 · ... · x_k |
  −
o---------------------------------------o
  −
 
  −
Laying out the arrows on the augmented venn diagram,
  −
one gets a picture of a "differential vector field".
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|                dx dy                |
  −
|                  ^                  |
  −
|                o  |  o                |
  −
|              / \ | / \              |
  −
|              /  \|/  \              |
  −
|            /dy  |  dx\            |
  −
|            /(dx) /|\ (dy)\            |
  −
|          /  ^ /`|`\ ^  \          |
  −
|          /    \``|``/    \          |
  −
|        /    /`\`|`/`\    \        |
  −
|        /    /```\|/```\    \        |
  −
|      o  x  o`````o`````o  y  o      |
  −
|        \    \`````````/    /        |
  −
|        \  o---->```<----o  /        |
  −
|          \  dy \``^``/ dx  /          |
  −
|          \(dx) \`|`/ (dy)/          |
  −
|            \    \|/    /            |
  −
|            \    |    /            |
  −
|              \  /|\  /              |
  −
|              \ / | \ /              |
  −
|                o  |  o                |
  −
|                  |                  |
  −
|                dx | dy                |
  −
|                  o                  |
  −
|                                      |
  −
o---------------------------------------o
  −
 
  −
This really just constitutes a depiction of
  −
the interpretations in EU = X x Y x dX x dY
  −
that satisfy the difference proposition Df,
  −
namely, these:
  −
 
  −
1.  x  y  dx  dy
  −
2.  x  y  dx (dy)
  −
3.  x  y (dx) dy
  −
4.  x (y)(dx) dy
  −
5.  (x) y  dx (dy)
  −
6.  (x)(y) dx  dy
  −
 
  −
By inspection, it is fairly easy to understand Df
  −
as telling you what you have to do from each point
  −
of U in order to change the value borne by f(x, y).
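
A minimal Python sketch (assumed code) enumerates the satisfying interpretations of Df directly; it returns exactly the six interpretations listed above:

<pre>
# A minimal sketch (assumed code): enumerate the interpretations in the
# extended universe EU = X x Y x dX x dY that satisfy Df for f(x, y) = x y.

def Df(x, y, dx, dy):
    return ((x ^ dx) & (y ^ dy)) ^ (x & y)

models = [(x, y, dx, dy)
          for x in (0, 1) for y in (0, 1)
          for dx in (0, 1) for dy in (0, 1)
          if Df(x, y, dx, dy) == 1]

print(models)
# [(0, 0, 1, 1), (0, 1, 1, 0), (1, 0, 0, 1),
#  (1, 1, 0, 1), (1, 1, 1, 0), (1, 1, 1, 1)]
</pre>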
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Note 4
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
We have been studying the action of the difference operator D,
  −
also known as the "localization operator", on the proposition
  −
f : X x Y -> B that is commonly known as the conjunction x·y.
  −
We described Df as a (first order) differential proposition,
  −
that is, a proposition of the type Df : X x Y x dX x dY -> B.
  −
Abstracting from the augmented venn diagram that illustrates
  −
how the "models", or the "satisfying interpretations", of Df
  −
distribute within the extended universe EU = X x Y x dX x dY,
  −
we can depict Df in the form of a "digraph" or directed graph,
  −
one whose points are labeled with the elements of  U =  X x Y
  −
and whose arrows are labeled with the elements of dU = dX x dY.
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|                x · y                |
  −
|                                      |
  −
|                  o                  |
  −
|                  ^^^                  |
  −
|                / | \                |
  −
|      (dx)· dy  /  |  \  dx ·(dy)      |
  −
|              /  |  \              |
  −
|              /    |    \              |
  −
|            v    |    v            |
  −
|  x ·(y)  o      |      o  (x)· y  |
  −
|                  |                  |
  −
|                  |                  |
  −
|                dx · dy                |
  −
|                  |                  |
  −
|                  |                  |
  −
|                  v                  |
  −
|                  o                  |
  −
|                                      |
  −
|                (x)·(y)                |
  −
|                                      |
  −
o---------------------------------------o
  −
|                                      |
  −
|  f    =    x  y                      |
  −
|                                      |
  −
| Df    =    x  y  · ((dx)(dy))        |
  −
|                                      |
  −
|      +    x (y) ·  (dx) dy          |
  −
|                                      |
  −
|      +    (x) y  ·  dx (dy)        |
  −
|                                      |
  −
|      +    (x)(y) ·  dx  dy          |
  −
|                                      |
  −
o---------------------------------------o
  −
 
  −
Any proposition worth its salt, as they say,
  −
has many equivalent ways to look at it, any
  −
of which may reveal some unsuspected aspect
  −
of its meaning.  We will encounter more and
  −
more of these alternative readings as we go.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Note 5
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
The enlargement operator E, also known as the "shift operator",
  −
has many interesting and very useful properties in its own right,
  −
so let us not fail to observe a few of the more salient features
  −
that play out on the surface of our simple example, f(x, y) = xy.
  −
 
  −
Introduce a suitably generic definition of the extended universe of discourse:
  −
 
  −
Let U = X_1 x ... x X_k and EU = U x dU = X_1 x ... x X_k x dX_1 x ... x dX_k.
  −
 
  −
For a proposition f : X_1 x ... x X_k -> B,
  −
the (first order) 'enlargement' of f is the
  −
proposition Ef : EU -> B that is defined by:
  −
 
  −
Ef(x_1, ..., x_k, dx_1, ..., dx_k)  =  f(x_1 + dx_1, ..., x_k + dx_k).
  −
 
  −
It should be noted that the so-called "differential variables" dx_j
  −
are really just the same kind of boolean variables as the other x_j.
  −
It is conventional to give the additional variables these brands of
  −
inflected names, but whatever extra connotations we might choose to
  −
attach to these syntactic conveniences are wholly external to their
  −
purely algebraic meanings.
  −
 
  −
For the example f(x, y) = xy, we obtain:
  −
 
  −
Ef(x, y, dx, dy)  =  (x + dx)(y + dy).
  −
 
  −
Given that this expression uses nothing more than the "boolean ring"
  −
operations of addition (+) and multiplication (·), it is permissible
  −
to "multiply things out" in the usual manner to arrive at the result:
  −
 
  −
Ef(x, y, dx, dy)  =  x·y  +  x·dy  +  y·dx  +  dx·dy.
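
A minimal Python sketch, not part of the original notes, confirms this ring expansion point by point over the extended universe:

<pre>
# A minimal sketch (assumed code): confirm the ring expansion
# Ef = xy + x dy + y dx + dx dy, with addition taken mod 2.

def f(x, y):
    return x & y

def Ef(x, y, dx, dy):
    return f(x ^ dx, y ^ dy)

for x in (0, 1):
    for y in (0, 1):
        for dx in (0, 1):
            for dy in (0, 1):
                expanded = (x*y + x*dy + y*dx + dx*dy) % 2
                assert Ef(x, y, dx, dy) == expanded
</pre>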
  −
 
  −
To understand what this means in logical terms, for instance, as expressed
  −
in a boolean expansion or a "disjunctive normal form" (DNF), it is perhaps
  −
a little better to go back and analyze the expression the same way that we
  −
did for Df.  Thus, let us compute the value of the enlarged proposition Ef
  −
at each of the points in the universe of discourse U = X x Y.
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|              x  dx y  dy              |
  −
|              o---o o---o              |
  −
|              \  | |  /              |
  −
|                \ | | /                |
  −
|                \| |/                |
  −
|                  @=@                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Ef =      (x, dx)·(y, dy)            |
  −
o---------------------------------------o
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|                dx    dy              |
  −
|              o---o o---o              |
  −
|              \  | |  /              |
  −
|                \ | | /                |
  −
|                \| |/                |
  −
|                  @=@                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Ef|xy =      (dx)·(dy)              |
  −
o---------------------------------------o
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|                    o                  |
  −
|                dx |  dy              |
  −
|              o---o o---o              |
  −
|              \  | |  /              |
  −
|                \ | | /                |
  −
|                \| |/                |
  −
|                  @=@                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Ef|x(y) =    (dx)· dy                |
  −
o---------------------------------------o
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|              o                        |
  −
|              |  dx    dy              |
  −
|              o---o o---o              |
  −
|              \  | |  /              |
  −
|                \ | | /                |
  −
|                \| |/                |
  −
|                  @=@                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Ef|(x)y =      dx ·(dy)              |
  −
o---------------------------------------o
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|              o    o                  |
  −
|              |  dx |  dy              |
  −
|              o---o o---o              |
  −
|              \  | |  /              |
  −
|                \ | | /                |
  −
|                \| |/                |
  −
|                  @=@                  |
  −
|                                      |
  −
o---------------------------------------o
  −
| Ef|(x)(y) =    dx · dy                |
  −
o---------------------------------------o
  −
 
  −
Given the sort of data that arises from this form of analysis,
  −
we can now fold the disjoined ingredients back into a boolean
  −
expansion or a DNF that is equivalent to the proposition Ef.
  −
 
  −
Ef  =  xy · Ef_xy  +  x(y) · Ef_x(y)  +  (x)y · Ef_(x)y  +  (x)(y) · Ef_(x)(y).
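
A minimal Python sketch (assumed code) rebuilds Ef from its four cofactors and checks the boolean expansion just stated:

<pre>
# A minimal sketch (assumed code): rebuild Ef from its cofactors and check
# the DNF expansion Ef = xy.Ef_xy + x(y).Ef_x(y) + (x)y.Ef_(x)y + (x)(y).Ef_(x)(y).

def Ef(x, y, dx, dy):
    return (x ^ dx) & (y ^ dy)

def Ef_dnf(x, y, dx, dy):
    total = 0
    for a in (0, 1):
        for b in (0, 1):
            # the singular proposition for cell (a, b) times the cofactor Ef|ab
            total ^= int(x == a and y == b) & Ef(a, b, dx, dy)
    return total

assert all(Ef(x, y, dx, dy) == Ef_dnf(x, y, dx, dy)
           for x in (0, 1) for y in (0, 1)
           for dx in (0, 1) for dy in (0, 1))
</pre>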
  −
 
  −
Here is a summary of the result, illustrated by means of a digraph picture,
  −
where the "no change" element (dx)(dy) is drawn as a loop at the point x·y.
  −
 
  −
o---------------------------------------o
  −
|                                      |
  −
|                x · y                |
  −
|              (dx)·(dy)              |
  −
|                -->--                |
  −
|                \  /                |
  −
|                  \ /                  |
  −
|                  o                  |
  −
|                  ^^^                  |
  −
|                / | \                |
  −
|                /  |  \                |
  −
|    (dx)· dy  /  |  \  dx ·(dy)    |
  −
|              /    |    \              |
  −
|            /    |    \            |
  −
|  x ·(y)  o      |      o  (x)· y  |
  −
|                  |                  |
  −
|                  |                  |
  −
|                dx · dy                |
  −
|                  |                  |
  −
|                  |                  |
  −
|                  o                  |
  −
|                                      |
  −
|                (x)·(y)                |
  −
|                                      |
  −
o---------------------------------------o
  −
|                                      |
  −
|  f    =    x  y                      |
  −
|                                      |
  −
| Ef    =    x  y  · (dx)(dy)          |
  −
|                                      |
  −
|      +    x (y) · (dx) dy          |
  −
|                                      |
  −
|      +    (x) y  ·  dx (dy)          |
  −
|                                      |
  −
|      +    (x)(y) ·  dx  dy          |
  −
|                                      |
  −
o---------------------------------------o
  −
 
  −
We may understand the enlarged proposition Ef
  −
as telling us all the different ways to reach
  −
a model of f from any point of the universe U.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Note 6
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
To broaden our experience with simple examples, let us now contemplate the
  −
sixteen functions of concrete type X x Y -> B and abstract type B x B -> B.
  −
For future reference, I will set here a few tables that detail the actions
  −
of E and D and on each of these functions, allowing us to view the results
  −
in several different ways.
  −
 
  −
By way of initial orientation, Table 0 lists equivalent expressions for the
  −
sixteen functions in a number of different languages for zeroth order logic.
  −
 
  −
 
  −
Table 0.  Propositional Forms On Two Variables
o---------o---------o---------o----------o------------------o----------o
| L_1     | L_2     | L_3     | L_4      | L_5              | L_6      |
| Decimal | Binary  | Vector  | Cactus   | English          | Vulgate  |
o---------o---------o---------o----------o------------------o----------o
|         |       x = 1 1 0 0 |          |                  |          |
|         |       y = 1 0 1 0 |          |                  |          |
o---------o---------o---------o----------o------------------o----------o
| f_0     | f_0000  | 0 0 0 0 |    ()    | false            |    0     |
| f_1     | f_0001  | 0 0 0 1 |  (x)(y)  | neither x nor y  | ~x & ~y  |
| f_2     | f_0010  | 0 0 1 0 |  (x) y   | y and not x      | ~x &  y  |
| f_3     | f_0011  | 0 0 1 1 |  (x)     | not x            | ~x       |
| f_4     | f_0100  | 0 1 0 0 |   x (y)  | x and not y      |  x & ~y  |
| f_5     | f_0101  | 0 1 0 1 |     (y)  | not y            |      ~y  |
| f_6     | f_0110  | 0 1 1 0 |  (x, y)  | x not equal to y |  x +  y  |
| f_7     | f_0111  | 0 1 1 1 |  (x  y)  | not both x and y | ~x v ~y  |
| f_8     | f_1000  | 1 0 0 0 |   x  y   | x and y          |  x &  y  |
| f_9     | f_1001  | 1 0 0 1 | ((x, y)) | x equal to y     |  x =  y  |
| f_10    | f_1010  | 1 0 1 0 |      y   | y                |      y   |
| f_11    | f_1011  | 1 0 1 1 | (x (y))  | not x without y  |  x => y  |
| f_12    | f_1100  | 1 1 0 0 |  x       | x                |  x       |
| f_13    | f_1101  | 1 1 0 1 | ((x) y)  | not y without x  |  x <= y  |
| f_14    | f_1110  | 1 1 1 0 | ((x)(y)) | x or y           |  x v  y  |
| f_15    | f_1111  | 1 1 1 1 |   (())   | true             |    1     |
o---------o---------o---------o----------o------------------o----------o
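
By way of a cross-check, the sixteen truth vectors in Table 0 can be
generated mechanically.  A minimal Python sketch (added here, on the
assumption that f_i is indexed by its truth vector on the cells
xy, x(y), (x)y, (x)(y), in that order):

  cells = [(1, 1), (1, 0), (0, 1), (0, 0)]

  def f(i):
      bits = dict(zip(cells, (int(b) for b in format(i, "04b"))))
      return lambda x, y: bits[(x, y)]

  for i in range(16):
      vector = " ".join(str(f(i)(x, y)) for x, y in cells)
      print(f"f_{i:<2}  f_{i:04b}  {vector}")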

The next four Tables expand the expressions of Ef and Df
in two different ways, for each of the sixteen functions.
Notice that the functions are given in a different order,
here being collected into a set of seven natural classes.

Table 1.  Ef Expanded Over Ordinary Features {x, y}
o------o------------o------------o------------o------------o------------o
|      |     f      |  Ef | xy   | Ef | x(y)  | Ef | (x)y  | Ef | (x)(y)|
o------o------------o------------o------------o------------o------------o
| f_0  |     ()     |     ()     |     ()     |     ()     |     ()     |
o------o------------o------------o------------o------------o------------o
| f_1  |   (x)(y)   |   dx  dy   |   dx (dy)  |  (dx) dy   |  (dx)(dy)  |
| f_2  |   (x) y    |   dx (dy)  |   dx  dy   |  (dx)(dy)  |  (dx) dy   |
| f_4  |    x (y)   |  (dx) dy   |  (dx)(dy)  |   dx  dy   |   dx (dy)  |
| f_8  |    x  y    |  (dx)(dy)  |  (dx) dy   |   dx (dy)  |   dx  dy   |
o------o------------o------------o------------o------------o------------o
| f_3  |   (x)      |   dx       |   dx       |  (dx)      |  (dx)      |
| f_12 |    x       |  (dx)      |  (dx)      |   dx       |   dx       |
o------o------------o------------o------------o------------o------------o
| f_6  |   (x, y)   |  (dx, dy)  | ((dx, dy)) | ((dx, dy)) |  (dx, dy)  |
| f_9  |  ((x, y))  | ((dx, dy)) |  (dx, dy)  |  (dx, dy)  | ((dx, dy)) |
o------o------------o------------o------------o------------o------------o
| f_5  |      (y)   |      dy    |      (dy)  |      dy    |      (dy)  |
| f_10 |       y    |      (dy)  |      dy    |      (dy)  |      dy    |
o------o------------o------------o------------o------------o------------o
| f_7  |   (x  y)   | ((dx)(dy)) | ((dx) dy)  |  (dx (dy)) |  (dx  dy)  |
| f_11 |  (x (y))   | ((dx) dy)  | ((dx)(dy)) |  (dx  dy)  |  (dx (dy)) |
| f_13 |  ((x) y)   |  (dx (dy)) |  (dx  dy)  | ((dx)(dy)) | ((dx) dy)  |
| f_14 |  ((x)(y))  |  (dx  dy)  |  (dx (dy)) | ((dx) dy)  | ((dx)(dy)) |
o------o------------o------------o------------o------------o------------o
| f_15 |    (())    |    (())    |    (())    |    (())    |    (())    |
o------o------------o------------o------------o------------o------------o

Table 2.  Df Expanded Over Ordinary Features {x, y}
o------o------------o------------o------------o------------o------------o
|      |     f      |  Df | xy   | Df | x(y)  | Df | (x)y  | Df | (x)(y)|
o------o------------o------------o------------o------------o------------o
| f_0  |     ()     |     ()     |     ()     |     ()     |     ()     |
o------o------------o------------o------------o------------o------------o
| f_1  |   (x)(y)   |   dx  dy   |   dx (dy)  |  (dx) dy   | ((dx)(dy)) |
| f_2  |   (x) y    |   dx (dy)  |   dx  dy   | ((dx)(dy)) |  (dx) dy   |
| f_4  |    x (y)   |  (dx) dy   | ((dx)(dy)) |   dx  dy   |   dx (dy)  |
| f_8  |    x  y    | ((dx)(dy)) |  (dx) dy   |   dx (dy)  |   dx  dy   |
o------o------------o------------o------------o------------o------------o
| f_3  |   (x)      |   dx       |   dx       |   dx       |   dx       |
| f_12 |    x       |   dx       |   dx       |   dx       |   dx       |
o------o------------o------------o------------o------------o------------o
| f_6  |   (x, y)   |  (dx, dy)  |  (dx, dy)  |  (dx, dy)  |  (dx, dy)  |
| f_9  |  ((x, y))  |  (dx, dy)  |  (dx, dy)  |  (dx, dy)  |  (dx, dy)  |
o------o------------o------------o------------o------------o------------o
| f_5  |      (y)   |      dy    |      dy    |      dy    |      dy    |
| f_10 |       y    |      dy    |      dy    |      dy    |      dy    |
o------o------------o------------o------------o------------o------------o
| f_7  |   (x  y)   | ((dx)(dy)) |  (dx) dy   |   dx (dy)  |   dx  dy   |
| f_11 |  (x (y))   |  (dx) dy   | ((dx)(dy)) |   dx  dy   |   dx (dy)  |
| f_13 |  ((x) y)   |   dx (dy)  |   dx  dy   | ((dx)(dy)) |  (dx) dy   |
| f_14 |  ((x)(y))  |   dx  dy   |   dx (dy)  |  (dx) dy   | ((dx)(dy)) |
o------o------------o------------o------------o------------o------------o
| f_15 |    (())    |     ()     |     ()     |     ()     |     ()     |
o------o------------o------------o------------o------------o------------o

Table 3.  Ef Expanded Over Differential Features {dx, dy}
o------o------------o------------o------------o------------o------------o
|      |     f      |   T_11 f   |   T_10 f   |   T_01 f   |   T_00 f   |
|      |            | Ef| dx·dy  | Ef| dx(dy) | Ef| (dx)dy | Ef|(dx)(dy)|
o------o------------o------------o------------o------------o------------o
| f_0  |     ()     |     ()     |     ()     |     ()     |     ()     |
o------o------------o------------o------------o------------o------------o
| f_1  |   (x)(y)   |    x  y    |    x (y)   |   (x) y    |   (x)(y)   |
| f_2  |   (x) y    |    x (y)   |    x  y    |   (x)(y)   |   (x) y    |
| f_4  |    x (y)   |   (x) y    |   (x)(y)   |    x  y    |    x (y)   |
| f_8  |    x  y    |   (x)(y)   |   (x) y    |    x (y)   |    x  y    |
o------o------------o------------o------------o------------o------------o
| f_3  |   (x)      |    x       |    x       |   (x)      |   (x)      |
| f_12 |    x       |   (x)      |   (x)      |    x       |    x       |
o------o------------o------------o------------o------------o------------o
| f_6  |   (x, y)   |   (x, y)   |  ((x, y))  |  ((x, y))  |   (x, y)   |
| f_9  |  ((x, y))  |  ((x, y))  |   (x, y)   |   (x, y)   |  ((x, y))  |
o------o------------o------------o------------o------------o------------o
| f_5  |      (y)   |       y    |      (y)   |       y    |      (y)   |
| f_10 |       y    |      (y)   |       y    |      (y)   |       y    |
o------o------------o------------o------------o------------o------------o
| f_7  |   (x  y)   |  ((x)(y))  |  ((x) y)   |  (x (y))   |  (x  y)    |
| f_11 |  (x (y))   |  ((x) y)   |  ((x)(y))  |  (x  y)    |  (x (y))   |
| f_13 |  ((x) y)   |  (x (y))   |  (x  y)    |  ((x)(y))  |  ((x) y)   |
| f_14 |  ((x)(y))  |  (x  y)    |  (x (y))   |  ((x) y)   |  ((x)(y))  |
o------o------------o------------o------------o------------o------------o
| f_15 |    (())    |    (())    |    (())    |    (())    |    (())    |
o------o------------o------------o------------o------------o------------o
| Fixed Point Total |      4     |      4     |      4     |     16     |
o-------------------o------------o------------o------------o------------o

Table 4.  Df Expanded Over Differential Features {dx, dy}
o------o------------o------------o------------o------------o------------o
|      |     f      | Df| dx·dy  | Df| dx(dy) | Df| (dx)dy | Df|(dx)(dy)|
o------o------------o------------o------------o------------o------------o
| f_0  |     ()     |     ()     |     ()     |     ()     |     ()     |
o------o------------o------------o------------o------------o------------o
| f_1  |   (x)(y)   |  ((x, y))  |     (y)    |     (x)    |     ()     |
| f_2  |   (x) y    |   (x, y)   |      y     |     (x)    |     ()     |
| f_4  |    x (y)   |   (x, y)   |     (y)    |      x     |     ()     |
| f_8  |    x  y    |  ((x, y))  |      y     |      x     |     ()     |
o------o------------o------------o------------o------------o------------o
| f_3  |   (x)      |    (())    |    (())    |     ()     |     ()     |
| f_12 |    x       |    (())    |    (())    |     ()     |     ()     |
o------o------------o------------o------------o------------o------------o
| f_6  |   (x, y)   |     ()     |    (())    |    (())    |     ()     |
| f_9  |  ((x, y))  |     ()     |    (())    |    (())    |     ()     |
o------o------------o------------o------------o------------o------------o
| f_5  |      (y)   |    (())    |     ()     |    (())    |     ()     |
| f_10 |       y    |    (())    |     ()     |    (())    |     ()     |
o------o------------o------------o------------o------------o------------o
| f_7  |   (x  y)   |  ((x, y))  |      y     |      x     |     ()     |
| f_11 |  (x (y))   |   (x, y)   |     (y)    |      x     |     ()     |
| f_13 |  ((x) y)   |   (x, y)   |      y     |     (x)    |     ()     |
| f_14 |  ((x)(y))  |  ((x, y))  |     (y)    |     (x)    |     ()     |
o------o------------o------------o------------o------------o------------o
| f_15 |    (())    |     ()     |     ()     |     ()     |     ()     |
o------o------------o------------o------------o------------o------------o
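
The entries of these Tables are mechanical consequences of the definition
Ef(x, y, dx, dy) = f(x + dx, y + dy) together with the relation Df = Ef + f
(mod 2), which the Tables bear out, so they are easy to spot-check.  Here is
a small Python sketch (my own addition) that verifies the row for f_8 = x·y
at the cell xy, where Table 1 gives Ef|xy = (dx)(dy) and Table 2 gives
Df|xy = ((dx)(dy)):

  from itertools import product

  def f8(x, y):
      return x & y

  def Ef(f, x, y, dx, dy):
      return f(x ^ dx, y ^ dy)                # Ef  =  f(x + dx, y + dy)

  def Df(f, x, y, dx, dy):
      return Ef(f, x, y, dx, dy) ^ f(x, y)    # Df  =  Ef + f  (mod 2)

  for dx, dy in product((0, 1), repeat=2):
      assert Ef(f8, 1, 1, dx, dy) == (1 ^ dx) & (1 ^ dy)         # (dx)(dy)
      assert Df(f8, 1, 1, dx, dy) == 1 ^ ((1 ^ dx) & (1 ^ dy))   # ((dx)(dy))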

If the medium truly is the message,
the blank slate is the innate idea.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Note 7

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

If you think that I linger in the realm of logical difference calculus
out of sheer vacillation about getting down to the differential proper,
it is probably out of a prior expectation that you derive from the art
or the long-engrained practice of real analysis.  But the fact is that
ordinary calculus only rushes on to the sundry orders of approximation
because the strain of comprehending the full import of E and D at once
overwhelms its discrete and finite powers to grasp them.  But here, in
the fully serene idylls of ZOL, we find ourselves fitted out with all
the compass of wit we could ever wish for to explore their effects
with care.

So let us do just that.

I will first rationalize the novel grouping of propositional forms
in the last set of Tables, as that will extend a gentle invitation
to the mathematical subject of "group theory", and demonstrate its
relevance to differential logic in a strikingly apt and useful way.
The data for that account is contained in Table 3 above.

The shift operator E can be understood as enacting a substitution operation
on the proposition that is given as its argument.  In our immediate example,
we have the following data and definition:

E : (U -> B)  ->  (EU -> B),

E :  f(x, y)  ->  Ef(x, y, dx, dy),

Ef(x, y, dx, dy)  =  f(x + dx, y + dy).

Therefore, if we evaluate Ef at particular values of dx and dy,
for example, dx = i and dy = j, where i, j are in B, we obtain:

E_ij : (U -> B)  ->  (U -> B),

E_ij :    f      ->  E_ij f,

E_ij f  =  Ef | <dx = i, dy = j>  =  f(x + i, y + j).

The notation is a little bit awkward, but the data of the Table should
make the sense clear.  The important thing to observe is that E_ij has
the effect of transforming each proposition f : U -> B into some other
proposition f' : U -> B.  As it happens, the action is one-to-one and
onto for each E_ij, so the gang of four operators {E_ij : i, j in B}
is an example of what is called a "transformation group" on the set
of sixteen propositions.  Bowing to a longstanding local and linear
tradition, I will therefore redub the four elements of this group
as T_00, T_01, T_10, T_11, to bear in mind their transformative
character, or nature, as the case may be.  Abstractly viewed,
this group of order four has the following operation table:

o----------o----------o----------o----------o----------o
|    ·     %   T_00   |   T_01   |   T_10   |   T_11   |
o==========o==========o==========o==========o==========o
|   T_00   %   T_00   |   T_01   |   T_10   |   T_11   |
o----------o----------o----------o----------o----------o
|   T_01   %   T_01   |   T_00   |   T_11   |   T_10   |
o----------o----------o----------o----------o----------o
|   T_10   %   T_10   |   T_11   |   T_00   |   T_01   |
o----------o----------o----------o----------o----------o
|   T_11   %   T_11   |   T_10   |   T_01   |   T_00   |
o----------o----------o----------o----------o----------o

It happens that there are just two possible groups of 4 elements.
One is the cyclic group Z_4 (German "Zyklus"), which this is not.
The other is Klein's four-group V_4 (German "Vier"), which it is.

More concretely viewed, the group as a whole pushes the set
of sixteen propositions around in such a way that they fall
into seven natural classes, called "orbits".  One says that
the orbits are preserved by the action of the group.  There
is an "Orbit Lemma" of immense utility to "those who count"
which, depending on your upbringing, you may associate with
the names of Burnside, Cauchy, Frobenius, or some subset or
superset of these three.  It vouches that the number of orbits
is equal to the mean number of fixed points, in other words,
the total number of points (in our case, propositions) that
are left unmoved by the separate operations, divided by the
order of the group.  In this instance, T_00 operates as the
group identity, fixing all 16 propositions, while the other
three group elements fix 4 propositions each, and so we get:
Number of orbits  =  (4 + 4 + 4 + 16) / 4  =  7.  Amazing!
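
The count is easy to reproduce by brute force.  A Python sketch (an
illustration of my own, not part of the original notes) that lets each
E_ij act on the sixteen propositions, tallies the fixed points, and
applies the counting lemma:

  from itertools import product

  cells = list(product((0, 1), repeat=2))

  def prop(i):
      # proposition f_i as a map from cells to B, indexed as in Table 0
      bits = (int(b) for b in format(i, "04b"))
      return dict(zip([(1, 1), (1, 0), (0, 1), (0, 0)], bits))

  props = [prop(i) for i in range(16)]

  def act(i, j, p):
      # E_ij p (x, y)  =  p(x + i, y + j)
      return {(x, y): p[(x ^ i, y ^ j)] for x, y in cells}

  fixed = [sum(act(i, j, p) == p for p in props)
           for i, j in product((0, 1), repeat=2)]
  print(fixed)               # [16, 4, 4, 4]
  print(sum(fixed) // 4)     # 7 orbits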

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Note 8

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

We have been contemplating functions of the type f : U -> B
and studying the action of the operators E and D on this family.
These functions, which we may identify for our present aims
with propositions, inasmuch as they capture their abstract
forms, are logical analogues of "scalar potential fields".
These are the sorts of fields that are so picturesquely
presented in elementary calculus and physics textbooks
by images of snow-covered hills and parties of skiers
who trek down their slopes like least action heroes.
The analogous scene in propositional logic presents
us with forms more reminiscent of plateaunic idylls,
being all plains at one of two levels, the mesas of
verity and falsity, as it were, with nary a niche
to inhabit between them, restricting our options
for a sporting gradient of downhill dynamics to
just one of two, standing still on level ground
or falling off a bluff.

We are still working well within the logical analogue of the
classical finite difference calculus, taking in the novelties
that the logical transmutation of familiar elements is able to
bring to light.  Soon we will take up several different notions
of approximation relationships that may be seen to organize the
space of propositions, and these will allow us to define several
different forms of differential analysis applying to propositions.
In time we will find reason to consider more general types of maps,
having concrete types of the form X_1 x ... x X_k -> Y_1 x ... x Y_n
and abstract types B^k -> B^n.  We will think of these mappings as
transforming universes of discourse into themselves or into others,
in short, as "transformations of discourse".

Before we continue with this itinerary, however, I would like to highlight
another sort of "differential aspect" that concerns the "boundary operator"
or the "marked connective" that serves as one of the two basic connectives
in the cactus language for ZOL.

For example, consider the proposition f of concrete type f : X x Y x Z -> B
and abstract type f : B^3 -> B that is written "(x, y, z)" in cactus syntax.
Taken as an assertion in what Peirce called the "existential interpretation",
(x, y, z) says that just one of x, y, z is false.  It is useful to consider
this assertion in relation to the conjunction xyz of the features that are
engaged as its arguments.  A venn diagram of (x, y, z) looks like this:

o-----------------------------------------------------------o
| U                                                         |
|                                                           |
|                      o-------------o                      |
|                     /               \                     |
|                    /                 \                    |
|                   /                   \                   |
|                  /                     \                  |
|                 /                       \                 |
|                o            x            o                |
|                |                         |                |
|                |                         |                |
|                |                         |                |
|                |                         |                |
|                |                         |                |
|             o--o----------o   o----------o--o             |
|            /    \%%%%%%%%%%\ /%%%%%%%%%%/    \            |
|           /      \%%%%%%%%%%o%%%%%%%%%%/      \           |
|          /        \%%%%%%%%/ \%%%%%%%%/        \          |
|         /          \%%%%%%/   \%%%%%%/          \         |
|        /            \%%%%/     \%%%%/            \        |
|       o              o--o-------o--o              o       |
|       |                 |%%%%%%%|                 |       |
|       |                 |%%%%%%%|                 |       |
|       |                 |%%%%%%%|                 |       |
|       |                 |%%%%%%%|                 |       |
|       |                 |%%%%%%%|                 |       |
|       o        y        o%%%%%%%o        z        o       |
|        \                 \%%%%%/                 /        |
|         \                 \%%%/                 /         |
|          \                 \%/                 /          |
|           \                 o                 /           |
|            \               / \               /            |
|             o-------------o   o-------------o             |
|                                                           |
|                                                           |
o-----------------------------------------------------------o

In relation to the center cell indicated by the conjunction xyz,
the region indicated by (x, y, z) comprises the "adjacent" or the
"bordering" cells, that is, the cells that lie just across the
boundary of the center cell, as if reached by way of Leibniz's
"minimal changes" from the point of origin, here, xyz.

The same sort of boundary relationship holds for any cell of origin that
one might elect to indicate, say, by means of the conjunction of positive
or negative basis features u_1 · ... · u_k, with u_j = x_j or u_j = (x_j),
for j = 1 to k.  The proposition (u_1, ..., u_k) indicates the disjunctive
region consisting of the cells that are just next door to u_1 · ... · u_k.
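
A quick Python check of this reading (my own illustration):  the models
of (x, y, z), that is, the cells where exactly one of x, y, z is false,
are exactly the three cells at unit distance from the cell xyz = (1, 1, 1).

  from itertools import product

  def boundary(*args):
      return sum(1 - a for a in args) == 1          # "just one is false"

  models = [cell for cell in product((0, 1), repeat=3) if boundary(*cell)]
  print(models)    # [(0, 1, 1), (1, 0, 1), (1, 1, 0)]
  assert all(sum(1 ^ a for a in cell) == 1 for cell in models)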

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Note 9

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

| Consider what effects that might conceivably have
| practical bearings you conceive the objects of your
| conception to have.  Then, your conception of those
| effects is the whole of your conception of the object.
|
| Charles Sanders Peirce, "The Maxim of Pragmatism", CP 5.438.

One other subject that it would be opportune to mention at this point,
while we have an object example of a mathematical group fresh in mind,
is the relationship between the pragmatic maxim and what are commonly
known in mathematics as "representation principles".  As it turns out,
with regard to its formal characteristics, the pragmatic maxim unites
the aspects of a representation principle with the attributes of what
would ordinarily be known as a "closure principle".  We will consider
the form of closure that is invoked by the pragmatic maxim on another
occasion, focusing here and now on the topic of group representations.

Let us return to the example of the so-called "four-group" V_4.
We encountered this group in one of its concrete representations,
namely, as a "transformation group" that acts on a set of objects,
in this particular case a set of sixteen functions or propositions.
Forgetting about the set of objects that the group transforms among
themselves, we may take the abstract view of the group's operational
structure, say, in the form of the group operation table copied here:

o---------o---------o---------o---------o---------o
|    ·    %    e    |    f    |    g    |    h    |
o=========o=========o=========o=========o=========o
|    e    %    e    |    f    |    g    |    h    |
o---------o---------o---------o---------o---------o
|    f    %    f    |    e    |    h    |    g    |
o---------o---------o---------o---------o---------o
|    g    %    g    |    h    |    e    |    f    |
o---------o---------o---------o---------o---------o
|    h    %    h    |    g    |    f    |    e    |
o---------o---------o---------o---------o---------o

This table is abstractly the same as, or isomorphic to, the versions with
the E_ij operators and the T_ij transformations that we discussed earlier.
That is to say, the story is the same -- only the names have been changed.
An abstract group can have a multitude of significantly and superficially
different representations.  Even after we have long forgotten the details
of the particular representation that we may have come in with, there are
species of concrete representations, called the "regular representations",
that are always readily available, as they can be generated from the mere
data of the abstract operation table itself.

For example, select a group element from the top margin of the Table,
and "consider its effects" on each of the group elements as they are
listed along the left margin.  We may record these effects as Peirce
usually did, as a logical "aggregate" of elementary dyadic relatives,
that is to say, a disjunction or a logical sum whose terms represent
the ordered pairs of <input : output> transactions that are produced
by each group element in turn.  This yields what is usually known as
one of the "regular representations" of the group, specifically, the
"first", the "post-", or the "right" regular representation.  It has
long been conventional to organize the terms in the form of a matrix:

Reading "+" as a logical disjunction:

G  =  e  +  f  +  g  +  h,

And so, by expanding effects, we get:

G  =  e:e  +  f:f  +  g:g  +  h:h

   +  e:f  +  f:e  +  g:h  +  h:g

   +  e:g  +  f:h  +  g:e  +  h:f

   +  e:h  +  f:g  +  g:f  +  h:e

More on the pragmatic maxim as a representation principle later.
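
As a concrete illustration (added here; the dictionary literal simply
transcribes the operation table above), the post-representation can be
generated mechanically in Python by reading off, for each element g,
the aggregate of pairs x:(x·g):

  table = {
      ('e', 'e'): 'e', ('e', 'f'): 'f', ('e', 'g'): 'g', ('e', 'h'): 'h',
      ('f', 'e'): 'f', ('f', 'f'): 'e', ('f', 'g'): 'h', ('f', 'h'): 'g',
      ('g', 'e'): 'g', ('g', 'f'): 'h', ('g', 'g'): 'e', ('g', 'h'): 'f',
      ('h', 'e'): 'h', ('h', 'f'): 'g', ('h', 'g'): 'f', ('h', 'h'): 'e',
  }
  elements = ['e', 'f', 'g', 'h']

  def post_rep(g):
      # one elementary relative x:(x·g) for each row label x
      return [(x, table[(x, g)]) for x in elements]

  for g in elements:
      print(g, " = ", "  +  ".join(f"{x}:{y}" for x, y in post_rep(g)))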

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Note 10

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

| Consider what effects that might conceivably have
| practical bearings you conceive the objects of your
| conception to have.  Then, your conception of those
| effects is the whole of your conception of the object.
|
| Charles Sanders Peirce, "The Maxim of Pragmatism", CP 5.438.

The genealogy of this conception of pragmatic representation is very intricate.
I will delineate some details that I presently fancy I remember clearly enough,
subject to later correction.  Without checking historical accounts, I will not
be able to pin down anything like a real chronology, but most of these notions
were standard furnishings of the 19th Century mathematical study, and only the
last few items date as late as the 1920's.

The idea about the regular representations of a group is universally known
as "Cayley's Theorem", usually in the form:  "Every group is isomorphic to
a subgroup of Aut(S), the group of automorphisms of an appropriate set S".
There is a considerable generalization of these regular representations to
a broad class of relational algebraic systems in Peirce's earliest papers.
The crux of the whole idea is this:

| Consider the effects of the symbol, whose meaning you wish to investigate,
| as they play out on "all" of the different stages of context on which you
| can imagine that symbol playing a role.

This idea of contextual definition is basically the same as Jeremy Bentham's
notion of "paraphrasis", a "method of accounting for fictions by explaining
various purported terms away" (Quine, in Van Heijenoort, page 216).  Today
we'd call these constructions "term models".  This, again, is the big idea
behind Schönfinkel's combinators {S, K, I}, and hence of lambda calculus,
and I reckon you know where that leads.
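
(By way of a toy illustration that I am adding here:  the combinators are
"contextually defined" purely by their effects on arguments, and the
identity I can be recovered as S K K, which a few lines of Python confirm.)

  S = lambda f: lambda g: lambda x: f(x)(g(x))
  K = lambda x: lambda y: x
  I = S(K)(K)

  assert K('a')('b') == 'a'
  assert I(42) == 42       # S K K x  =  K x (K x)  =  x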

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Note 11

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

| Consider what effects that might 'conceivably'
| have practical bearings you 'conceive' the
| objects of your 'conception' to have.  Then,
| your 'conception' of those effects is the
| whole of your 'conception' of the object.
|
| Charles Sanders Peirce,
| "Maxim of Pragmaticism", CP 5.438.

Continuing to draw on the reduced example of group representations,
I would like to draw out a few of the finer points and problems of
regarding the maxim of pragmatism as a principle of representation.

Let us revisit the example of an abstract group that we had befour:

Table 1.  Klein Four-Group V_4
o---------o---------o---------o---------o---------o
|    ·    %    e    |    f    |    g    |    h    |
o=========o=========o=========o=========o=========o
|    e    %    e    |    f    |    g    |    h    |
o---------o---------o---------o---------o---------o
|    f    %    f    |    e    |    h    |    g    |
o---------o---------o---------o---------o---------o
|    g    %    g    |    h    |    e    |    f    |
o---------o---------o---------o---------o---------o
|    h    %    h    |    g    |    f    |    e    |
o---------o---------o---------o---------o---------o

I presented the regular post-representation
of the four-group V_4 in the following form:

Reading "+" as a logical disjunction:

  G  =  e  +  f  +  g  +  h

And so, by expanding effects, we get:

  G  =  e:e  +  f:f  +  g:g  +  h:h

      +  e:f  +  f:e  +  g:h  +  h:g

      +  e:g  +  f:h  +  g:e  +  h:f

      +  e:h  +  f:g  +  g:f  +  h:e

This presents the group in one big bunch,
and there are occasions when one regards
it this way, but that is not the typical
form of presentation that we'd encounter.
More likely, the story would go a little
bit like this:

I cannot remember any of my math teachers
ever invoking the pragmatic maxim by name,
but it would be a very regular occurrence
for such mentors and tutors to set up the
subject in this wise:  Suppose you forget
what a given abstract group element means,
that is, in effect, 'what it is'.  Then a
sure way to jog your sense of 'what it is'
is to build a regular representation from
the formal materials that are necessarily
left lying about on that abstraction site.

Working through the construction for each
one of the four group elements, we arrive
at the following exegeses of their senses,
giving their regular post-representations:

  e  =  e:e  +  f:f  +  g:g  +  h:h

  f  =  e:f  +  f:e  +  g:h  +  h:g

  g  =  e:g  +  f:h  +  g:e  +  h:f

  h  =  e:h  +  f:g  +  g:f  +  h:e

So if somebody asks you, say, "What is g?",
you can say, "I don't know for certain but
in practice its effects go a bit like this:
Converting e to g, f to h, g to e, h to f".

I will have to check this out later on, but my impression is
that Peirce tended to lean toward the other brand of regular,
the "second", the "left", or the "ante-representation" of the
groups that he treated in his earliest manuscripts and papers.
I believe that this was because he thought of the actions on
the pattern of dyadic relative terms like the "aftermath of".

Working through this alternative for each
one of the four group elements, we arrive
at the following exegeses of their senses,
giving their regular ante-representations:

  e  =  e:e  +  f:f  +  g:g  +  h:h

  f  =  f:e  +  e:f  +  h:g  +  g:h

  g  =  g:e  +  h:f  +  e:g  +  f:h

  h  =  h:e  +  g:f  +  f:g  +  e:h

Your paraphrastic interpretation of what this all
means would come out precisely the same as before.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Note 12

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Erratum

Oops!  I think that I have just confounded two entirely different issues:
1.  The substantial difference between right and left regular representations.
2.  The inessential difference between two conventions of presenting matrices.
I will sort this out and correct it later, as need be.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Note 13

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

| Consider what effects that might 'conceivably'
| have practical bearings you 'conceive' the
| objects of your 'conception' to have.  Then,
| your 'conception' of those effects is the
| whole of your 'conception' of the object.
|
| Charles Sanders Peirce,
| "Maxim of Pragmaticism", CP 5.438.

Let me return to Peirce's early papers on the algebra of relatives
to pick up the conventions that he used there, and then rewrite my
account of regular representations in a way that conforms to those.

Peirce expresses the action of an "elementary dual relative" like so:

| [Let] A:B be taken to denote
| the elementary relative which
| multiplied into B gives A.
|
| Peirce, 'Collected Papers', CP 3.123.

And though he is well aware that it is not at all necessary to arrange
elementary relatives into arrays, matrices, or tables, when he does so
he tends to prefer organizing dyadic relations in the following manner:

|  A:A  A:B  A:C  |
|                 |
|  B:A  B:B  B:C  |
|                 |
|  C:A  C:B  C:C  |

That conforms to the way that the last school of thought
I matriculated into stipulated that we tabulate material:

|  e_11  e_12  e_13  |
|                    |
|  e_21  e_22  e_23  |
|                    |
|  e_31  e_32  e_33  |

So, for example, let us suppose that we have the small universe {A, B, C},
and the 2-adic relation m = "mover of" that is represented by this matrix:

m  =

|  m_AA (A:A)   m_AB (A:B)   m_AC (A:C)  |
|                                         |
|  m_BA (B:A)   m_BB (B:B)   m_BC (B:C)  |
|                                         |
|  m_CA (C:A)   m_CB (C:B)   m_CC (C:C)  |

Also, let m be such that
A is a mover of A and B,
B is a mover of B and C,
C is a mover of C and A.

In sum:

m  =

|  1 · (A:A)   1 · (A:B)   0 · (A:C)  |
|                                      |
|  0 · (B:A)   1 · (B:B)   1 · (B:C)  |
|                                      |
|  1 · (C:A)   0 · (C:B)   1 · (C:C)  |

For the sake of orientation and motivation,
compare with Peirce's notation in CP 3.329.

I think that will serve to fix notation
and set up the remainder of the account.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Note 14

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

| Consider what effects that might 'conceivably'
| have practical bearings you 'conceive' the
| objects of your 'conception' to have.  Then,
| your 'conception' of those effects is the
| whole of your 'conception' of the object.
|
| Charles Sanders Peirce,
| "Maxim of Pragmaticism", CP 5.438.

I am beginning to see how I got confused.
It is common in algebra to switch around
between different conventions of display,
as the momentary fancy happens to strike,
and I see that Peirce is no different in
this sort of shiftiness than anyone else.
A changeover appears to occur especially
whenever he shifts from logical contexts
to algebraic contexts of application.

In the paper "On the Relative Forms of Quaternions" (CP 3.323),
we observe Peirce providing the following sorts of explanation:

| If X, Y, Z denote the three rectangular components of a vector, and W denote
| numerical unity (or a fourth rectangular component, involving space of four
| dimensions), and (Y:Z) denote the operation of converting the Y component
| of a vector into its Z component, then
|
|     1  =  (W:W) + (X:X) + (Y:Y) + (Z:Z)
|
|     i  =  (X:W) - (W:X) - (Y:Z) + (Z:Y)
|
|     j  =  (Y:W) - (W:Y) - (Z:X) + (X:Z)
|
|     k  =  (Z:W) - (W:Z) - (X:Y) + (Y:X)
|
| In the language of logic (Y:Z) is a relative term whose relate is
| a Y component, and whose correlate is a Z component.  The law of
| multiplication is plainly (Y:Z)(Z:X) = (Y:X), (Y:Z)(X:W) = 0,
| and the application of these rules to the above values of
| 1, i, j, k gives the quaternion relations
|
|     i^2  =  j^2  =  k^2  =  -1,
|
|     ijk  =  -1,
|
|     etc.
|
| The symbol a(Y:Z) denotes the changing of Y to Z and the
| multiplication of the result by 'a'.  If the relatives be
| arranged in a block
|
|     W:W     W:X     W:Y     W:Z
|
|     X:W     X:X     X:Y     X:Z
|
|     Y:W     Y:X     Y:Y     Y:Z
|
|     Z:W     Z:X     Z:Y     Z:Z
|
| then the quaternion w + xi + yj + zk
| is represented by the matrix of numbers
|
|     w      -x      -y      -z
|
|     x       w      -z       y
|
|     y       z       w      -x
|
|     z      -y       x       w
|
| The multiplication of such matrices follows the same laws as the
| multiplication of quaternions.  The determinant of the matrix =
| the fourth power of the tensor of the quaternion.
|
| The imaginary x + y(-1)^(1/2) may likewise be represented by the matrix
|
|      x      y
|
|     -y      x
|
| and the determinant of the matrix = the square of the modulus.
|
| Charles Sanders Peirce, 'Collected Papers', CP 3.323.
|'Johns Hopkins University Circulars', No. 13, p. 179, 1882.
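
(A numerical aside that I am adding for illustration:  the 4×4 matrix
that Peirce gives for w + xi + yj + zk can be checked in a few lines of
Python to satisfy the quaternion relations quoted above.)

  def quat(w, x, y, z):
      return [[ w, -x, -y, -z],
              [ x,  w, -z,  y],
              [ y,  z,  w, -x],
              [ z, -y,  x,  w]]

  def mul(a, b):
      return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
              for i in range(4)]

  i, j, k   = quat(0, 1, 0, 0), quat(0, 0, 1, 0), quat(0, 0, 0, 1)
  minus_one = quat(-1, 0, 0, 0)

  assert mul(i, i) == mul(j, j) == mul(k, k) == minus_one
  assert mul(mul(i, j), k) == minus_one        # ijk = -1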

This way of talking is the mark of a person who opts
to multiply his matrices "on the right", as they say.
Yet Peirce still continues to call the first element
of the ordered pair (I:J) its "relate" while calling
the second element of the pair (I:J) its "correlate".
That doesn't comport very well, so far as I can tell,
with his customary reading of relative terms, suited
more to the multiplication of matrices "on the left".

So I still have a few wrinkles to iron out before
I can give this story a smooth enough consistency.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Note 15

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

| Consider what effects that might 'conceivably'
| have practical bearings you 'conceive' the
| objects of your 'conception' to have.  Then,
| your 'conception' of those effects is the
| whole of your 'conception' of the object.
|
| Charles Sanders Peirce,
| "Maxim of Pragmaticism", CP 5.438.

I have been planning for quite some time now to make my return to Peirce's
skyshaking "Description of a Notation for the Logic of Relatives" (1870),
and I can see that it's just about time to get down tuit, so let this
current bit of rambling inquiry function as the preamble to that.
All we need at the present, though, is a modus vivendi/operandi
for telling what is substantial from what is inessential in
the brook between symbolic conceits and dramatic actions
that we find afforded by means of the pragmatic maxim.

Back to our "subinstance", the example in support of our first example.
I will now reconstruct it in a way that may prove to be less confusing.

Let us make up the model universe $1$ = A + B + C and the 2-adic relation
n = "noder of", as when "X is a data record that contains a pointer to Y".
That interpretation is not important, it's just for the sake of intuition.
In general terms, the 2-adic relation n can be represented by this matrix:

n  =

|  n_AA (A:A)   n_AB (A:B)   n_AC (A:C)  |
|                                         |
|  n_BA (B:A)   n_BB (B:B)   n_BC (B:C)  |
|                                         |
|  n_CA (C:A)   n_CB (C:B)   n_CC (C:C)  |

Also, let n be such that
A is a noder of A and B,
B is a noder of B and C,
C is a noder of C and A.

Filling in the instantial values of the "coefficients" n_ij,
as the indices i and j range over the universe of discourse:

n  =

|  1 · (A:A)   1 · (A:B)   0 · (A:C)  |
|                                      |
|  0 · (B:A)   1 · (B:B)   1 · (B:C)  |
|                                      |
|  1 · (C:A)   0 · (C:B)   1 · (C:C)  |

In Peirce's time, and even in some circles of mathematics today,
the information indicated by the elementary relatives (I:J), as
I, J range over the universe of discourse, would be referred to
as the "umbral elements" of the algebraic operation represented
by the matrix, though I seem to recall that Peirce preferred to
call these terms the "ingredients".  When this ordered basis is
understood well enough, one will tend to drop any mention of it
from the matrix itself, leaving us nothing but these bare bones:

n  =

|  1  1  0  |
|           |
|  0  1  1  |
|           |
|  1  0  1  |

However the specification may come to be written, this
is all just convenient schematics for stipulating that:

n  =  A:A  +  B:B  +  C:C  +  A:B  +  B:C  +  C:A
  −
 
  −
Recognizing !1! = A:A + B:B + C:C to be the identity transformation,
  −
the 2-adic relation n = "noder of" may be represented by an element
  −
!1! + A:B + B:C + C:A of the so-called "group ring", all of which
  −
just makes this element a special sort of linear transformation.
  −
 
  −
Up to this point, we are still reading the elementary relatives of
  −
the form I:J in the way that Peirce reads them in logical contexts:
  −
I is the relate, J is the correlate, and in our current example we
  −
read I:J, or more exactly, n_ij = 1, to say that I is a noder of J.
  −
This is the mode of reading that we call "multiplying on the left".
  −
 
  −
In the algebraic, permutational, or transformational contexts of
  −
application, however, Peirce converts to the alternative mode of
  −
reading:  although still calling I the relate and J the correlate,
the elementary relative I:J now means that I gets changed into J.
  −
In this scheme of reading, the transformation A:B + B:C + C:A is
  −
a permutation of the aggregate $1$ = A + B + C, or what we would
  −
now call the set {A, B, C}, in particular, it is the permutation
  −
that is otherwise notated as:
  −
 
  −
( A B C )
  −
<      >
  −
( B C A )
  −
 
  −
This is consistent with the convention that Peirce uses in
  −
the paper "On a Class of Multiple Algebras" (CP 3.324-327).
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Note 16
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
| Consider what effects that might 'conceivably'
  −
| have practical bearings you 'conceive' the
  −
| objects of your 'conception' to have.  Then,
  −
| your 'conception' of those effects is the
  −
| whole of your 'conception' of the object.
  −
|
  −
| Charles Sanders Peirce,
  −
| "Maxim of Pragmaticism", CP 5.438.
  −
 
  −
We have been contemplating the virtues and the utilities of
  −
the pragmatic maxim as a hermeneutic heuristic, specifically,
  −
as a principle of interpretation that guides us in finding a
  −
clarifying representation for a problematic corpus of symbols
  −
in terms of their actions on other symbols or their effects on
  −
the syntactic contexts in which we conceive them to be distributed.
  −
I started off considering the regular representations of groups
  −
as constituting what appears to be one of the simplest possible
  −
applications of this overall principle of representation.
  −
 
  −
There are a few problems of implementation that have to be worked out
  −
in practice, most of which are cleared up by keeping in mind which of
  −
several possible conventions we have chosen to follow at a given time.
  −
But there does appear to remain this rather more substantial question:
  −
 
  −
Are the effects we seek relates or correlates, or does it even matter?
  −
 
  −
I will have to leave that question as it is for now,
  −
in hopes that a solution will evolve itself in time.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Note 17
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
| Consider what effects that might 'conceivably'
  −
| have practical bearings you 'conceive' the
  −
| objects of your 'conception' to have.  Then,
  −
| your 'conception' of those effects is the
  −
| whole of your 'conception' of the object.
  −
|
  −
| Charles Sanders Peirce,
  −
| "Maxim of Pragmaticism", CP 5.438.
  −
 
  −
There are big reasons and little reasons for caring about this humble example.
  −
The little reasons we find all under our feet.  One big reason I can now
  −
quite blazonly enounce in the fashion of this not so subtle subtitle:
  −
 
  −
Obstacles to Applying the Pragmatic Maxim
  −
 
  −
No sooner do you get a good idea and try to apply it
  −
than you find that a motley array of obstacles arise.
  −
 
  −
It seems as if I am constantly lamenting the fact these days that people,
  −
and even admitted Peircean persons, do not in practice more consistently
  −
apply the maxim of pragmatism to the purpose for which it is purportedly
  −
intended by its author.  That would be the clarification of concepts, or
  −
intellectual symbols, to the point where their inherent senses, or their
  −
lacks thereof, would be rendered manifest to all and sundry interpreters.
  −
 
  −
There are big obstacles and little obstacles to applying the pragmatic maxim.
  −
In good subgoaling fashion, I will merely mention a few of the bigger blocks,
  −
as if in passing, and then get down to the devilish details that immediately
  −
obstruct our way.
  −
 
  −
Obstacle 1.  People do not always read the instructions very carefully.
  −
There is a tendency in readers of particular prior persuasions to blow
  −
the problem all out of proportion, to think that the maxim is meant to
  −
reveal the absolutely positive and the totally unique meaning of every
  −
preconception to which they might deign or elect to apply it.  Reading
  −
the maxim with an even minimal attention, you can see that it promises
  −
no such finality of unindexed sense, but ties what you conceive to you.
  −
I have lately come to wonder at the tenacity of this misinterpretation.
  −
Perhaps people reckon that nothing less would be worth their attention.
  −
I am not sure.  I can only say the achievement of more modest goals is
  −
the sort of thing on which our daily life depends, and there can be no
  −
final end to inquiry nor any ultimate community without a continuation
  −
of life, and that means life on a day to day basis.  All of which only
  −
brings me back to the point of persisting with local meantime examples,
  −
because if we can't apply the maxim there, we can't apply it anywhere.
  −
 
  −
And now I need to go out of doors and weed my garden for a time ...
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Note 18
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
| Consider what effects that might 'conceivably'
  −
| have practical bearings you 'conceive' the
  −
| objects of your 'conception' to have.  Then,
  −
| your 'conception' of those effects is the
  −
| whole of your 'conception' of the object.
  −
|
  −
| Charles Sanders Peirce,
  −
| "Maxim of Pragmaticism", CP 5.438.
  −
 
  −
Obstacles to Applying the Pragmatic Maxim
  −
 
  −
Obstacle 2.  Applying the pragmatic maxim, even with a moderate aim, can be hard.
  −
I think that my present example, deliberately impoverished as it is, affords us
an embarrassing richness of evidence of just how complex the simple can be.
  −
 
  −
All the better reason for me to see if I can finish it up before moving on.
  −
 
  −
Expressed most simply, the idea is to replace the question of "what it is",
  −
which modest people know is far too difficult for them to answer right off,
  −
with the question of "what it does", which most of us know a modicum about.
  −
 
  −
In the case of regular representations of groups we found
  −
a non-plussing surplus of answers to sort our way through.
  −
So let us track back one more time to see if we can learn
  −
any lessons that might carry over to more realistic cases.
  −
 
  −
Here is the operation table of V_4 once again:
  −
 
  −
Table 1.  Klein Four-Group V_4
  −
o---------o---------o---------o---------o---------o
  −
|        %        |        |        |        |
  −
|    ·    %    e    |    f    |    g    |    h    |
  −
|        %        |        |        |        |
  −
o=========o=========o=========o=========o=========o
  −
|        %        |        |        |        |
  −
|    e    %    e    |    f    |    g    |    h    |
  −
|        %        |        |        |        |
  −
o---------o---------o---------o---------o---------o
  −
|        %        |        |        |        |
  −
|    f    %    f    |    e    |    h    |    g    |
  −
|        %        |        |        |        |
  −
o---------o---------o---------o---------o---------o
  −
|        %        |        |        |        |
  −
|    g    %    g    |    h    |    e    |    f    |
  −
|        %        |        |        |        |
  −
o---------o---------o---------o---------o---------o
  −
|        %        |        |        |        |
  −
|    h    %    h    |    g    |    f    |    e    |
  −
|        %        |        |        |        |
  −
o---------o---------o---------o---------o---------o
  −
 
  −
A group operation table is really just a device for
  −
recording a certain 3-adic relation, to be specific,
  −
the set of triples of the form <x, y, z> satisfying
  −
the equation x·y = z where · is the group operation.
  −
 
  −
In the case of V_4 = (G, ·), where G is the "underlying set"
  −
{e, f, g, h}, we have the 3-adic relation L(V_4) c G x G x G
  −
whose triples are listed below:
  −
 
  −
|  <e, e, e>
  −
|  <e, f, f>
  −
|  <e, g, g>
  −
|  <e, h, h>
  −
|
  −
|  <f, e, f>
  −
|  <f, f, e>
  −
|  <f, g, h>
  −
|  <f, h, g>
  −
|
  −
|  <g, e, g>
  −
|  <g, f, h>
  −
|  <g, g, e>
  −
|  <g, h, f>
  −
|
  −
|  <h, e, h>
  −
|  <h, f, g>
  −
|  <h, g, f>
  −
|  <h, h, e>
  −
 
  −
It is part of the definition of a group that the 3-adic
  −
relation L c G^3 is actually a function L : G x G -> G.
  −
It is from this functional perspective that we can see
  −
an easy way to derive the two regular representations.
  −
Since we have a function of the type L : G x G -> G,
  −
we can define a couple of substitution operators:
  −
 
  −
1.  Sub(x, <_, y>) puts any specified x into
  −
    the empty slot of the rheme <_, y>, with
  −
    the effect of producing the saturated
  −
    rheme <x, y> that evaluates to x·y.
  −
 
  −
2.  Sub(x, <y, _>) puts any specified x into
  −
    the empty slot of the rheme <y, >, with
  −
    the effect of producing the saturated
  −
    rheme <y, x> that evaluates to y·x.
  −
 
  −
In (1), we consider the effects of each x in its
  −
practical bearing on contexts of the form <_, y>,
  −
as y ranges over G, and the effects are such that
  −
x takes <_, y> into x·y, for y in G, all of which
  −
is summarily notated as x = {(y : x·y) : y in G}.
  −
The pairs (y : x·y) can be found by picking an x
  −
from the left margin of the group operation table
  −
and considering its effects on each y in turn as
  −
these run across the top margin.  This aspect of
  −
pragmatic definition we recognize as the regular
  −
ante-representation:
  −
 
  −
    e  =  e:e  +  f:f  +  g:g  +  h:h
  −
 
  −
    f  =  e:f  +  f:e  +  g:h  +  h:g
  −
 
  −
    g  =  e:g  +  f:h  +  g:e  +  h:f
  −
 
  −
    h  =  e:h  +  f:g  +  g:f  +  h:e
  −
 
  −
In (2), we consider the effects of each x in its
  −
practical bearing on contexts of the form <y, _>,
  −
as y ranges over G, and the effects are such that
  −
x takes <y, _> into y·x, for y in G, all of which
  −
is summarily notated as x = {(y : y·x) : y in G}.
  −
The pairs (y : y·x) can be found by picking an x
  −
from the top margin of the group operation table
  −
and considering its effects on each y in turn as
  −
these run down the left margin.  This aspect of
  −
pragmatic definition we recognize as the regular
  −
post-representation:
  −
 
  −
    e  =  e:e  +  f:f  +  g:g  +  h:h
  −
 
  −
    f  =  e:f  +  f:e  +  g:h  +  h:g
  −
 
  −
    g  =  e:g  +  f:h  +  g:e  +  h:f
  −
 
  −
    h  =  e:h  +  f:g  +  g:f  +  h:e
  −
 
  −
If the ante-rep looks the same as the post-rep,
  −
now that I'm writing them in the same dialect,
  −
that is because V_4 is abelian (commutative),
  −
and so the two representations have the very
  −
same effects on each point of their bearing.
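
If you would like a machine to double-check that bit of bookkeeping, here
is a minimal sketch in Python -- the table is just Table 1 keyed by pairs,
and all of the names are my own -- that derives the ante-representation and
the post-representation straight from the operation table and confirms that
they agree:

  # Sketch:  derive the regular ante- and post-representations of V_4
  # from its operation table and confirm that they coincide.
  G = ["e", "f", "g", "h"]

  # x·y read from Table 1:  row x, column y.
  table = {
      ("e","e"):"e", ("e","f"):"f", ("e","g"):"g", ("e","h"):"h",
      ("f","e"):"f", ("f","f"):"e", ("f","g"):"h", ("f","h"):"g",
      ("g","e"):"g", ("g","f"):"h", ("g","g"):"e", ("g","h"):"f",
      ("h","e"):"h", ("h","f"):"g", ("h","g"):"f", ("h","h"):"e",
  }

  # Ante-representation:  x  |->  the sum of pairs (y : x·y) for y in G.
  ante = {x: {(y, table[(x, y)]) for y in G} for x in G}
  # Post-representation:  x  |->  the sum of pairs (y : y·x) for y in G.
  post = {x: {(y, table[(y, x)]) for y in G} for x in G}

  for x in G:
      print(x, " = ", "  +  ".join(y + ":" + z for (y, z) in sorted(ante[x])))
  print("ante == post :", ante == post)    # True, since V_4 is abelian
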
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Note 19
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
| Consider what effects that might 'conceivably'
  −
| have practical bearings you 'conceive' the
  −
| objects of your 'conception' to have.  Then,
  −
| your 'conception' of those effects is the
  −
| whole of your 'conception' of the object.
  −
|
  −
| Charles Sanders Peirce,
  −
| "Maxim of Pragmaticism", CP 5.438.
  −
 
  −
So long as we're in the neighborhood, we might as well take in
  −
some more of the sights, for instance, the smallest example of
  −
a non-abelian (non-commutative) group.  This is a group of six
  −
elements, say, G = {e, f, g, h, i, j}, with no relation to any
  −
other employment of these six symbols being implied, of course,
  −
and it can be most easily represented as the permutation group
  −
on a set of three letters, say, X = {A, B, C}, usually notated
  −
as G = Sym(X) or more abstractly and briefly, as Sym(3) or S_3.
  −
Here are the permutation (= substitution) operations in Sym(X):
  −
 
  −
Table 2.  Permutations or Substitutions in Sym_{A, B, C}
  −
o---------o---------o---------o---------o---------o---------o
  −
|        |        |        |        |        |        |
  −
|    e    |    f    |    g    |    h    |    i    |    j    |
  −
|        |        |        |        |        |        |
  −
o=========o=========o=========o=========o=========o=========o
  −
|        |        |        |        |        |        |
  −
|  A B C  |  A B C  |  A B C  |  A B C  |  A B C  |  A B C  |
  −
|        |        |        |        |        |        |
  −
|  | | |  |  | | |  |  | | |  |  | | |  |  | | |  |  | | |  |
  −
|  v v v  |  v v v  |  v v v  |  v v v  |  v v v  |  v v v  |
  −
|        |        |        |        |        |        |
  −
|  A B C  |  C A B  |  B C A  |  A C B  |  C B A  |  B A C  |
  −
|        |        |        |        |        |        |
  −
o---------o---------o---------o---------o---------o---------o
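
Reading each column of Table 2 from top to bottom, so that the top letter
goes to the letter beneath it, we can transcribe the six substitutions into
a quick Python sketch -- names mine -- and verify the advertised failure of
commutativity by composing a pair of them in both orders:

  # Sketch:  the six substitutions of Table 2 as Python dictionaries,
  # plus a composition check showing that Sym {A, B, C} is non-abelian.
  X = ["A", "B", "C"]

  perm = {
      "e": {"A": "A", "B": "B", "C": "C"},
      "f": {"A": "C", "B": "A", "C": "B"},
      "g": {"A": "B", "B": "C", "C": "A"},
      "h": {"A": "A", "B": "C", "C": "B"},
      "i": {"A": "C", "B": "B", "C": "A"},
      "j": {"A": "B", "B": "A", "C": "C"},
  }

  def compose(p, q):
      """Apply p first, then q."""
      return {x: q[p[x]] for x in X}

  print(compose(perm["f"], perm["h"]) == compose(perm["h"], perm["f"]))  # False
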
  −
 
  −
Here is the operation table for S_3, given in abstract fashion:
  −
 
  −
Table 3.  Symmetric Group S_3
  −
 
  −
|                        _
  −
|                    e / \ e
  −
|                      /  \
  −
|                    /  e  \
  −
|                  f / \  / \ f
  −
|                  /  \ /  \
  −
|                  /  f  \  f  \
  −
|              g / \  / \  / \ g
  −
|                /  \ /  \ /  \
  −
|              /  g  \  g  \  g  \
  −
|            h / \  / \  / \  / \ h
  −
|            /  \ /  \ /  \ /  \
  −
|            /  h  \  e  \  e  \  h  \
  −
|        i / \  / \  / \  / \  / \ i
  −
|          /  \ /  \ /  \ /  \ /  \
  −
|        /  i  \  i  \  f  \  j  \  i  \
  −
|      j / \  / \  / \  / \  / \  / \ j
  −
|      /  \ /  \ /  \ /  \ /  \ /  \
  −
|      (  j  \  j  \  j  \  i  \  h  \  j  )
  −
|      \  / \  / \  / \  / \  / \  /
  −
|        \ /  \ /  \ /  \ /  \ /  \ /
  −
|        \  h  \  h  \  e  \  j  \  i  /
  −
|          \  / \  / \  / \  / \  /
  −
|          \ /  \ /  \ /  \ /  \ /
  −
|            \  i  \  g  \  f  \  h  /
  −
|            \  / \  / \  / \  /
  −
|              \ /  \ /  \ /  \ /
  −
|              \  f  \  e  \  g  /
  −
|                \  / \  / \  /
  −
|                \ /  \ /  \ /
  −
|                  \  g  \  f  /
  −
|                  \  / \  /
  −
|                    \ /  \ /
  −
|                    \  e  /
  −
|                      \  /
  −
|                      \ /
  −
|                        ¯
  −
 
  −
By the way, we will meet with the symmetric group S_3 again
  −
when we return to take up the study of Peirce's early paper
  −
"On a Class of Multiple Algebras" (CP 3.324-327), and also
  −
his late unpublished work "The Simplest Mathematics" (1902)
  −
(CP 4.227-323), with particular reference to the section
  −
that treats of "Trichotomic Mathematics" (CP 4.307-323).
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Work Area
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Note 20
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
| Consider what effects that might 'conceivably'
  −
| have practical bearings you 'conceive' the
  −
| objects of your 'conception' to have.  Then,
  −
| your 'conception' of those effects is the
  −
| whole of your 'conception' of the object.
  −
|
  −
| Charles Sanders Peirce,
  −
| "Maxim of Pragmaticism", CP 5.438.
  −
 
  −
By way of collecting a short-term pay-off for all the work --
  −
not to mention the peirce-spiration -- that we sweated out
  −
over the regular representations of V_4 and S_3
  −
 
  −
Table 2.  Permutations or Substitutions in Sym_{A, B, C}
  −
o---------o---------o---------o---------o---------o---------o
  −
|        |        |        |        |        |        |
  −
|    e    |    f    |    g    |    h    |    i    |    j    |
  −
|        |        |        |        |        |        |
  −
o=========o=========o=========o=========o=========o=========o
  −
|        |        |        |        |        |        |
  −
|  A B C  |  A B C  |  A B C  |  A B C  |  A B C  |  A B C  |
  −
|        |        |        |        |        |        |
  −
|  | | |  |  | | |  |  | | |  |  | | |  |  | | |  |  | | |  |
  −
|  v v v  |  v v v  |  v v v  |  v v v  |  v v v  |  v v v  |
  −
|        |        |        |        |        |        |
  −
|  A B C  |  C A B  |  B C A  |  A C B  |  C B A  |  B A C  |
  −
|        |        |        |        |        |        |
  −
o---------o---------o---------o---------o---------o---------o
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Note 21
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
| Consider what effects that might 'conceivably'
  −
| have practical bearings you 'conceive' the
  −
| objects of your 'conception' to have.  Then,
  −
| your 'conception' of those effects is the
  −
| whole of your 'conception' of the object.
  −
|
  −
| Charles Sanders Peirce,
  −
| "Maxim of Pragmaticism", CP 5.438.
  −
 
  −
problem about writing
  −
 
  −
  e  =  e:e  +  f:f  +  g:g  +  h:h
  −
 
  −
no recursion intended
  −
need for a work-around
  −
ways of explaining it away
  −
 
  −
action on signs not objects
  −
 
  −
math def of rep
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Zeroth Order Logic
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Here is a scaled-down version of one of my very first applications,
  −
having to do with the demographic variables in a survey data base.
  −
 
  −
This Example illustrates the use of 2-variate logical forms
  −
for expressing and reasoning about the logical constraints
  −
that are involved in the following types of situations:
  −
 
  −
1.  Distinction:    A =/= B
  −
    Also known as:  logical inequality, exclusive disjunction
  −
    Represented as:  ( A , B )
  −
    Graphed as:
  −
    |
  −
    |  A  B
  −
    |  o---o
  −
    |    \ /
  −
    |    @
  −
 
  −
2.  Equality:        A = B
  −
    Also known as:  logical equivalence, if and only if, A <=> B
  −
    Represented as:  (( A , B ))
  −
    Graphed as:
  −
    |
  −
    |  A  B
  −
    |  o---o
  −
    |    \ /
  −
    |    o
  −
    |    |
  −
    |    @
  −
 
  −
3.  Implication:    A => B
  −
    Also known as:  entailment, if-then
  −
    Represented as:  ( A ( B ))
  −
    Graphed as:
  −
    |
  −
    |  A  B
  −
    |  o---o
  −
    |  |
  −
    |  @
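
All three of these forms come out of one and the same connective once we
adopt the reading -- the one that squares with the partition expressions
described further on in these notes -- that a comma-separated form
"( x_1 , ..., x_k )" is true exactly when exactly one of its arguments is
false.  Here is a tiny Python sketch, names mine, that checks the three
readings over all of %B% x %B%:

  # Sketch:  the one connective behind all three forms, under the reading
  # "( x_1 , ..., x_k )  is true iff exactly one of the x_j is false".
  from itertools import product

  def lobe(*args):
      """True iff exactly one argument is false."""
      return sum(not a for a in args) == 1

  for A, B in product([False, True], repeat=2):
      distinction = lobe(A, B)             # ( A , B )
      equality    = lobe(lobe(A, B))       # (( A , B ))
      implication = lobe(A and lobe(B))    # ( A ( B ))
      assert distinction == (A != B)
      assert equality    == (A == B)
      assert implication == ((not A) or B)

  print("all three readings agree with the usual connectives on B x B")
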
  −
 
  −
Example of a proposition expressing a "zeroth order theory" (ZOT):
  −
 
  −
Consider the following text, written in what I am calling "Ref Log",
  −
also known as the "Cactus Language" syntax for propositional logic:
  −
 
  −
|  ( male  , female )
  −
|  (( boy  , male child ))
  −
|  (( girl , female child ))
  −
|  ( child ( human ))
  −
 
  −
Graphed as:
  −
 
  −
|                  boy  male    girl  female
  −
|                    o---o child    o---o child
  −
|  male  female      \ /            \ /          child  human
  −
|    o---o            o              o              o---o
  −
|      \ /              |              |              |
  −
|      @              @              @              @
  −
 
  −
Nota Bene.  Due to graphic constraints -- no, the other
  −
kind of graphic constraints -- of the immediate medium,
  −
I am forced to string out the logical conjuncts of the
  −
actual cactus graph for this situation, one that might
  −
sufficiently be reasoned out from the exhibit supra by
  −
fusing together the four roots of the severed cactus.
  −
 
  −
Either of these expressions, text or graph, is equivalent to
  −
what would otherwise be written in a more ordinary syntax as:
  −
 
  −
|  male  =/=  female
  −
|  boy  <=>  male child
  −
|  girl  <=>  female child
  −
|  child  =>  human
  −
 
  −
This is actually a single proposition, a conjunction of four lines:
  −
one distinction, two equations, and one implication.  Together these
  −
amount to a set of definitions conjointly constraining the logical
  −
compatibility of the six feature names that appear.  They may be
  −
thought of as sculpting out a space of models that is some subset
  −
of the 2^6 = 64 possible interpretations, and thereby shaping some
  −
universe of discourse.
  −
 
  −
Once this backdrop is defined, it is possible to "query" this universe,
  −
simply by conjoining additional propositions in further constraint of
  −
the underlying set of models.  This has many uses, as we shall see.
  −
 
  −
We are considering an Example of a propositional expression
  −
that is formed on the following "alphabet" or "lexicon" of
  −
six "logical features" or "boolean variables":
  −
 
  −
$A$  =  {"boy", "child", "female", "girl", "human", "male"}.
  −
 
  −
The expression is this:
  −
 
  −
|  ( male  , female )
  −
|  (( boy  , male child ))
  −
|  (( girl , female child ))
  −
|  ( child ( human ))
  −
 
  −
Putting it very roughly -- and putting off a better description
  −
of it till later -- we may think of this expression as notation
  −
for a boolean function f : %B%^6 -> %B%.  This is what we might
  −
call the "abstract type" of the function, but we will also find
  −
it convenient on many occasions to represent the points of this
  −
particular copy of the space %B%^6 in terms of the positive and
  −
negative versions of the features from $A$ that serve to encase
  −
them as logical "cells", as they are called in the venn diagram
  −
picture of the corresponding universe of discourse X = [$A$].
  −
 
  −
Just for concreteness, this form of representation begins and ends:
  −
 
  −
<0,0,0,0,0,0>  =  (boy)(child)(female)(girl)(human)(male),
  −
<0,0,0,0,0,1>  =  (boy)(child)(female)(girl)(human) male ,
  −
<0,0,0,0,1,0>  =  (boy)(child)(female)(girl) human (male),
  −
<0,0,0,0,1,1>  =  (boy)(child)(female)(girl) human  male ,
  −
...
  −
<1,1,1,1,0,0>  =  boy  child  female  girl (human)(male),
  −
<1,1,1,1,0,1>  =  boy  child  female  girl (human) male ,
  −
<1,1,1,1,1,0>  =  boy  child  female  girl  human (male),
  −
<1,1,1,1,1,1>  =  boy  child  female  girl  human  male .
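
As a brute-force check on this way of sculpting the universe, here is a
little Python sketch -- names mine, and nothing to do with the actual
machinery of Theme One -- that runs through all 64 cells of %B%^6 and keeps
the ones compatible with the four definitions.  It lists the surviving cells
one at a time, where the program output exhibited later on compresses them
into a handful of maximal partial paths:

  # Sketch:  enumerate the 2^6 = 64 interpretations of the six features
  # and keep the ones that satisfy the four defining constraints.
  from itertools import product

  features = ["boy", "child", "female", "girl", "human", "male"]

  def satisfies(v):
      return ((v["male"] != v["female"]) and                    # ( male , female )
              (v["boy"]  == (v["male"]   and v["child"])) and   # (( boy , male child ))
              (v["girl"] == (v["female"] and v["child"])) and   # (( girl , female child ))
              ((not v["child"]) or v["human"]))                 # ( child ( human ))

  for bits in product([False, True], repeat=len(features)):
      v = dict(zip(features, bits))
      if satisfies(v):
          print(" ".join(f if v[f] else "(" + f + ")" for f in features))
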
  −
 
  −
I continue with the previous Example, that I bring forward and sum up here:
  −
 
  −
|                  boy  male          girl  female
  −
|                    o---o child          o---o child
  −
|  male  female      \ /                  \ /              child  human
  −
|    o---o            o                    o                    o--o
  −
|      \ /            |                    |                    |
  −
|      @              @                    @                    @
  −
|
  −
| (male , female)((boy , male child))((girl , female child))(child (human))
  −
 
  −
For my master's piece in Quantitative Psychology (Michigan State, 1989),
  −
I wrote a program, "Theme One" (TO) by name, that among its other duties
  −
operates to process the expressions of the cactus language in many of the
  −
most pressing ways that we need in order to be able to use it effectively
  −
as a propositional calculus.  The operational component of TO where one
  −
does the work of this logical modeling is called "Study", and the core
  −
of the logical calculator deep in the heart of this Study section is
  −
a suite of computational functions that evolve a particular species
  −
of "normal form", analogous to a "disjunctive normal form" (DNF),
  −
from whatever expression they are handed as their input.
  −
 
  −
This "canonical", "normal", or "stable" form of logical expression --
  −
I'll refine the distinctions among these subforms all in good time --
  −
permits succinct depiction as an "arboreal boolean expansion" (ABE).
  −
 
  −
Once again, the graphic limitations of this space prevail against
  −
any disposition that I might have to lay out a really substantial
  −
case before you, of the brand that might have a chance to impress
  −
you with the aptitude of this ilk of ABE in rooting out the truth
  −
of many a complexly obscurely subtly adamant whetstone of our wit.
  −
 
  −
So let me just illustrate the way of it with one conjunct of our Example.
  −
What follows will be a sequence of expressions, each one after the first
  −
being logically equal to the one that precedes it:
  −
 
  −
Step 1
  −
 
  −
|    g    fc
  −
|    o---o
  −
|      \ /
  −
|      o
  −
|      |
  −
|      @
  −
 
  −
Step 2
  −
 
  −
|                  o
  −
|        fc        |  fc
  −
|    o---o        o---o
  −
|      \ /          \ /
  −
|      o            o
  −
|      |            |
  −
|    g o-------------o--o g
  −
|        \          /
  −
|        \        /
  −
|          \      /
  −
|          \    /
  −
|            \  /
  −
|            \ /
  −
|              @
  −
 
  −
Step 3
  −
 
  −
|      f c
  −
|      o
  −
|      |            f c
  −
|      o            o
  −
|      |            |
  −
|    g o-------------o--o g
  −
|        \          /
  −
|        \        /
  −
|          \      /
  −
|          \    /
  −
|            \  /
  −
|            \ /
  −
|              @
  −
 
  −
Step 4
  −
 
  −
|          o
  −
|          |
  −
|    c o  o c          o
  −
|      |  |            |
  −
|      o  o      c o  o c
  −
|      |  |        |  |
  −
|    f o---o--o f  f o---o--o f
  −
|        \ /          \ /
  −
|      g o-------------o--o g
  −
|          \          /
  −
|          \        /
  −
|            \      /
  −
|            \    /
  −
|              \  /
  −
|              \ /
  −
|                @
  −
 
  −
Step 5
  −
 
  −
|          o      c o
  −
|      c  |        |
  −
|    f o---o--o f  f o---o--o f
  −
|        \ /          \ /
  −
|      g o-------------o--o g
  −
|          \          /
  −
|          \        /
  −
|            \      /
  −
|            \    /
  −
|              \  /
  −
|              \ /
  −
|                @
  −
 
  −
Step 6
  −
 
  −
|                                      o
  −
|                                      |
  −
|          o                      o  o
  −
|          |                      |  |
  −
|    c o---o--o c      o        c o---o--o c
  −
|        \ /            |            \ /
  −
|      f o-------------o--o f      f o-------------o--o f
  −
|          \          /              \          /
  −
|          \        /                \        /
  −
|            \      /                  \      /
  −
|            \    /                    \    /
  −
|              \  /                      \  /
  −
|              \ /                        \ /
  −
|              g o---------------------------o--o g
  −
|                \                        /
  −
|                  \                      /
  −
|                  \                    /
  −
|                    \                  /
  −
|                    \                /
  −
|                      \              /
  −
|                      \            /
  −
|                        \          /
  −
|                        \        /
  −
|                          \      /
  −
|                          \    /
  −
|                            \  /
  −
|                            \ /
  −
|                              @
  −
 
  −
Step 7
  −
 
  −
|          o                      o
  −
|          |                      |
  −
|    c o---o--o c      o        c o---o--o c
  −
|        \ /            |            \ /
  −
|      f o-------------o--o f      f o-------------o--o f
  −
|          \          /              \          /
  −
|          \        /                \        /
  −
|            \      /                  \      /
  −
|            \    /                    \    /
  −
|              \  /                      \  /
  −
|              \ /                        \ /
  −
|              g o---------------------------o--o g
  −
|                \                        /
  −
|                  \                      /
  −
|                  \                    /
  −
|                    \                  /
  −
|                    \                /
  −
|                      \              /
  −
|                      \            /
  −
|                        \          /
  −
|                        \        /
  −
|                          \      /
  −
|                          \    /
  −
|                            \  /
  −
|                            \ /
  −
|                              @
  −
 
  −
This last expression is the ABE of the input expression.
  −
It can be transcribed into ordinary logical language as:
  −
 
  −
| either girl and
  −
|        either female and
  −
|              either child and true
  −
|              or not child and false
  −
|        or not female and false
  −
| or not girl and
  −
|        either female and
  −
|              either child and false
  −
|              or not child and true
  −
|        or not female and true
  −
 
  −
The expression "((girl , female child))" is sufficiently evaluated
  −
by considering its logical values on the coordinate tuples of %B%^3,
  −
or its indications on the cells of the associated venn diagram that
  −
depicts the universe of discourse, namely, on these eight arguments:
  −
     
  −
<1, 1, 1>  =  girl  female  child ,
  −
<1, 1, 0>  =  girl  female (child),
  −
<1, 0, 1>  =  girl (female) child ,
  −
<1, 0, 0>  =  girl (female)(child),
  −
<0, 1, 1>  =  (girl) female  child ,
  −
<0, 1, 0>  =  (girl) female (child),
  −
<0, 0, 1>  =  (girl)(female) child ,
  −
<0, 0, 0>  =  (girl)(female)(child).
  −
 
  −
The ABE output expression tells us the logical values of
  −
the input expression on each of these arguments, doing so
  −
by attaching the values to the leaves of a tree, and acting
  −
as an "efficient" or "lazy" evaluator in the sense that the
  −
process that generates the tree follows each path only up to
  −
the point in the tree where it can determine the values on the
  −
entire subtree beyond that point.  Thus, the ABE tree tells us:
  −
 
  −
girl  female  child  -> 1
  −
girl  female (child)  -> 0
  −
girl (female) -> 0
  −
(girl) female  child  -> 0
  −
(girl) female (child)  -> 1
  −
(girl)(female) -> 1
  −
 
  −
Picking out the interpretations that yield the truth of the expression,
  −
and expanding the corresponding partial argument tuples, we arrive at
  −
the following interpretations that satisfy the input expression:
  −
 
  −
girl  female  child  -> 1
  −
(girl) female (child)  -> 1
  −
(girl)(female) child  -> 1
  −
(girl)(female)(child)  -> 1
  −
 
  −
In sum, if it's a female and a child, then it's a girl,
  −
and if it's either not a female or not a child or both,
  −
then it's not a girl.
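
For whatever it is worth as a present-day gloss, here is a brute-force
stand-in for that lazy evaluation, sketched in Python with names of my own
devising:  branch on the variables in order, but emit a leaf the moment the
value of the expression is settled no matter how the remaining variables
fall out.  On the conjunct "((girl , female child))" it reproduces the
little table above.  (For a toy case the test for being settled is done by
sheer exhaustion, which is of course not how Theme One actually earns its
keep.)

  # Sketch:  a brute-force stand-in for the arboreal boolean expansion.
  # Branch on the variables in order, but emit a leaf as soon as the
  # value of the expression no longer depends on the unassigned ones.
  from itertools import product

  def expand(prop, order, partial=None):
      partial = dict(partial or {})
      free = [v for v in order if v not in partial]
      values = {prop({**partial, **dict(zip(free, bits))})
                for bits in product([False, True], repeat=len(free))}
      if len(values) == 1:               # settled:  record the leaf
          leaf = " ".join(v if partial[v] else "(" + v + ")"
                          for v in order if v in partial)
          print(f"{leaf}  -> {int(values.pop())}")
          return
      v = free[0]                        # otherwise branch on the next variable
      expand(prop, order, {**partial, v: True})
      expand(prop, order, {**partial, v: False})

  # (( girl , female child ))  is  girl <=> (female and child)
  expand(lambda a: a["girl"] == (a["female"] and a["child"]),
         ["girl", "female", "child"])
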
  −
 
  −
Brief Automata
  −
 
  −
By way of providing a simple illustration of Cook's Theorem,
  −
that "Propositional Satisfiability is NP-Complete", here is
  −
an exposition of one way to translate Turing Machine set-ups
  −
into propositional expressions, employing the Ref Log Syntax
  −
for Prop Calc that I described in a couple of earlier notes:
  −
 
  −
Notation:
  −
 
  −
Stilt(k)  =  Space and Time Limited Turing Machine,
  −
            with k units of space and k units of time.
  −
 
  −
Stunt(k)  =  Space and Time Limited Turing Machine,
  −
            for computing the parity of a bit string,
  −
            with Number of Tape cells of input equal to k.
  −
 
  −
I will follow the pattern of the discussion in the book of
  −
Herbert Wilf, 'Algorithms & Complexity' (1986), pages 188-201,
  −
but translate into Ref Log, which is more efficient with respect
  −
to the number of propositional clauses that are required.
  −
 
  −
Parity Machine
  −
 
  −
|                    1/1/+1
  −
|                  ------->
  −
|              /\ /        \ /\
  −
|      0/0/+1  ^  0          1  ^  0/0/+1
  −
|              \/|\        /|\/
  −
|                | <------- |
  −
|        #/#/-1  |  1/1/+1  |  #/#/-1
  −
|                |          |
  −
|                v          v
  −
|                #          *
  −
 
  −
o-------o--------o-------------o---------o------------o
| State | Symbol | Next Symbol | Ratchet | Next State |
|   Q   |   S    |     S'      |   dR    |     Q'     |
o-------o--------o-------------o---------o------------o
|   0   |   0    |      0      |   +1    |     0      |
|   0   |   1    |      1      |   +1    |     1      |
|   0   |   #    |      #      |   -1    |     #      |
|   1   |   0    |      0      |   +1    |     1      |
|   1   |   1    |      1      |   +1    |     0      |
|   1   |   #    |      #      |   -1    |     *      |
o-------o--------o-------------o---------o------------o
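
Before grinding the machine down into propositions, it may help to run the
table directly.  Here is a small Python sketch, names mine, that simulates
the parity machine as given and so provides a cross-check on the
propositional rendition worked out below.  (The halting state encodes the
answer, q_# for even and q_* for odd; for a single significant cell the
symbol left under the head happens to coincide with it.)

  # Sketch:  run the parity machine straight from its table.
  # (state, symbol read) -> (symbol written, head move, next state)
  program = {
      ("0", "0"): ("0", +1, "0"),
      ("0", "1"): ("1", +1, "1"),
      ("0", "#"): ("#", -1, "#"),
      ("1", "0"): ("0", +1, "1"),
      ("1", "1"): ("1", +1, "0"),
      ("1", "#"): ("#", -1, "*"),
  }

  def run(bits):
      tape = ["#"] + list(bits) + ["#"]
      head, state = 1, "0"
      while state not in ("#", "*"):
          sym, move, state = program[(state, tape[head])]
          tape[head] = sym
          head += move
      return state, tape[head]     # halting state, symbol under the head

  print(run("0"))    # ('#', '0')
  print(run("1"))    # ('*', '1')
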
  −
 
  −
The TM has a "finite automaton" (FA) as its component.
  −
Let us refer to this particular FA by the name of "M".
  −
 
  −
The "tape-head" (that is, the "read-unit") will be called "H".
  −
The "registers" are also called "tape-cells" or "tape-squares".
  −
 
  −
In order to consider how the finitely "stilted" rendition of this TM
  −
can be translated into the form of a purely propositional description,
  −
one now fixes k and limits the discussion to talking about a Stilt(k),
  −
which is really not a true TM anymore but a finite automaton in disguise.
  −
 
  −
In this example, for the sake of a minimal illustration, we choose k = 2,
  −
and discuss Stunt(2).  Since the zeroth tape cell and the last tape cell
  −
are occupied with bof and eof marks "#", this amounts to only one digit
  −
of significant computation.
  −
 
  −
To translate Stunt(2) into propositional form we use
  −
the following collection of propositional variables:
  −
 
  −
For the "Present State Function" QF : P -> Q,
  −
 
  −
{p0_q#, p0_q*, p0_q0, p0_q1,
  −
p1_q#, p1_q*, p1_q0, p1_q1,
  −
p2_q#, p2_q*, p2_q0, p2_q1,
  −
p3_q#, p3_q*, p3_q0, p3_q1}
  −
 
  −
The propositional expression of the form "pi_qj" says:
  −
 
  −
| At the point-in-time p_i,
  −
| the finite machine M is in the state q_j.
  −
 
  −
For the "Present Register Function" RF : P -> R,
  −
 
  −
{p0_r0, p0_r1, p0_r2, p0_r3,
  −
p1_r0, p1_r1, p1_r2, p1_r3,
  −
p2_r0, p2_r1, p2_r2, p2_r3,
  −
p3_r0, p3_r1, p3_r2, p3_r3}
  −
 
  −
The propositional expression of the form "pi_rj" says:
  −
 
  −
| At the point-in-time p_i,
  −
| the tape-head H is on the tape-cell r_j.
  −
 
  −
For the "Present Symbol Function" SF : P -> (R -> S),
  −
 
  −
{p0_r0_s#, p0_r0_s*, p0_r0_s0, p0_r0_s1,
  −
p0_r1_s#, p0_r1_s*, p0_r1_s0, p0_r1_s1,
  −
p0_r2_s#, p0_r2_s*, p0_r2_s0, p0_r2_s1,
  −
p0_r3_s#, p0_r3_s*, p0_r3_s0, p0_r3_s1,
  −
p1_r0_s#, p1_r0_s*, p1_r0_s0, p1_r0_s1,
  −
p1_r1_s#, p1_r1_s*, p1_r1_s0, p1_r1_s1,
  −
p1_r2_s#, p1_r2_s*, p1_r2_s0, p1_r2_s1,
  −
p1_r3_s#, p1_r3_s*, p1_r3_s0, p1_r3_s1,
  −
p2_r0_s#, p2_r0_s*, p2_r0_s0, p2_r0_s1,
  −
p2_r1_s#, p2_r1_s*, p2_r1_s0, p2_r1_s1,
  −
p2_r2_s#, p2_r2_s*, p2_r2_s0, p2_r2_s1,
  −
p2_r3_s#, p2_r3_s*, p2_r3_s0, p2_r3_s1,
  −
p3_r0_s#, p3_r0_s*, p3_r0_s0, p3_r0_s1,
  −
p3_r1_s#, p3_r1_s*, p3_r1_s0, p3_r1_s1,
  −
p3_r2_s#, p3_r2_s*, p3_r2_s0, p3_r2_s1,
  −
p3_r3_s#, p3_r3_s*, p3_r3_s0, p3_r3_s1}
  −
 
  −
The propositional expression of the form "pi_rj_sk" says:
  −
 
  −
| At the point-in-time p_i,
  −
| the tape-cell r_j bears the mark s_k.
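
Since the three families of variables are generated by a completely regular
scheme, they can be spun out mechanically.  A few lines of Python, names mine:

  # Sketch:  spin out the propositional variables for the set-up above
  # (points p0..p3, cells r0..r3, states and symbols drawn from {#, *, 0, 1}).
  Q = [f"p{i}_q{q}" for i in range(4) for q in "#*01"]
  R = [f"p{i}_r{j}" for i in range(4) for j in range(4)]
  S = [f"p{i}_r{j}_s{k}" for i in range(4) for j in range(4) for k in "#*01"]
  print(len(Q), len(R), len(S))    # 16 16 64
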
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~INPUTS~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Here are the Initial Conditions
  −
for the two possible inputs to the
  −
Ref Log redaction of this Parity TM:
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~INPUT~0~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Initial Conditions:
  −
 
  −
p0_q0
  −
 
  −
p0_r1
  −
 
  −
p0_r0_s#
  −
p0_r1_s0
  −
p0_r2_s#
  −
 
  −
The Initial Conditions are given by a logical conjunction
  −
that is composed of 5 basic expressions, altogether stating:
  −
 
  −
| At the point-in-time p_0, M is in the state q_0, and
  −
| At the point-in-time p_0, H is on the cell  r_1, and
  −
| At the point-in-time p_0, cell r_0 bears the mark "#", and
  −
| At the point-in-time p_0, cell r_1 bears the mark "0", and
  −
| At the point-in-time p_0, cell r_2 bears the mark "#".
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~INPUT~1~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Initial Conditions:
  −
 
  −
p0_q0
  −
 
  −
p0_r1
  −
 
  −
p0_r0_s#
  −
p0_r1_s1
  −
p0_r2_s#
  −
 
  −
The Initial Conditions are given by a logical conjunction
  −
that is composed of 5 basic expressions, altogether stating:
  −
 
  −
| At the point-in-time p_0, M is in the state q_0, and
  −
| At the point-in-time p_0, H is on the cell  r_1, and
  −
| At the point-in-time p_0, cell r_0 bears the mark "#", and
  −
| At the point-in-time p_0, cell r_1 bears the mark "1", and
  −
| At the point-in-time p_0, cell r_2 bears the mark "#".
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~PROGRAM~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
And here, yet again, just to store it nearby,
  −
is the logical rendition of the TM's program:
  −
 
  −
Mediate Conditions:
  −
 
  −
( p0_q#  ( p1_q# ))
  −
( p0_q*  ( p1_q* ))
  −
 
  −
( p1_q#  ( p2_q# ))
  −
( p1_q*  ( p2_q* ))
  −
 
  −
Terminal Conditions:
  −
 
  −
(( p2_q# )( p2_q* ))
  −
 
  −
State Partition:
  −
 
  −
(( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
  −
(( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
  −
(( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))
  −
 
  −
Register Partition:
  −
 
  −
(( p0_r0 ),( p0_r1 ),( p0_r2 ))
  −
(( p1_r0 ),( p1_r1 ),( p1_r2 ))
  −
(( p2_r0 ),( p2_r1 ),( p2_r2 ))
  −
 
  −
Symbol Partition:
  −
 
  −
(( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
  −
(( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
  −
(( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))
  −
 
  −
(( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
  −
(( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
  −
(( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))
  −
 
  −
(( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
  −
(( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
  −
(( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))
  −
 
  −
Interaction Conditions:
  −
 
  −
(( p0_r0 ) p0_r0_s0 ( p1_r0_s0 ))
  −
(( p0_r0 ) p0_r0_s1 ( p1_r0_s1 ))
  −
(( p0_r0 ) p0_r0_s# ( p1_r0_s# ))
  −
 
  −
(( p0_r1 ) p0_r1_s0 ( p1_r1_s0 ))
  −
(( p0_r1 ) p0_r1_s1 ( p1_r1_s1 ))
  −
(( p0_r1 ) p0_r1_s# ( p1_r1_s# ))
  −
 
  −
(( p0_r2 ) p0_r2_s0 ( p1_r2_s0 ))
  −
(( p0_r2 ) p0_r2_s1 ( p1_r2_s1 ))
  −
(( p0_r2 ) p0_r2_s# ( p1_r2_s# ))
  −
 
  −
(( p1_r0 ) p1_r0_s0 ( p2_r0_s0 ))
  −
(( p1_r0 ) p1_r0_s1 ( p2_r0_s1 ))
  −
(( p1_r0 ) p1_r0_s# ( p2_r0_s# ))
  −
 
  −
(( p1_r1 ) p1_r1_s0 ( p2_r1_s0 ))
  −
(( p1_r1 ) p1_r1_s1 ( p2_r1_s1 ))
  −
(( p1_r1 ) p1_r1_s# ( p2_r1_s# ))
  −
 
  −
(( p1_r2 ) p1_r2_s0 ( p2_r2_s0 ))
  −
(( p1_r2 ) p1_r2_s1 ( p2_r2_s1 ))
  −
(( p1_r2 ) p1_r2_s# ( p2_r2_s# ))
  −
 
  −
Transition Relations:
  −
 
  −
( p0_q0  p0_r1  p0_r1_s0  ( p1_q0  p1_r2  p1_r1_s0 ))
  −
( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))
  −
( p0_q0  p0_r1  p0_r1_s#  ( p1_q#  p1_r0  p1_r1_s# ))
  −
( p0_q0  p0_r2  p0_r2_s#  ( p1_q#  p1_r1  p1_r2_s# ))
  −
 
  −
( p0_q1  p0_r1  p0_r1_s0  ( p1_q1  p1_r2  p1_r1_s0 ))
  −
( p0_q1  p0_r1  p0_r1_s1  ( p1_q0  p1_r2  p1_r1_s1 ))
  −
( p0_q1  p0_r1  p0_r1_s#  ( p1_q*  p1_r0  p1_r1_s# ))
  −
( p0_q1  p0_r2  p0_r2_s#  ( p1_q*  p1_r1  p1_r2_s# ))
  −
 
  −
( p1_q0  p1_r1  p1_r1_s0  ( p2_q0  p2_r2  p2_r1_s0 ))
  −
( p1_q0  p1_r1  p1_r1_s1  ( p2_q1  p2_r2  p2_r1_s1 ))
  −
( p1_q0  p1_r1  p1_r1_s#  ( p2_q#  p2_r0  p2_r1_s# ))
  −
( p1_q0  p1_r2  p1_r2_s#  ( p2_q#  p2_r1  p2_r2_s# ))
  −
 
  −
( p1_q1  p1_r1  p1_r1_s0  ( p2_q1  p2_r2  p2_r1_s0 ))
  −
( p1_q1  p1_r1  p1_r1_s1  ( p2_q0  p2_r2  p2_r1_s1 ))
  −
( p1_q1  p1_r1  p1_r1_s#  ( p2_q*  p2_r0  p2_r1_s# ))
  −
( p1_q1  p1_r2  p1_r2_s#  ( p2_q*  p2_r1  p2_r2_s# ))
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~INTERPRETATION~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Interpretation of the Propositional Program:
  −
 
  −
Mediate Conditions:
  −
 
  −
( p0_q#  ( p1_q# ))
  −
( p0_q*  ( p1_q* ))
  −
 
  −
( p1_q#  ( p2_q# ))
  −
( p1_q*  ( p2_q* ))
  −
 
  −
In Ref Log, an expression of the form "( X ( Y ))"
  −
expresses an implication or an if-then proposition:
  −
"Not X without Y",  "If X then Y",  "X => Y",  etc.
  −
 
  −
A text string expression of the form "( X ( Y ))"
  −
parses to a graphical data-structure of the form:
  −
 
  −
    X  Y
  −
    o---o
  −
    |
  −
    @
  −
 
  −
All together, these Mediate Conditions state:
  −
 
  −
| If at p_0  M is in state q_#, then at p_1  M is in state q_#, and
  −
| If at p_0  M is in state q_*, then at p_1  M is in state q_*, and
  −
| If at p_1  M is in state q_#, then at p_2  M is in state q_#, and
  −
| If at p_1  M is in state q_*, then at p_2  M is in state q_*.
  −
 
  −
Terminal Conditions:
  −
 
  −
(( p2_q# )( p2_q* ))
  −
 
  −
In Ref Log, an expression of the form "(( X )( Y ))"
  −
expresses a disjunction "X or Y" and it parses into:
  −
 
  −
    X  Y
  −
    o  o
  −
    \ /
  −
      o
  −
      |
  −
      @
  −
 
  −
In effect, the Terminal Conditions state:
  −
 
  −
| At p_2,  M is in state q_#, or
  −
| At p_2,  M is in state q_*.
  −
 
  −
State Partition:
  −
 
  −
(( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
  −
(( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
  −
(( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))
  −
 
  −
In Ref Log, an expression of the form "(( e_1 ),( e_2 ),( ... ),( e_k ))"
  −
expresses the fact that "exactly one of the e_j is true, for j = 1 to k".
  −
Expressions of this form are called "universal partition" expressions, and
  −
they parse into a type of graph called a "painted and rooted cactus" (PARC):
  −
 
  −
    e_1  e_2  ...  e_k
  −
    o    o          o
  −
    |    |          |
  −
    o-----o--- ... ---o
  −
      \              /
  −
      \            /
  −
        \          /
  −
        \        /
  −
          \      /
  −
          \    /
  −
            \  /
  −
            \ /
  −
              @
  −
 
  −
The State Partition expresses the conditions that:
  −
 
  −
| At each of the points-in-time p_i, for i = 0 to 2,
  −
| M can be in exactly one state q_j, for j in the set {0, 1, #, *}.
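
At this point all of the Ref Log forms in play -- juxtaposition for
conjunction, and "( x_1 , ..., x_k )" for "exactly one of the arguments is
false" -- are simple enough to parse and evaluate with a few lines of code.
Here is a rough Python sketch of such a reader, emphatically not the Theme
One parser, with all of the names my own:

  # Sketch:  parse and evaluate Ref Log text under the readings above.
  import re
  from itertools import product

  def tokenize(text):
      return re.findall(r"[(),]|[^\s(),]+", text)

  def parse(text):
      items, i = parse_items(tokenize(text), 0)
      return ("and", items)                  # the root is a conjunction

  def parse_items(tokens, i):
      items = []
      while i < len(tokens) and tokens[i] not in (")", ","):
          if tokens[i] == "(":
              node, i = parse_lobe(tokens, i + 1)
          else:
              node, i = ("var", tokens[i]), i + 1
          items.append(node)
      return items, i

  def parse_lobe(tokens, i):
      args = []
      while True:
          items, i = parse_items(tokens, i)
          args.append(("and", items))
          if tokens[i] == ",":
              i += 1
          else:                              # a ")" closes the lobe
              return ("lobe", args), i + 1

  def value(node, env):
      kind, body = node
      if kind == "var":
          return env[body]
      if kind == "and":
          return all(value(n, env) for n in body)
      return sum(not value(n, env) for n in body) == 1   # exactly one arg false

  # "( X ( Y ))" comes out as the implication "if X then Y":
  for X, Y in product([False, True], repeat=2):
      assert value(parse("( X ( Y ))"), {"X": X, "Y": Y}) == ((not X) or Y)

  # A partition form is true just when exactly one of its terms is true:
  env = {"p2_q0": False, "p2_q1": False, "p2_q#": True, "p2_q*": False}
  print(value(parse("(( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))"), env))   # True
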
  −
 
  −
Register Partition:
  −
 
  −
(( p0_r0 ),( p0_r1 ),( p0_r2 ))
  −
(( p1_r0 ),( p1_r1 ),( p1_r2 ))
  −
(( p2_r0 ),( p2_r1 ),( p2_r2 ))
  −
 
  −
The Register Partition expresses the conditions that:
  −
 
  −
| At each of the points-in-time p_i, for i = 0 to 2,
  −
| H can be on exactly one cell  r_j, for j = 0 to 2.
  −
 
  −
Symbol Partition:
  −
 
  −
(( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
  −
(( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
  −
(( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))
  −
 
  −
(( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
  −
(( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
  −
(( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))
  −
 
  −
(( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
  −
(( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
  −
(( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))
  −
 
  −
The Symbol Partition expresses the conditions that:
  −
 
  −
| At each of the points-in-time p_i, for i in {0, 1, 2},
  −
| in each of the tape-registers r_j, for j in {0, 1, 2},
  −
| there can be exactly one sign s_k, for k in {0, 1, #}.
  −
 
  −
Interaction Conditions:
  −
 
  −
(( p0_r0 ) p0_r0_s0 ( p1_r0_s0 ))
  −
(( p0_r0 ) p0_r0_s1 ( p1_r0_s1 ))
  −
(( p0_r0 ) p0_r0_s# ( p1_r0_s# ))
  −
 
  −
(( p0_r1 ) p0_r1_s0 ( p1_r1_s0 ))
  −
(( p0_r1 ) p0_r1_s1 ( p1_r1_s1 ))
  −
(( p0_r1 ) p0_r1_s# ( p1_r1_s# ))
  −
 
  −
(( p0_r2 ) p0_r2_s0 ( p1_r2_s0 ))
  −
(( p0_r2 ) p0_r2_s1 ( p1_r2_s1 ))
  −
(( p0_r2 ) p0_r2_s# ( p1_r2_s# ))
  −
 
  −
(( p1_r0 ) p1_r0_s0 ( p2_r0_s0 ))
  −
(( p1_r0 ) p1_r0_s1 ( p2_r0_s1 ))
  −
(( p1_r0 ) p1_r0_s# ( p2_r0_s# ))
  −
 
  −
(( p1_r1 ) p1_r1_s0 ( p2_r1_s0 ))
  −
(( p1_r1 ) p1_r1_s1 ( p2_r1_s1 ))
  −
(( p1_r1 ) p1_r1_s# ( p2_r1_s# ))
  −
 
  −
(( p1_r2 ) p1_r2_s0 ( p2_r2_s0 ))
  −
(( p1_r2 ) p1_r2_s1 ( p2_r2_s1 ))
  −
(( p1_r2 ) p1_r2_s# ( p2_r2_s# ))
  −
 
  −
In briefest terms, the Interaction Conditions merely express
  −
the circumstance that the sign in a tape-cell cannot change
  −
between two points-in-time unless the tape-head is over the
  −
cell in question at the initial one of those points-in-time.
  −
All that we have to do is to see how they manage to say this.
  −
 
  −
In Ref Log, an expression of the following form:
  −
 
  −
"(( p<i>_r<j> ) p<i>_r<j>_s<k> ( p<i+1>_r<j>_s<k> ))",
  −
 
  −
and which parses as the graph:
  −
 
  −
      p<i>_r<j> o  o  p<i+1>_r<j>_s<k>
  −
                  \ /
  −
    p<i>_r<j>_s<k> o
  −
                  |
  −
                  @
  −
 
  −
can be read in the form of the following implication:
  −
 
  −
| If
  −
| at the point-in-time p<i>, the tape-cell r<j> bears the mark s<k>,
  −
| but it is not the case that
  −
| at the point-in-time p<i>, the tape-head is on the tape-cell r<j>,
  −
| then
  −
| at the point-in-time p<i+1>, the tape-cell r<j> bears the mark s<k>.
  −
 
  −
Folks among us of a certain age and a peculiar manner of acculturation will
  −
recognize these as the "Frame Conditions" for the change of state of the TM.
  −
 
  −
Transition Relations:
  −
 
  −
( p0_q0  p0_r1  p0_r1_s0  ( p1_q0  p1_r2  p1_r1_s0 ))
  −
( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))
  −
( p0_q0  p0_r1  p0_r1_s#  ( p1_q#  p1_r0  p1_r1_s# ))
  −
( p0_q0  p0_r2  p0_r2_s#  ( p1_q#  p1_r1  p1_r2_s# ))
  −
 
  −
( p0_q1  p0_r1  p0_r1_s0  ( p1_q1  p1_r2  p1_r1_s0 ))
  −
( p0_q1  p0_r1  p0_r1_s1  ( p1_q0  p1_r2  p1_r1_s1 ))
  −
( p0_q1  p0_r1  p0_r1_s#  ( p1_q*  p1_r0  p1_r1_s# ))
  −
( p0_q1  p0_r2  p0_r2_s#  ( p1_q*  p1_r1  p1_r2_s# ))
  −
 
  −
( p1_q0  p1_r1  p1_r1_s0  ( p2_q0  p2_r2  p2_r1_s0 ))
  −
( p1_q0  p1_r1  p1_r1_s1  ( p2_q1  p2_r2  p2_r1_s1 ))
  −
( p1_q0  p1_r1  p1_r1_s#  ( p2_q#  p2_r0  p2_r1_s# ))
  −
( p1_q0  p1_r2  p1_r2_s#  ( p2_q#  p2_r1  p2_r2_s# ))
  −
 
  −
( p1_q1  p1_r1  p1_r1_s0  ( p2_q1  p2_r2  p2_r1_s0 ))
  −
( p1_q1  p1_r1  p1_r1_s1  ( p2_q0  p2_r2  p2_r1_s1 ))
  −
( p1_q1  p1_r1  p1_r1_s#  ( p2_q*  p2_r0  p2_r1_s# ))
  −
( p1_q1  p1_r2  p1_r2_s#  ( p2_q*  p2_r1  p2_r2_s# ))
  −
 
  −
The Transition Conditions merely serve to express,
  −
by means of 16 complex implication expressions,
  −
the data of the TM table that was given above.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~OUTPUTS~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
And here are the outputs of the computation,
  −
as emulated by its propositional rendition,
  −
and as actually generated within that form
  −
of transmogrification by the program that
  −
I wrote for finding all of the satisfying
  −
interpretations (truth-value assignments)
  −
of propositional expressions in Ref Log:
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~OUTPUT~0~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Output Conditions:
  −
 
  −
p0_q0
  −
  p0_r1
  −
  p0_r0_s#
  −
    p0_r1_s0
  −
    p0_r2_s#
  −
      p1_q0
  −
      p1_r2
  −
        p1_r2_s#
  −
        p1_r0_s#
  −
          p1_r1_s0
  −
          p2_q#
  −
            p2_r1
  −
            p2_r0_s#
  −
              p2_r1_s0
  −
              p2_r2_s#
  −
 
  −
The Output Conditions amount to the sole satisfying interpretation,
  −
that is, a "sequence of truth-value assignments" (SOTVA) that make
  −
the entire proposition come out true, and they state the following:
  −
 
  −
| At the point-in-time p_0, M is in the state q_0,      and
  −
| At the point-in-time p_0, H is on the cell  r_1,      and
  −
| At the point-in-time p_0, cell r_0 bears the mark "#", and
  −
| At the point-in-time p_0, cell r_1 bears the mark "0", and
  −
| At the point-in-time p_0, cell r_2 bears the mark "#", and
  −
|
  −
| At the point-in-time p_1, M is in the state q_0,      and
  −
| At the point-in-time p_1, H is on the cell  r_2,      and
  −
| At the point-in-time p_1, cell r_0 bears the mark "#", and
  −
| At the point-in-time p_1, cell r_1 bears the mark "0", and
  −
| At the point-in-time p_1, cell r_2 bears the mark "#", and
  −
|
  −
| At the point-in-time p_2, M is in the state q_#,      and
  −
| At the point-in-time p_2, H is on the cell  r_1,      and
  −
| At the point-in-time p_2, cell r_0 bears the mark "#", and
  −
| At the point-in-time p_2, cell r_1 bears the mark "0", and
  −
| At the point-in-time p_2, cell r_2 bears the mark "#".
  −
 
  −
In brief, the output for our sake being the symbol that rests
  −
under the tape-head H when the machine M gets to a rest state,
  −
we are now amazed by the remarkable result that Parity(0) = 0.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~OUTPUT~1~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Output Conditions:
  −
 
  −
p0_q0
  −
  p0_r1
  −
  p0_r0_s#
  −
    p0_r1_s1
  −
    p0_r2_s#
  −
      p1_q1
  −
      p1_r2
  −
        p1_r2_s#
  −
        p1_r0_s#
  −
          p1_r1_s1
  −
          p2_q*
  −
            p2_r1
  −
            p2_r0_s#
  −
              p2_r1_s1
  −
              p2_r2_s#
  −
 
  −
The Output Conditions amount to the sole satisfying interpretation,
  −
that is, a "sequence of truth-value assignments" (SOTVA) that make
  −
the entire proposition come out true, and they state the following:
  −
 
  −
| At the point-in-time p_0, M is in the state q_0,      and
  −
| At the point-in-time p_0, H is on the cell  r_1,      and
  −
| At the point-in-time p_0, cell r_0 bears the mark "#", and
  −
| At the point-in-time p_0, cell r_1 bears the mark "1", and
  −
| At the point-in-time p_0, cell r_2 bears the mark "#", and
  −
|
  −
| At the point-in-time p_1, M is in the state q_1,      and
  −
| At the point-in-time p_1, H is on the cell  r_2,      and
  −
| At the point-in-time p_1, cell r_0 bears the mark "#", and
  −
| At the point-in-time p_1, cell r_1 bears the mark "1", and
  −
| At the point-in-time p_1, cell r_2 bears the mark "#", and
  −
|
  −
| At the point-in-time p_2, M is in the state q_*,      and
  −
| At the point-in-time p_2, H is on the cell  r_1,      and
  −
| At the point-in-time p_2, cell r_0 bears the mark "#", and
  −
| At the point-in-time p_2, cell r_1 bears the mark "1", and
  −
| At the point-in-time p_2, cell r_2 bears the mark "#".
  −
 
  −
In brief, the output for our sake being the symbol that rests
  −
under the tape-head H when the machine M gets to a rest state,
  −
we are now amazed by the remarkable result that Parity(1) = 1.
  −
 
  −
I realized after sending that last bunch of bits that there is room
  −
for confusion about what is the input/output of the Study module of
  −
the Theme One program as opposed to what is the input/output of the
  −
"finitely approximated turing automaton" (FATA).  So here is better
  −
delineation of what's what.  The input to Study is a text file that
  −
is known as LogFile(Whatever) and the output of Study is a sequence
  −
of text files that summarize the various canonical and normal forms
  −
that it generates.  For short, let us call these NormFile(Whatelse).
  −
With that in mind, here are the actual IO's of Study, excluding the
  −
glosses in square brackets:
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~INPUT~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
[Input To Study = FATA Initial Conditions + FATA Program Conditions]
  −
 
  −
[FATA Initial Conditions For Input 0]
  −
 
  −
p0_q0
  −
 
  −
p0_r1
  −
 
  −
p0_r0_s#
  −
p0_r1_s0
  −
p0_r2_s#
  −
 
  −
[FATA Program Conditions For Parity Machine]
  −
 
  −
[Mediate Conditions]
  −
 
  −
( p0_q#  ( p1_q# ))
  −
( p0_q*  ( p1_q* ))
  −
 
  −
( p1_q#  ( p2_q# ))
  −
( p1_q*  ( p2_q* ))
  −
 
  −
[Terminal Conditions]
  −
 
  −
(( p2_q# )( p2_q* ))
  −
 
  −
[State Partition]
  −
 
  −
(( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
  −
(( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
  −
(( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))
  −
 
  −
[Register Partition]
  −
 
  −
(( p0_r0 ),( p0_r1 ),( p0_r2 ))
  −
(( p1_r0 ),( p1_r1 ),( p1_r2 ))
  −
(( p2_r0 ),( p2_r1 ),( p2_r2 ))
  −
 
  −
[Symbol Partition]
  −
 
  −
(( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
  −
(( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
  −
(( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))
  −
 
  −
(( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
  −
(( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
  −
(( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))
  −
 
  −
(( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
  −
(( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
  −
(( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))
  −
 
  −
[Interaction Conditions]
  −
 
  −
(( p0_r0 ) p0_r0_s0 ( p1_r0_s0 ))
  −
(( p0_r0 ) p0_r0_s1 ( p1_r0_s1 ))
  −
(( p0_r0 ) p0_r0_s# ( p1_r0_s# ))
  −
 
  −
(( p0_r1 ) p0_r1_s0 ( p1_r1_s0 ))
  −
(( p0_r1 ) p0_r1_s1 ( p1_r1_s1 ))
  −
(( p0_r1 ) p0_r1_s# ( p1_r1_s# ))
  −
 
  −
(( p0_r2 ) p0_r2_s0 ( p1_r2_s0 ))
  −
(( p0_r2 ) p0_r2_s1 ( p1_r2_s1 ))
  −
(( p0_r2 ) p0_r2_s# ( p1_r2_s# ))
  −
 
  −
(( p1_r0 ) p1_r0_s0 ( p2_r0_s0 ))
  −
(( p1_r0 ) p1_r0_s1 ( p2_r0_s1 ))
  −
(( p1_r0 ) p1_r0_s# ( p2_r0_s# ))
  −
 
  −
(( p1_r1 ) p1_r1_s0 ( p2_r1_s0 ))
  −
(( p1_r1 ) p1_r1_s1 ( p2_r1_s1 ))
  −
(( p1_r1 ) p1_r1_s# ( p2_r1_s# ))
  −
 
  −
(( p1_r2 ) p1_r2_s0 ( p2_r2_s0 ))
  −
(( p1_r2 ) p1_r2_s1 ( p2_r2_s1 ))
  −
(( p1_r2 ) p1_r2_s# ( p2_r2_s# ))
  −
 
  −
[Transition Relations]
  −
 
  −
( p0_q0  p0_r1  p0_r1_s0  ( p1_q0  p1_r2  p1_r1_s0 ))
  −
( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))
  −
( p0_q0  p0_r1  p0_r1_s#  ( p1_q#  p1_r0  p1_r1_s# ))
  −
( p0_q0  p0_r2  p0_r2_s#  ( p1_q#  p1_r1  p1_r2_s# ))
  −
 
  −
( p0_q1  p0_r1  p0_r1_s0  ( p1_q1  p1_r2  p1_r1_s0 ))
  −
( p0_q1  p0_r1  p0_r1_s1  ( p1_q0  p1_r2  p1_r1_s1 ))
  −
( p0_q1  p0_r1  p0_r1_s#  ( p1_q*  p1_r0  p1_r1_s# ))
  −
( p0_q1  p0_r2  p0_r2_s#  ( p1_q*  p1_r1  p1_r2_s# ))
  −
 
  −
( p1_q0  p1_r1  p1_r1_s0  ( p2_q0  p2_r2  p2_r1_s0 ))
  −
( p1_q0  p1_r1  p1_r1_s1  ( p2_q1  p2_r2  p2_r1_s1 ))
  −
( p1_q0  p1_r1  p1_r1_s#  ( p2_q#  p2_r0  p2_r1_s# ))
  −
( p1_q0  p1_r2  p1_r2_s#  ( p2_q#  p2_r1  p2_r2_s# ))
  −
 
  −
( p1_q1  p1_r1  p1_r1_s0  ( p2_q1  p2_r2  p2_r1_s0 ))
  −
( p1_q1  p1_r1  p1_r1_s1  ( p2_q0  p2_r2  p2_r1_s1 ))
  −
( p1_q1  p1_r1  p1_r1_s#  ( p2_q*  p2_r0  p2_r1_s# ))
  −
( p1_q1  p1_r2  p1_r2_s#  ( p2_q*  p2_r1  p2_r2_s# ))
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~OUTPUT~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
[Output Of Study = FATA Output For Input 0]
  −
 
  −
p0_q0
  −
  p0_r1
  −
  p0_r0_s#
  −
    p0_r1_s0
  −
    p0_r2_s#
  −
      p1_q0
  −
      p1_r2
  −
        p1_r2_s#
  −
        p1_r0_s#
  −
          p1_r1_s0
  −
          p2_q#
  −
            p2_r1
  −
            p2_r0_s#
  −
              p2_r1_s0
  −
              p2_r2_s#
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Turing automata, finitely approximated or not, make my head spin and
  −
my tape go loopy, and I still believe 'twere a far better thing I do
  −
if I work up to that level of complexity in a more gracile, graduated
  −
manner.  So let us return to our Example in this gradual progress to
  −
that vastly more well-guarded grail of our long-term pilgrim's quest:
  −
 
  −
|                  boy  male          girl  female
  −
|                    o---o child          o---o child
  −
|  male  female      \ /                  \ /              child  human
  −
|    o---o            o                    o                    o--o
  −
|      \ /            |                    |                    |
  −
|      @              @                    @                    @
  −
|
  −
| (male , female)((boy , male child))((girl , female child))(child (human))
  −
 
  −
One section of the Theme One program has a suite of utilities that fall
  −
under the title "Theme One Study" ("To Study", or just "TOS" for short).
  −
To Study is to read and to parse a so-called and a generally so-suffixed
  −
"log" file, and then to conjoin what is called a "query", which is really
  −
just an additional propositional expression that imposes a further logical
  −
constraint on the input expression.
  −
 
  −
The Figure roughly sketches the conjuncts of the graph-theoretic
  −
data structure that the parser would commit to memory on reading
  −
the appropriate log file that contains the text along the bottom.
  −
 
  −
I will now explain the various sorts of things that the TOS utility
  −
can do with the log file that describes the universe of discourse in
  −
our present Example.
  −
 
  −
Theme One Study is built around a suite of four successive generators
  −
of "normal forms" for propositional expressions, just to use that term
  −
in a very approximate way.  The functions that compute these normal forms
  −
are called "Model", "Tenor", "Canon", and "Sense", and so we may refer to
  −
their text-style outputs as the "mod", "ten", "can", and "sen" files.
  −
 
  −
Though it could be any propositional expression on the same vocabulary
  −
$A$ = {"boy", "child", "female", "girl", "human", "male"}, more usually
  −
the query is a simple conjunction of one or more positive features that
  −
we want to focus on or perhaps to filter out of the logical model space.
  −
On our first run through this Example, we take the log file proposition
  −
as it is, with no extra riders.
  −
 
  −
| Procedural Note.  TO Study Model displays a running tab of how much
  −
| free memory space it has left.  On some of the harder problems that
  −
| you may think of to give it, Model may run out of free memory and
  −
| terminate, abnormally exiting Theme One.  Sometimes it helps to:
  −
|
  −
| 1.  Rephrase the problem in logically equivalent
  −
|    but rhetorically more felicitous ways.
  −
|
  −
| 2.  Think of additional facts that are taken for granted but not
  −
|    made explicit and that cannot be logically inferred by Model.
  −
 
  −
After Model has finished, it is ready to write out its mod file,
  −
which you may choose to show on the screen or save to a named file.
  −
Mod files are usually too long to see (or to care to see) all at once
  −
on the screen, so it is very often best to save them for later replay.
  −
In our Example the Model function yields a mod file that looks like so:
  −
 
  −
Model Output and
  −
Mod File Example
  −
o-------------------o
  −
| male              |
  −
|  female -        |  1
  −
|  (female )        |
  −
|  girl -          |  2
  −
|  (girl )        |
  −
|    child          |
  −
|    boy          |
  −
|      human *      |  3 *
  −
|      (human ) -  |  4
  −
|    (boy ) -      |  5
  −
|    (child )      |
  −
|    boy -        |  6
  −
|    (boy ) *      |  7 *
  −
| (male )          |
  −
|  female          |
  −
|  boy -          |  8
  −
|  (boy )          |
  −
|    child          |
  −
|    girl          |
  −
|      human *      |  9 *
  −
|      (human ) -  | 10
  −
|    (girl ) -    | 11
  −
|    (child )      |
  −
|    girl -        | 12
  −
|    (girl ) *    | 13 *
  −
|  (female ) -      | 14
  −
o-------------------o
  −
 
  −
Counting the stars "*" that indicate true interpretations
  −
and the bars "-" that indicate false interpretations of
  −
the input formula, we can see that the Model function,
  −
out of the 64 possible interpretations, has actually
  −
gone through the work of making just 14 evaluations,
  −
all in order to find the 4 models that are allowed
  −
by the input definitions.
  −
 
  −
To be clear about what this output means, the starred paths
  −
indicate all of the complete specifications of objects in the
  −
universe of discourse, that is, all of the consistent feature
  −
conjunctions of maximum length, as permitted by the definitions
  −
that are given in the log file.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Let's take a little break from the Example in progress
  −
and look at where we are and what we have been doing from
  −
computational, logical, and semiotic perspectives.  Because,
  −
after all, as is usually the case, we should not let our focus
  −
and our fascination with this particular Example prevent us from
  −
recognizing it, and all that we do with it, as just an Example of
  −
much broader paradigms and predicaments and principles, not to say
  −
but a glimmer of ultimately more concernful and fascinating objects.
  −
 
  −
I chart the progression that we have just passed through in this way:
  −
 
  −
|                    Parse
  −
|      Sign A  o-------------->o  Sign 1
  −
|            ^                |
  −
|            /                |
  −
|          /                  |
  −
|          /                  |
  −
| Object  o                    |  Transform
  −
|          ^                  |
  −
|          \                  |
  −
|            \                |
  −
|            \                v
  −
|      Sign B  o<--------------o  Sign 2
  −
|                    Verse
  −
|
  −
| Figure.  Computation As Sign Transformation
  −
 
  −
In the present case, the Object is an objective situation
  −
or a state of affairs, in effect, a particular pattern of
  −
feature concurrences occurring to us in that world through
  −
which we find ourselves most frequently faring, willy nilly,
  −
and the Signs are different tokens and different types of
  −
data structures that we somehow or other find it useful
  −
to devise or to discover for the sake of representing
  −
current objects to ourselves on a recurring basis.
  −
 
  −
But not all signs, not even signs of a single object, are alike
  −
in every other respect that one might name, not even with respect
  −
to their powers of relating, significantly, to that common object.
  −
 
  −
And that is what our whole business of computation busies itself about,
  −
when it minds its business best, that is, transmuting signs into signs
  −
in ways that augment their powers of relating significantly to objects.
  −
 
  −
We have seen how the Model function and the mod output format
  −
indicate all of the complete specifications of objects in the
  −
universe of discourse, that is, all of the consistent feature
  −
conjunctions of maximal specificity that are permitted by the
  −
constraints or the definitions that are given in the log file.
  −
 
  −
To help identify these specifications of particular cells in
  −
the universe of discourse, the next function and output format,
  −
called "Tenor", edits the mod file to give only the true paths,
  −
in effect, the "positive models", that are by default what we
  −
usually mean when we say "models", and not the "anti-models"
  −
or the "negative models" that fail to satisfy the formula
  −
in question.
  −
 
  −
In the present Example the Tenor function
  −
generates a Ten file that looks like this:
  −
 
  −
Tenor Output and
  −
Ten File Example
  −
o-------------------o
  −
| male              |
  −
|  (female )        |
  −
|  (girl )        |
  −
|    child          |
  −
|    boy          |
  −
|      human *      | <1>
  −
|    (child )      |
  −
|    (boy ) *      | <2>
  −
| (male )          |
  −
|  female          |
  −
|  (boy )          |
  −
|    child          |
  −
|    girl          |
  −
|      human *      | <3>
  −
|    (child )      |
  −
|    (girl ) *    | <4>
  −
o-------------------o
  −
 
  −
As I said, the Tenor function just abstracts a transcript of the models,
  −
that is, the satisfying interpretations, that were already interspersed
  −
throughout the complete Model output.  These specifications, or feature
  −
conjunctions, with the positive and the negative features listed in the
  −
order of their actual budding on the "arboreal boolean expansion" twigs,
  −
may be gathered and arranged in this antherypulogical flowering bouquet:
  −
 
  −
1.  male  (female ) (girl )  child    boy    human  *
  −
2.  male  (female ) (girl ) (child ) (boy  )          *
  −
3.  (male )  female  (boy  )  child    girl    human  *
  −
4.  (male )  female  (boy  ) (child ) (girl )          *
  −
 
  −
Notice that Model, as reflected in this abstract, did not consider
  −
the six positive features in the same order along each path.  This
  −
is because the algorithm was designed to proceed opportunistically
  −
in its attempt to reduce the original proposition through a series
  −
of case-analytic considerations and the resulting simplifications.
  −
 
  −
Notice, too, that Model is something of a lazy evaluator, quitting work
  −
when and if a value is determined by less than the full set of variables.
  −
This is the reason why paths <2> and <4> are not ostensibly of the maximum
  −
length.  According to this lazy mode of understanding, any path that is not
  −
specified on a set of features really stands for the whole bundle of paths
  −
that are derived by freely varying those features.  Thus, specifications
  −
<2> and <4> summarize four models altogether, with the logical choice
  −
between "human" and "not human" being left open at the point where
  −
they leave off their branches in the releavent deciduous tree.
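
This accounting can be checked by brute force.  What follows is a minimal
sketch in Python, wholly separate from the Theme One code, that reads the
log file proposition in the way described above, with an auxiliary function
for the boundary form "( x_1 , ..., x_k )" (true iff exactly one argument
is false).  Enumerating all 2^6 = 64 interpretations turns up 6 satisfying
ones:  paths <1> and <3>, plus the two completions each of <2> and <4>.

from itertools import product

def bnd(*xs):
    # boundary form "( x1 , ..., xk )": true iff exactly one argument is false
    return sum(1 for x in xs if not x) == 1

def log_prop(boy, child, female, girl, human, male):
    # (male , female)((boy , male child))((girl , female child))(child (human))
    return (bnd(male, female)                     # male exclusive-or female
            and not bnd(boy, male and child)      # boy <=> male & child
            and not bnd(girl, female and child)   # girl <=> female & child
            and not (child and not human))        # child => human

models = [v for v in product([False, True], repeat=6) if log_prop(*v)]
print(len(models))   # 6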
  −
 
  −
The last two functions in the Study section, "Canon" and "Sense",
  −
extract further derivatives of the normal forms that are produced
  −
by Model and Tenor.  Both of these functions take the set of model
  −
paths and simply throw away the negative labels.  You may think of
  −
these as the "rose colored glasses" or "job interview" normal forms,
  −
in that they try to say everything that's true, so long as it can be
  −
expressed in positive terms.  Generally, this would mean losing a lot
  −
of information, and the result could no longer be expected to have the
  −
property of remaining logically equivalent to the original proposition.
  −
 
  −
Fortunately, however, it seems that this type of positive projection of
  −
the whole truth is just what is possible, most needed, and most clear in
  −
many of the "natural" examples, that is, in examples that arise from the
  −
domains of natural language and natural conceptual kinds.  In these cases,
  −
where most of the logical features are redundantly coded, for example, in
  −
the way that "adult" = "not child" and "child" = "not adult", the positive
  −
feature bearing redacts are often sufficiently expressive all by themselves.
  −
 
  −
Canon merely censors its printing of the negative labels as it traverses the
  −
model tree.  This leaves the positive labels in their original columns of the
  −
outline form, giving it a slightly skewed appearance.  This can be misleading
  −
unless you already know what you are looking for.  However, this Canon format
  −
is computationally quick, and frequently suffices, especially if you already
  −
have a likely clue about what to expect in the way of a question's outcome.
  −
 
  −
In the present Example the Canon function
  −
generates a Can file that looks like this:
  −
 
  −
Canon Output and
  −
Can File Example
  −
o-------------------o
  −
| male              |
  −
|    child          |
  −
|    boy          |
  −
|      human        |
  −
|  female          |
  −
|    child          |
  −
|    girl          |
  −
|      human        |
  −
o-------------------o
  −
 
  −
The Sense function does the extra work that is required
  −
to place the positive labels of the model tree at their
  −
proper level in the outline.
  −
 
  −
In the present Example the Sense function
  −
generates a Sen file that looks like this:
  −
 
  −
Sense Output and
  −
Sen File Example
  −
o-------------------o
  −
| male              |
  −
|  child            |
  −
|  boy            |
  −
|    human          |
  −
| female            |
  −
|  child            |
  −
|  girl            |
  −
|    human          |
  −
o-------------------o
  −
 
  −
The Canon and Sense outlines for this Example illustrate a certain
  −
type of general circumstance that needs to be noted at this point.
  −
Recall the model paths or the feature specifications that were
  −
numbered <2> and <4> in the listing of the output for Tenor.
  −
These paths, in effect, reflected Model's discovery that
  −
the venn diagram cells for male or female non-children
  −
and male or female non-humans were not excluded by
  −
the definitions that were given in the Log file.
  −
In the abstracts given by Canon and Sense, the
  −
specifications <2> and <4> have been subsumed,
  −
or absorbed unmarked, under the general topics
  −
of their respective genders, male or female.
  −
This happens because no purely positive
  −
features were supplied to distinguish
  −
the non-child and non-human cases.
  −
 
  −
That completes the discussion of
  −
this six-dimensional Example.
  −
 
  −
Nota Bene, for possible future use.  In the larger current of work
  −
with respect to which this meander of a conduit was initially both
  −
diversionary and tributary, before those high and dry regensquirm
  −
years when it turned into an intellectual interglacial oxbow lake,
  −
I once had in mind a scape in which expressions in a definitional
  −
lattice were ordered according to their simplicity on some scale
  −
or another, and in this setting the word "sense" was actually an
  −
acronym for "semantically equivalent next-simplest expression".
  −
 
  −
| If this is starting to sound a little bit familiar,
  −
| it may be because the relationship between the two
  −
| kinds of pictures of propositions, namely:
  −
|
  −
| 1.  Propositions about things in general, here,
  −
|    about the times when certain facts are true,
  −
|    having the form of functions f : X -> B,
  −
|
  −
| 2.  Propositions about binary codes, here, about
  −
|    the bit-vector labels on venn diagram cells,
  −
|    having the form of functions f' : B^k -> B,
  −
|
  −
| is an epically old story, one that I, myself,
  −
| have related once or twice upon a time before,
  −
| to wit, at least, at the following two cites:
  −
|
  −
| http://suo.ieee.org/email/msg01251.html
  −
| http://suo.ieee.org/email/msg01293.html
  −
|
  −
| There, and now here, once more, and again, it may be observed
  −
| that the relation is one whereby the proposition f : X -> B,
  −
| the one about things and times and mores in general, factors
  −
| into a coding function c : X -> B^k, followed by a derived
  −
| proposition f' : B^k -> B that judges the resulting codes.
  −
|
  −
|                        f
  −
|                  X o------>o B
  −
|                      \    ^
  −
|  c = <x_1, ..., x_k> \  / f'
  −
|                        v /
  −
|                        o
  −
|                        B^k
  −
|
  −
| You may remember that this was supposed to illustrate
  −
| the "factoring" of a proposition f : X -> B = {0, 1}
  −
| into the composition f'(c(x)), where c : X -> B^k is
  −
| the "coding" of each x in X as an k-bit string in B^k,
  −
| and where f' is the mapping of codes into a co-domain
  −
| that we interpret as t-f-values, B = {0, 1} = {F, T}.
  −
 
  −
In short, there is the standard equivocation ("systematic ambiguity"?) as to
  −
whether we are talking about the "applied" and concretely typed proposition
  −
f : X -> B or the "pure" and abstractly typed proposition f' : B^k -> B.
  −
Or we can think of the latter object as the approximate code icon of
  −
the former object.
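
For what it is worth, the factoring can be pictured in a few lines of Python.
Everything here, the toy universe X, the coding c, and the proposition f',
is a hypothetical stand-in, chosen only to exhibit the composition f = f'(c(x)).

X = ["this pencil", "that stone", "yon sparrow"]      # a toy universe of objects

def c(x):
    # coding function c : X -> B^2, giving each object a bit vector of features
    table = {"this pencil": (False, False),
             "that stone":  (False, False),
             "yon sparrow": (True,  True)}
    return table[x]                                   # (is_living, is_animal)

def f_prime(bits):
    # the "pure", abstractly typed proposition f' : B^2 -> B
    is_living, is_animal = bits
    return is_living and is_animal

def f(x):
    # the "applied", concretely typed proposition f : X -> B, as f = f' o c
    return f_prime(c(x))

print([x for x in X if f(x)])                         # ['yon sparrow']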
  −
 
  −
Anyway, these types of formal objects are the sorts of things that
  −
I take to be the denotational objects of propositional expressions.
  −
These objects, along with their invarious and insundry mathematical
  −
properties, are the orders of things that I am talking about when
  −
I refer to the "invariant structures in these objects themselves".
  −
 
  −
"Invariant" means "invariant under a suitable set of transformations",
  −
in this case the translations between various languages that preserve
  −
the objects and the structures in question.  In extremest generality,
  −
this is what universal constructions in category theory are all about.
  −
 
  −
In summation, the functions f : X -> B and f' : B^k -> B have invariant, formal,
  −
mathematical, objective properties that any adequate language might eventually
  −
evolve to express, only some languages express them more obscurely than others.
  −
 
  −
To be perfectly honest, I continue to be surprised that anybody in this group
  −
has trouble with this.  There are perfectly apt and familiar examples in the
  −
contrast between roman numerals and arabic numerals, or the contrast between
  −
redundant syntaxes, like those that use the pentalphabet {~, &, v, =>, <=>},
  −
and trimmer syntaxes, like those used in existential and conceptual graphs.
  −
Every time somebody says "Let's take {~, &, v, =>, <=>} as an operational
  −
basis for logic" it's just like that old joke that mathematicians tell on
  −
engineers where the ingenue in question says "1 is a prime, 2 is a prime,
  −
3 is a prime, 4 is a prime, ..." -- and I know you think that I'm being
  −
hyperbolic, but I'm really only up to parabolas here ...
  −
 
  −
I have already refined my criticism so that it does not apply to
  −
the spirit of FOL or KIF or whatever, but only to the letters of
  −
specific syntactic proposals.  There is a fact of the matter as
  −
to whether a concrete language provides a clean or a cluttered
  −
basis for representing the identified set of formal objects.
  −
And it shows up in pragmatic realities like the efficiency
  −
of real time concept formation, concept use, learnability,
  −
reasoning power, and just plain good use of real time.
  −
These are the dire consequences that I learned in my
  −
very first tries at mathematically oriented theorem
  −
automation, and the only factor that has obscured
  −
them in mainstream work since then is the speed
  −
with which folks can now do all of the same
  −
old dumb things that they used to do on
  −
their way to kludging out the answers.
  −
 
  −
It seems to be darn near impossible to explain to the
  −
centurion all of the neat stuff that he's missing by
  −
sticking to his old roman numerals.  He just keeps
  −
on reckoning that what he can't count must be of
  −
no account at all.  There is way too much stuff
  −
that these original syntaxes keep us from even
  −
beginning to discuss, like differential logic,
  −
just for starters.
  −
 
  −
Our next Example illustrates the use of the Cactus Language
  −
for representing "absolute" and "relative" partitions, also
  −
known as "complete" and "contingent" classifications of the
  −
universe of discourse, all of which amounts to divvying it
  −
up into mutually exclusive regions, exhaustive or not, as
  −
one frequently needs in situations involving a genus and
  −
its sundry species, and frequently pictured in the form
  −
of a venn diagram that looks just like a "pie chart".
  −
 
  −
Example.  Partition, Genus & Species
  −
 
  −
The idea that one needs for expressing partitions
  −
in cactus expressions can be summed up like this:
  −
 
  −
| If the propositional expression
  −
|
  −
| "( p , q , r , ... )"
  −
|
  −
| means that just one of
  −
|
  −
| p, q, r, ... is false,
  −
|
  −
| then the propositional expression
  −
|
  −
| "((p),(q),(r), ... )"
  −
|
  −
| must mean that just one of
  −
|
  −
| (p), (q), (r), ... is false,
  −
|
  −
| in other words, that just one of
  −
|
  −
| p, q, r, ... is true.
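
The claim is easy to check by machine.  Here is a minimal sketch in Python,
under the same reading of the boundary form as before (true iff exactly one
argument is false):

from itertools import product

def bnd(*xs):
    # "( x1 , ..., xk )": true iff exactly one argument is false
    return sum(1 for x in xs if not x) == 1

for p, q, r in product([False, True], repeat=3):
    # "((p),(q),(r))" says exactly one of (p), (q), (r) is false,
    # which should coincide with exactly one of p, q, r being true
    assert bnd(not p, not q, not r) == ([p, q, r].count(True) == 1)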
  −
 
  −
Thus we have an efficient means to express and to enforce
  −
a partition of the space of models, in effect, to maintain
  −
the condition that a number of features or propositions are
  −
to be held in mutually exclusive and exhaustive disjunction.
  −
This supplies a much needed bridge between the binary domain
  −
of two values and any other domain with a finite number of
  −
feature values.
  −
 
  −
Another variation on this theme allows one to maintain the
  −
subsumption of many separate species under an explicit genus.
  −
To see this, let us examine the following form of expression:
  −
 
  −
( q , ( q_1 ) , ( q_2 ) , ( q_3 ) ).
  −
 
  −
Now consider what it would mean for this to be true.  We see two cases:
  −
 
  −
1.  If the proposition q is true, then exactly one of the
  −
    propositions (q_1), (q_2), (q_3) must be false, and so
  −
    just one of the propositions q_1, q_2, q_3 must be true.
  −
 
  −
2.  If the proposition q is false, then every one of the
  −
    propositions (q_1), (q_2), (q_3) must be true, and so
  −
    each one of the propositions q_1, q_2, q_3 must be false.
  −
    In short, if q is false then all of the other q's are also.
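
Both cases can be verified mechanically.  A minimal Python sketch, using the
same reading of the boundary form as before:

from itertools import product

def bnd(*xs):
    # "( x1 , ..., xk )": true iff exactly one argument is false
    return sum(1 for x in xs if not x) == 1

for q, q1, q2, q3 in product([False, True], repeat=4):
    lhs = bnd(q, not q1, not q2, not q3)              # ( q , (q_1) , (q_2) , (q_3) )
    rhs = ((q and [q1, q2, q3].count(True) == 1)      # case 1: q true, one species true
           or (not q and not any([q1, q2, q3])))      # case 2: q false, all species false
    assert lhs == rhs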
  −
 
  −
Figures 1 and 2 illustrate this type of situation.
  −
 
  −
Figure 1 is the venn diagram of a 4-dimensional universe of discourse
  −
X = [q, q_1, q_2, q_3], conventionally named after the gang of four
  −
logical features that generate it.  Strictly speaking, X is made up
  −
of two layers, the position space X of abstract type %B%^4, and the
  −
proposition space X^ = (X -> %B%) of abstract type %B%^4 -> %B%,
  −
but it is commonly lawful enough to sign the signature of both
  −
spaces with the same X, and thus to give the power of attorney
  −
for the propositions to the so-indicted position space thereof.
  −
 
  −
Figure 1 also makes use of the convention whereby the regions
  −
or the subsets of the universe of discourse that correspond
  −
to the basic features q, q_1, q_2, q_3 are labelled with
  −
the parallel set of upper case letters Q, Q_1, Q_2, Q_3.
  −
 
  −
|                        o
  −
|                      / \
  −
|                      /  \
  −
|                    /    \
  −
|                    /      \
  −
|                  o        o
  −
|                  /%\      /%\
  −
|                /%%%\    /%%%\
  −
|                /%%%%%\  /%%%%%\
  −
|              /%%%%%%%\ /%%%%%%%\
  −
|              o%%%%%%%%%o%%%%%%%%%o
  −
|            / \%%%%%%%/ \%%%%%%%/ \
  −
|            /  \%%%%%/  \%%%%%/  \
  −
|          /    \%%%/    \%%%/    \
  −
|          /      \%/      \%/      \
  −
|        o        o        o        o
  −
|        / \      /%\      / \      / \
  −
|      /  \    /%%%\    /  \    /  \
  −
|      /    \  /%%%%%\  /    \  /    \
  −
|    /      \ /%%%%%%%\ /      \ /      \
  −
|    o        o%%%%%%%%%o        o        o
  −
|    ·\      / \%%%%%%%/ \      / \      /·
  −
|    · \    /  \%%%%%/  \    /  \    / ·
  −
|    ·  \  /    \%%%/    \  /    \  /  ·
  −
|    ·  \ /      \%/      \ /      \ /  ·
  −
|    ·    o        o        o        o    ·
  −
|    ·    ·\      / \      / \      /·    ·
  −
|    ·    · \    /  \    /  \    / ·    ·
  −
|    ·    ·  \  /    \  /    \  /  ·    ·
  −
|    · Q  ·  \ /      \ /      \ /  ·Q_3 ·
  −
|    ··········o        o        o··········
  −
|        ·    \      /%\      /    ·
  −
|        ·      \    /%%%\    /      ·
  −
|        ·      \  /%%%%%\  /      ·
  −
|        · Q_1    \ /%%%%%%%\ /    Q_2 ·
  −
|        ··········o%%%%%%%%%o··········
  −
|                    \%%%%%%%/
  −
|                    \%%%%%/
  −
|                      \%%%/
  −
|                      \%/
  −
|                        o
  −
|
  −
| Figure 1.  Genus Q and Species Q_1, Q_2, Q_3
  −
 
  −
Figure 2 is another form of venn diagram that one often uses,
  −
where one collapses the unindited cells and leaves only the
  −
models of the proposition in question.  Some people would
  −
call the transformation that changes from the first form
  −
to the next form an operation of "taking the quotient",
  −
but I tend to think of it as the "soap bubble picture"
  −
or more exactly the "wire & thread & soap film" model
  −
of the universe of discourse, where one pops out of
  −
consideration the sections of the soap film that
  −
stretch across the anti-model regions of space.
  −
 
  −
o-------------------------------------------------o
  −
|                                                |
  −
|  X                                              |
  −
|                                                |
  −
|                        o                        |
  −
|                      / \                      |
  −
|                      /  \                      |
  −
|                    /    \                    |
  −
|                    /      \                    |
  −
|                  /        \                  |
  −
|                  o    Q_1    o                  |
  −
|                / \        / \                |
  −
|                /  \      /  \                |
  −
|              /    \    /    \              |
  −
|              /      \  /      \              |
  −
|            /        \ /        \            |
  −
|            /          Q          \            |
  −
|          /            |            \          |
  −
|          /            |            \          |
  −
|        /      Q_2    |    Q_3      \        |
  −
|        /              |              \        |
  −
|      /                |                \      |
  −
|      o-----------------o-----------------o      |
  −
|                                                |
  −
|                                                |
  −
|                                                |
  −
o-------------------------------------------------o
  −
 
  −
Figure 2.  Genus Q and Species Q_1, Q_2, Q_3
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Example.  Partition, Genus & Species (cont.)
  −
 
  −
Last time we considered in general terms how the forms
  −
of complete partition and contingent partition operate
  −
to maintain mutually disjoint and possibly exhaustive
  −
categories of positions in a universe of discourse.
  −
 
  −
This time we contemplate another concrete Example of
  −
near minimal complexity, designed to demonstrate how
  −
the forms of partition and subsumption can interact
  −
in structuring a space of feature specifications.
  −
 
  −
In this Example, we describe a universe of discourse
  −
in terms of the following vocabulary of five features:
  −
 
  −
| L.  living_thing
  −
|
  −
| N.  non_living
  −
|
  −
| A.  animal
  −
|
  −
| V.  vegetable
  −
|
  −
| M.  mineral
  −
 
  −
Let us construe these features as being subject to four constraints:
  −
 
  −
| 1.  Everything is either a living_thing or non_living, but not both.
  −
|
  −
| 2.  Everything is either animal, vegetable, or mineral,
  −
|    but no two of these together.
  −
|
  −
| 3.  A living_thing is either animal or vegetable, but not both,
  −
|    and everything animal or vegetable is a living_thing.
  −
|
  −
| 4.  Everything mineral is non_living.
  −
 
  −
These notions and constructions are expressed in the Log file shown below:
  −
 
  −
Logical Input File
  −
o-------------------------------------------------o
  −
|                                                |
  −
|  ( living_thing , non_living )                |
  −
|                                                |
  −
|  (( animal ),( vegetable ),( mineral ))        |
  −
|                                                |
  −
|  ( living_thing ,( animal ),( vegetable ))    |
  −
|                                                |
  −
|  ( mineral ( non_living ))                    |
  −
|                                                |
  −
o-------------------------------------------------o
  −
 
  −
The cactus expression in this file is the expression
  −
of a "zeroth order theory" (ZOT), one that can be
  −
paraphrased in more ordinary language to say:
  −
 
  −
Translation
  −
o-------------------------------------------------o
  −
|                                                |
  −
|  living_thing  =/=  non_living                |
  −
|                                                |
  −
|  par : all -> {animal, vegetable, mineral}    |
  −
|                                                |
  −
|  par : living_thing -> {animal, vegetable}    |
  −
|                                                |
  −
|  mineral => non_living                        |
  −
|                                                |
  −
o-------------------------------------------------o
  −
 
  −
Here, "par : all -> {p, q, r}" is short for an assertion
  −
that the universe as a whole is partitioned into subsets
  −
that correspond to the features p, q, r.
  −
 
  −
Also, "par : q -> {r, s}" asserts that "Q partitions into R and S.
  −
 
  −
It is probably enough just to list the outputs of Model, Tenor, and Sense
  −
when run on the preceding Log file.  Using the same format and labeling as
  −
before, we may note that Model has, from 2^5 = 32 possible interpretations,
  −
made 11 evaluations, and found 3 models answering the generic descriptions
  −
that were imposed by the logical input file.
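
Before listing those outputs, the count is easy to confirm by brute force.
A minimal sketch in Python, independent of Theme One, with the boundary form
read as before:

from itertools import product

def bnd(*xs):
    # "( x1 , ..., xk )": true iff exactly one argument is false
    return sum(1 for x in xs if not x) == 1

def zot(L, N, A, V, M):
    # the four cactus clauses of the Log file, conjoined
    return (bnd(L, N)                      # ( living_thing , non_living )
            and bnd(not A, not V, not M)   # (( animal ),( vegetable ),( mineral ))
            and bnd(L, not A, not V)       # ( living_thing ,( animal ),( vegetable ))
            and not (M and not N))         # ( mineral ( non_living ))

models = [v for v in product([False, True], repeat=5) if zot(*v)]
print(len(models))   # 3: living animal, living vegetable, non-living mineral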
  −
 
  −
Model Outline
  −
o------------------------o
  −
| living_thing          |
  −
|  non_living -          |  1
  −
|  (non_living )        |
  −
|  mineral -            |  2
  −
|  (mineral )          |
  −
|    animal              |
  −
|    vegetable -        |  3
  −
|    (vegetable ) *    |  4 *
  −
|    (animal )          |
  −
|    vegetable *        |  5 *
  −
|    (vegetable ) -    |  6
  −
| (living_thing )        |
  −
|  non_living            |
  −
|  animal -            |  7
  −
|  (animal )            |
  −
|    vegetable -        |  8
  −
|    (vegetable )        |
  −
|    mineral *          |  9 *
  −
|    (mineral ) -      | 10
  −
|  (non_living ) -      | 11
  −
o------------------------o
  −
 
  −
Tenor Outline
  −
o------------------------o
  −
| living_thing          |
  −
|  (non_living )        |
  −
|  (mineral )          |
  −
|    animal              |
  −
|    (vegetable ) *    | <1>
  −
|    (animal )          |
  −
|    vegetable *        | <2>
  −
| (living_thing )        |
  −
|  non_living            |
  −
|  (animal )            |
  −
|    (vegetable )        |
  −
|    mineral *          | <3>
  −
o------------------------o
  −
 
  −
Sense Outline
  −
o------------------------o
  −
| living_thing          |
  −
|  animal                |
  −
|  vegetable            |
  −
| non_living            |
  −
|  mineral              |
  −
o------------------------o
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Example.  Molly's World
  −
 
  −
I think that we are finally ready to tackle a more respectable example.
  −
The Example known as "Molly's World" is borrowed from the literature on
  −
computational learning theory, adapted with a few changes from the example
  −
called "Molly’s Problem" in the paper "Learning With Hints" by Dana Angluin.
  −
By way of setting up the problem, I quote Angluin's motivational description:
  −
 
  −
| Imagine that you have become acquainted with an alien named Molly from the
  −
| planet Ornot, who is currently employed in a day-care center.  She is quite
  −
| good at propositional logic, but a bit weak on knowledge of Earth.  So you
  −
| decide to formulate the beginnings of a propositional theory to help her
  −
| label things in her immediate environment.
  −
|
  −
| Angluin, Dana, "Learning With Hints", pages 167-181, in:
  −
| David Haussler & Leonard Pitt (eds.), 'Proceedings of the 1988 Workshop
  −
| on Computational Learning Theory', Morgan Kaufmann, San Mateo, CA, 1989.
  −
 
  −
The purpose of this quaint pretext is, of course, to make sure that the
  −
reader appreciates the constraints of the problem:  that no extra savvy
  −
is fair, all facts must be presumed or deduced on the immediate premises.
  −
 
  −
My use of this example is not directly relevant to the purposes of the
  −
discussion from which it is taken, so I simply give my version of it
  −
without comment on those issues.
  −
 
  −
Here is my rendition of the initial knowledge base delimiting Molly's World:
  −
 
  −
Logical Input File:  Molly.Log
  −
o---------------------------------------------------------------------o
  −
|                                                                    |
  −
| ( object ,( toy ),( vehicle ))                                      |
  −
| (( small_size ),( medium_size ),( large_size ))                    |
  −
| (( two_wheels ),( three_wheels ),( four_wheels ))                  |
  −
| (( no_seat ),( one_seat ),( few_seats ),( many_seats ))            |
  −
| ( object ,( scooter ),( bike ),( trike ),( car ),( bus ),( wagon )) |
  −
| ( two_wheels    no_seat            ,( scooter ))                    |
  −
| ( two_wheels    one_seat    pedals ,( bike ))                      |
  −
| ( three_wheels  one_seat    pedals ,( trike ))                      |
  −
| ( four_wheels  few_seats  doors  ,( car ))                        |
  −
| ( four_wheels  many_seats  doors  ,( bus ))                        |
  −
| ( four_wheels  no_seat    handle ,( wagon ))                      |
  −
| ( scooter          ( toy  small_size ))                            |
  −
| ( wagon            ( toy  small_size ))                            |
  −
| ( trike            ( toy  small_size ))                            |
  −
| ( bike  small_size  ( toy ))                                        |
  −
| ( bike  medium_size ( vehicle ))                                    |
  −
| ( bike  large_size  )                                              |
  −
| ( car              ( vehicle  large_size ))                        |
  −
| ( bus              ( vehicle  large_size ))                        |
  −
| ( toy              ( object ))                                    |
  −
| ( vehicle          ( object ))                                    |
  −
|                                                                    |
  −
o---------------------------------------------------------------------o
  −
 
  −
All of the logical forms that are used in the preceding Log file
  −
will probably be familiar from earlier discussions.  The purpose
  −
of one or two constructions may, however, be a little obscure,
  −
so I will insert a few words of additional explanation here:
  −
 
  −
The rule "( bike large_size )", for example, merely
  −
says that nothing can be both a bike and large_size.
  −
 
  −
The rule "( three_wheels one_seat pedals ,( trike ))" says that anything
  −
with all the features of three_wheels, one_seat, and pedals is excluded
  −
from being anything but a trike.  In short, anything with just those
  −
three features is equivalent to a trike.
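
Read as Boolean functions, the two rule forms just discussed come out like
this (a minimal sketch, offered only to check the reading):

def rule_bike(bike, large_size):
    # "( bike  large_size )" : not both bike and large_size
    return not (bike and large_size)

def rule_trike(three_wheels, one_seat, pedals, trike):
    # "( three_wheels  one_seat  pedals ,( trike ))" :
    # exactly one of (three_wheels & one_seat & pedals) and (not trike) is false,
    # which is to say the conjunction of the three features is equivalent to trike
    return (three_wheels and one_seat and pedals) == trike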
  −
 
  −
Recall that the form "( p , q )" may be interpreted to assert either
  −
the exclusive disjunction or the logical inequivalence of p and q.
  −
 
  −
The rules have been stated in this particular way simply
  −
to imitate the style of rules in the reference example.
  −
 
  −
This last point does bring up an important issue, the question
  −
of "rhetorical" differences in expression and their potential
  −
impact on the "pragmatics" of computation.  Unfortunately,
  −
I will have to abbreviate my discussion of this topic for
  −
now, and only mention in passing the following facts.
  −
 
  −
Logically equivalent expressions, even though they must lead
  −
to logically equivalent normal forms, may have very different
  −
characteristics when it comes to the efficiency of processing.
  −
 
  −
For instance, consider the following four forms:
  −
 
  −
| 1.  (( p , q ))
  −
|
  −
| 2.  ( p ,( q ))
  −
|
  −
| 3.  (( p ), q )
  −
|
  −
| 4.  (( p , q ))
  −
 
  −
All of these are equally succinct ways of maintaining that
  −
p is logically equivalent to q, yet each can have different
  −
effects on the route that Model takes to arrive at an answer.
  −
Apparently, some equalities are more equal than others.
  −
 
  −
These effects occur partly because the algorithm chooses to make cases
  −
of variables on a basis of leftmost shallowest first, but their impact
  −
can be complicated by the interactions that each expression has with
  −
the context that it occupies.  The main lesson to take away from all
  −
of this, at least, for the time being, is that it is probably better
  −
not to bother too much about these problems, but just to experiment
  −
with different ways of expressing equivalent pieces of information
  −
until you get a sense of what works best in various situations.
  −
 
  −
I think that you will be happy to see only the
  −
ultimate Sense of Molly's World, so here it is:
  −
 
  −
Sense Outline:  Molly.Sen
  −
o------------------------o
  −
| object                |
  −
|  two_wheels            |
  −
|  no_seat              |
  −
|    scooter            |
  −
|    toy                |
  −
|      small_size        |
  −
|  one_seat            |
  −
|    pedals              |
  −
|    bike              |
  −
|      small_size        |
  −
|      toy              |
  −
|      medium_size      |
  −
|      vehicle          |
  −
|  three_wheels          |
  −
|  one_seat            |
  −
|    pedals              |
  −
|    trike              |
  −
|      toy              |
  −
|      small_size      |
  −
|  four_wheels          |
  −
|  few_seats            |
  −
|    doors              |
  −
|    car                |
  −
|      vehicle          |
  −
|      large_size      |
  −
|  many_seats          |
  −
|    doors              |
  −
|    bus                |
  −
|      vehicle          |
  −
|      large_size      |
  −
|  no_seat              |
  −
|    handle              |
  −
|    wagon              |
  −
|      toy              |
  −
|      small_size      |
  −
o------------------------o
  −
 
  −
This outline is not the Sense of the unconstrained Log file,
  −
but the result of running Model with a query on the single
  −
feature "object".  Using this focus helps the Modeler
  −
to make more relevant Sense of Molly's World.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
DM = Douglas McDavid
  −
 
  −
DM: This, again, is an example of how real issues of ontology are
  −
    so often trivialized at the expense of technicalities.  I just
  −
    had a burger, some fries, and a Coke.  I would say all that was
  −
    non-living and non-mineral.  A virus, I believe, is non-animal,
  −
    non-vegetable, but living (and non-mineral).  Teeth, shells,
  −
    and bones are virtually pure mineral, but living.  These are
  −
    the kinds of issues that are truly "ontological," in my
  −
    opinion.  You are not the only one to push them into
  −
    the background as of lesser importance.  See the
  −
    discussion of "18-wheelers" in John Sowa's book.
  −
 
  −
it's not my example, and from what you say, it's not your example either.
  −
copied it out of a book or a paper somewhere, too long ago to remember.
  −
i am assuming that the author or tradition from which it came must have
  −
seen some kind of sense in it.  tell you what, write out your own theory
  −
of "what is" in so many variables, more or less, publish it in a book or
  −
a paper, and then folks will tell you that they dispute each and every
  −
thing that you have just said, and it won't really matter all that much
  −
how complex it is or how subtle you are.  that has been the way of all
  −
ontology for about as long as anybody can remember or even read about.
  −
me?  i don't have sufficient arrogance to be an ontologist, and you
  −
know that's saying a lot, as i can't even imagine a way to convince
  −
myself that i believe i know "what is", really and truly for sure
  −
like some folks just seem to do.  so i am working to improve our
  −
technical ability to do logic, which is mostly a job of shooting
  −
down the more serious delusions that we often get ourselves into.
  −
can i be of any use to ontologists?  i dunno.  i guess it depends
  −
on how badly they are attached to some of the delusions of knowing
  −
what their "common" sense tells them everybody ought to already know,
  −
but that every attempt to check that out in detail tells them it just
  −
ain't so.  a problem for which denial was just begging to be invented,
  −
and so it was.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Example.  Molly's World (cont.)
  −
 
  −
In preparation for a contingently possible future discussion,
  −
I need to attach a few parting thoughts to the case workup
  −
of Molly's World that may not seem terribly relevant to
  −
the present setting, but whose pertinence I hope will
  −
become clearer in time.
  −
 
  −
The logical paradigm from which this Example was derived is that
  −
of "Zeroth Order Horn Clause Theories".  The clauses at issue
  −
in these theories are allowed to be of just three kinds:
  −
 
  −
| 1.  p & q & r & ... => z
  −
|
  −
| 2.  z
  −
|
  −
| 3.  ~[p & q & r & ...]
  −
 
  −
Here, the proposition letters "p", "q", "r", ..., "z"
  −
are restricted to being single positive features, not
  −
themselves negated or otherwise complex expressions.
  −
 
  −
In the Cactus Language or Existential Graph syntax
  −
these forms would take on the following appearances:
  −
 
  −
| 1.  ( p q r ... ( z ))
  −
|
  −
| 2.    z
  −
|
  −
| 3.  ( p q r ... )
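
Read under the same conventions (an uncommaed parenthesis negates the
conjunction of its contents), the three clause shapes come out as the
following Boolean functions, sketched here only for concreteness:

def clause_implication(p, q, r, z):
    # ( p q r ( z ))  ~  p & q & r => z
    return not (p and q and r and not z)

def clause_fact(z):
    # z  ~  the bare assertion of z
    return z

def clause_denial(p, q, r):
    # ( p q r )  ~  ~[p & q & r]
    return not (p and q and r)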
  −
 
  −
The style of deduction in Horn clause logics is essentially
  −
proof-theoretic in character, with the main burden of proof
  −
falling on implication relations ("=>") and on "projective"
  −
forms of inference, that is, information-losing inferences
  −
like modus ponens and resolution.  Cf. [Llo], [MaW].
  −
 
  −
In contrast, the method used here is substantially model-theoretic,
  −
the stress being to start from more general forms of expression for
  −
laying out facts (for example, distinctions, equations, partitions)
  −
and to work toward results that maintain logical equivalence with
  −
their origins.
  −
 
  −
What all of this has to do with the output above is this:
  −
From the perspective that is adopted in the present work,
  −
almost any theory, for example, the one that is founded
  −
on the postulates of Molly's World, will have far more
  −
models than the implicational and inferential mode of
  −
reasoning is designed to discover.  We will be forced
  −
to confront them, however, if we try to run Model on
  −
a large set of implications.
  −
 
  −
The typical Horn clause interpreter gets around this
  −
difficulty only by a stratagem that takes clauses to
  −
mean something other than what they say, that is, by
  −
distorting the principles of semantics in practice.
  −
Our Model, on the other hand, has no such finesse.
  −
 
  −
This explains why it was necessary to impose the
  −
prerequisite "object" constraint on the Log file
  −
for Molly's World.  It supplied no more than what
  −
we usually take for granted, in order to obtain
  −
a set of models that we would normally think of
  −
as being the intended import of the definitions.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Example.  Jets & Sharks
  −
 
  −
The propositional calculus based on the boundary operator, that is,
  −
the multigrade logical connective of the form "( , , , ... )" can be
  −
interpreted in a way that resembles the logic of activation states and
  −
competition constraints in certain neural network models.  One way to do
  −
this is by interpreting the blank or unmarked state as the resting state
  −
of a neural pool, the bound or marked state as its activated state, and
  −
by representing a mutually inhibitory pool of neurons p, q, r by means
  −
of the expression "( p , q , r )".  To illustrate this possibility,
  −
I transcribe into cactus language expressions a notorious example
  −
from the "parallel distributed processing" (PDP) paradigm [McR]
  −
and work through two of the associated exercises as portrayed
  −
in this format.
  −
 
  −
Logical Input File:  JAS  =  ZOT(Jets And Sharks)
  −
o----------------------------------------------------------------o
  −
|                                                                |
  −
|  (( art    ),( al  ),( sam  ),( clyde ),( mike  ),            |
  −
|  ( jim    ),( greg ),( john ),( doug  ),( lance ),            |
  −
|  ( george ),( pete ),( fred ),( gene  ),( ralph ),            |
  −
|  ( phil  ),( ike  ),( nick ),( don  ),( ned  ),( karl ),  |
  −
|  ( ken    ),( earl ),( rick ),( ol    ),( neal  ),( dave ))  |
  −
|                                                                |
  −
|  ( jets , sharks )                                            |
  −
|                                                                |
  −
|  ( jets ,                                                      |
  −
|    ( art    ),( al  ),( sam  ),( clyde ),( mike  ),          |
  −
|    ( jim    ),( greg ),( john ),( doug  ),( lance ),          |
  −
|    ( george ),( pete ),( fred ),( gene  ),( ralph ))          |
  −
|                                                                |
  −
|  ( sharks ,                                                    |
  −
|    ( phil ),( ike  ),( nick ),( don ),( ned  ),( karl ),      |
  −
|    ( ken  ),( earl ),( rick ),( ol  ),( neal ),( dave ))      |
  −
|                                                                |
  −
|  (( 20's ),( 30's ),( 40's ))                                  |
  −
|                                                                |
  −
|  ( 20's ,                                                      |
  −
|    ( sam    ),( jim  ),( greg ),( john ),( lance ),            |
  −
|    ( george ),( pete ),( fred ),( gene ),( ken  ))            |
  −
|                                                                |
  −
|  ( 30's ,                                                      |
  −
|    ( al  ),( mike ),( doug ),( ralph ),                      |
  −
|    ( phil ),( ike  ),( nick ),( don  ),                      |
  −
|    ( ned  ),( rick ),( ol  ),( neal  ),( dave ))              |
  −
|                                                                |
  −
|  ( 40's ,                                                      |
  −
|    ( art ),( clyde ),( karl ),( earl ))                        |
  −
|                                                                |
  −
|  (( junior_high ),( high_school ),( college ))                |
  −
|                                                                |
  −
|  ( junior_high ,                                              |
  −
|    ( art  ),( al    ),( clyde  ),( mike  ),( jim ),            |
  −
|    ( john ),( lance ),( george ),( ralph ),( ike ))            |
  −
|                                                                |
  −
|  ( high_school ,                                              |
  −
|    ( greg ),( doug ),( pete ),( fred ),( nick ),              |
  −
|    ( karl ),( ken  ),( earl ),( rick ),( neal ),( dave ))      |
  −
|                                                                |
  −
|  ( college ,                                                  |
  −
|    ( sam ),( gene ),( phil ),( don ),( ned ),( ol ))          |
  −
|                                                                |
  −
|  (( single ),( married ),( divorced ))                        |
  −
|                                                                |
  −
|  ( single ,                                                    |
  −
|    ( art  ),( sam  ),( clyde ),( mike ),                      |
  −
|    ( doug  ),( pete ),( fred  ),( gene ),                      |
  −
|    ( ralph ),( ike  ),( nick  ),( ken  ),( neal ))            |
  −
|                                                                |
  −
|  ( married ,                                                  |
  −
|    ( al  ),( greg ),( john ),( lance ),( phil ),              |
  −
|    ( don ),( ned  ),( karl ),( earl  ),( ol  ))              |
  −
|                                                                |
  −
|  ( divorced ,                                                  |
  −
|    ( jim ),( george ),( rick ),( dave ))                      |
  −
|                                                                |
  −
|  (( bookie ),( burglar ),( pusher ))                          |
  −
|                                                                |
  −
|  ( bookie ,                                                    |
  −
|    ( sam  ),( clyde ),( mike ),( doug ),                      |
  −
|    ( pete ),( ike  ),( ned  ),( karl ),( neal ))              |
  −
|                                                                |
  −
|  ( burglar ,                                                  |
  −
|    ( al    ),( jim ),( john ),( lance ),                      |
  −
|    ( george ),( don ),( ken  ),( earl  ),( rick ))            |
  −
|                                                                |
  −
|  ( pusher ,                                                    |
  −
|    ( art  ),( greg ),( fred ),( gene ),                      |
  −
|    ( ralph ),( phil ),( nick ),( ol  ),( dave ))              |
  −
|                                                                |
  −
o----------------------------------------------------------------o
  −
 
  −
We now apply Study to the proposition that
  −
defines the Jets and Sharks knowledge base,
  −
that is to say, the knowledge that we are
  −
given about the Jets and Sharks, not the
  −
knowledge that the Jets and Sharks have.
  −
 
  −
With a query on the name "ken" we obtain the following
  −
output, giving all of the features associated with Ken:
  −
 
  −
Sense Outline:  JAS & Ken
  −
o---------------------------------------o
  −
| ken                                  |
  −
|  sharks                              |
  −
|  20's                                |
  −
|    high_school                        |
  −
|    single                            |
  −
|      burglar                          |
  −
o---------------------------------------o
  −
 
  −
With a query on the two features "college" and "sharks"
  −
we obtain the following outline of all of the features
  −
that satisfy these constraints:
  −
 
  −
Sense Outline:  JAS & College & Sharks
  −
o---------------------------------------o
  −
| college                              |
  −
|  sharks                              |
  −
|  30's                                |
  −
|    married                            |
  −
|    bookie                            |
  −
|      ned                              |
  −
|    burglar                          |
  −
|      don                              |
  −
|    pusher                            |
  −
|      phil                            |
  −
|      ol                              |
  −
o---------------------------------------o
  −
 
  −
From this we discover that all college Sharks
  −
are 30-something and married.  Furthermore,
  −
we have a complete listing of their names
  −
broken down by occupation, as I have no
  −
doubt that all of them will be in time.
  −
 
  −
| Reference:
  −
|
  −
| McClelland, James L. & Rumelhart, David E.,
  −
|'Explorations in Parallel Distributed Processing:
  −
| A Handbook of Models, Programs, and Exercises',
  −
| MIT Press, Cambridge, MA, 1988.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
One of the issues that my pondering weak and weary over
  −
has caused me to burn not a few barrels of midnight oil
  −
over the past elventeen years or so is the relationship
  −
among divers and sundry "styles of inference", by which
  −
I mean particular choices of inference paradigms, rules,
  −
or schemata.  The chief breakpoint seems to lie between
  −
information-losing and information-maintaining modes of
  −
inference, also called "implicational" and "equational",
  −
or "projective" and "preservative" brands, respectively.
  −
 
  −
Since it appears to be mostly the implicational and projective
  −
styles of inference that are more familiar to folks hereabouts,
  −
I will start off this subdiscussion by introducing a number of
  −
risibly simple but reasonably manageable examples of the other
  −
brand of inference, treated as equational reasoning approaches
  −
to problems about satisfying "zeroth order constraints" (ZOC's).
  −
 
  −
Applications of a Propositional Calculator:
  −
Constraint Satisfaction Problems.
  −
Jon Awbrey, April 24, 1995.
  −
 
  −
The Four Houses Puzzle
  −
 
  −
Constructed on the model of the "Five Houses Puzzle" in [VaH, 132-136].
  −
 
  −
Problem Statement.  Four people with different nationalities live in the
  −
first four houses of a street.  They practice four distinct professions,
  −
and each of them has a favorite animal, all of them different.  The four
  −
houses are painted different colors.  The following facts are known:
  −
 
  −
|  1.  The Englander lives in the first house on the left.
  −
|  2.  The doctor lives in the second house.
  −
|  3.  The third house is painted red.
  −
|  4.  The zebra is a favorite in the fourth house.
  −
|  5.  The person in the first house has a dog.
  −
|  6.  The Japanese lives in the third house.
  −
|  7.  The red house is on the left of the yellow one.
  −
|  8.  They breed snails in the house to the right of the doctor.
  −
|  9.  The Englander lives next to the green house.
  −
| 10.  The fox is in the house next to the diplomat.
  −
| 11.  The Spaniard likes zebras.
  −
| 12.  The Japanese is a painter.
  −
| 13.  The Italian lives in the green house.
  −
| 14.  The violinist lives in the yellow house.
  −
| 15.  The dog is a pet in the blue house.
  −
| 16.  The doctor keeps a fox.
  −
 
  −
The problem is to find all of the assignments of
  −
features to houses that satisfy these requirements.
  −
 
  −
Logical Input File:  House^4.Log
  −
o---------------------------------------------------------------------o
  −
|                                                                    |
  −
|  eng_1  doc_2  red_3  zeb_4  dog_1  jap_3                          |
  −
|                                                                    |
  −
|  (( red_1  yel_2 ),( red_2  yel_3 ),( red_3  yel_4 ))              |
  −
|  (( doc_1  sna_2 ),( doc_2  sna_3 ),( doc_3  sna_4 ))              |
  −
|                                                                    |
  −
|  (( eng_1  gre_2 ),                                                |
  −
|  ( eng_2  gre_3 ),( eng_2  gre_1 ),                                |
  −
|  ( eng_3  gre_4 ),( eng_3  gre_2 ),                                |
  −
|                    ( eng_4  gre_3 ))                                |
  −
|                                                                    |
  −
|  (( dip_1  fox_2 ),                                                |
  −
|  ( dip_2  fox_3 ),( dip_2  fox_1 ),                                |
  −
|  ( dip_3  fox_4 ),( dip_3  fox_2 ),                                |
  −
|                    ( dip_4  fox_3 ))                                |
  −
|                                                                    |
  −
|  (( spa_1 zeb_1 ),( spa_2 zeb_2 ),( spa_3 zeb_3 ),( spa_4 zeb_4 ))  |
  −
|  (( jap_1 pai_1 ),( jap_2 pai_2 ),( jap_3 pai_3 ),( jap_4 pai_4 ))  |
  −
|  (( ita_1 gre_1 ),( ita_2 gre_2 ),( ita_3 gre_3 ),( ita_4 gre_4 ))  |
  −
|                                                                    |
  −
|  (( yel_1 vio_1 ),( yel_2 vio_2 ),( yel_3 vio_3 ),( yel_4 vio_4 ))  |
  −
|  (( blu_1 dog_1 ),( blu_2 dog_2 ),( blu_3 dog_3 ),( blu_4 dog_4 ))  |
  −
|                                                                    |
  −
|  (( doc_1 fox_1 ),( doc_2 fox_2 ),( doc_3 fox_3 ),( doc_4 fox_4 ))  |
  −
|                                                                    |
  −
|  ((                                                                |
  −
|                                                                    |
  −
|  (( eng_1 ),( eng_2 ),( eng_3 ),( eng_4 ))                          |
  −
|  (( spa_1 ),( spa_2 ),( spa_3 ),( spa_4 ))                          |
  −
|  (( jap_1 ),( jap_2 ),( jap_3 ),( jap_4 ))                          |
  −
|  (( ita_1 ),( ita_2 ),( ita_3 ),( ita_4 ))                          |
  −
|                                                                    |
  −
|  (( eng_1 ),( spa_1 ),( jap_1 ),( ita_1 ))                          |
  −
|  (( eng_2 ),( spa_2 ),( jap_2 ),( ita_2 ))                          |
  −
|  (( eng_3 ),( spa_3 ),( jap_3 ),( ita_3 ))                          |
  −
|  (( eng_4 ),( spa_4 ),( jap_4 ),( ita_4 ))                          |
  −
|                                                                    |
  −
|  (( gre_1 ),( gre_2 ),( gre_3 ),( gre_4 ))                          |
  −
|  (( red_1 ),( red_2 ),( red_3 ),( red_4 ))                          |
  −
|  (( yel_1 ),( yel_2 ),( yel_3 ),( yel_4 ))                          |
  −
|  (( blu_1 ),( blu_2 ),( blu_3 ),( blu_4 ))                          |
  −
|                                                                    |
  −
|  (( gre_1 ),( red_1 ),( yel_1 ),( blu_1 ))                          |
  −
|  (( gre_2 ),( red_2 ),( yel_2 ),( blu_2 ))                          |
  −
|  (( gre_3 ),( red_3 ),( yel_3 ),( blu_3 ))                          |
  −
|  (( gre_4 ),( red_4 ),( yel_4 ),( blu_4 ))                          |
  −
|                                                                    |
  −
|  (( pai_1 ),( pai_2 ),( pai_3 ),( pai_4 ))                          |
  −
|  (( dip_1 ),( dip_2 ),( dip_3 ),( dip_4 ))                          |
  −
|  (( vio_1 ),( vio_2 ),( vio_3 ),( vio_4 ))                          |
  −
|  (( doc_1 ),( doc_2 ),( doc_3 ),( doc_4 ))                          |
  −
|                                                                    |
  −
|  (( pai_1 ),( dip_1 ),( vio_1 ),( doc_1 ))                          |
  −
|  (( pai_2 ),( dip_2 ),( vio_2 ),( doc_2 ))                          |
  −
|  (( pai_3 ),( dip_3 ),( vio_3 ),( doc_3 ))                          |
  −
|  (( pai_4 ),( dip_4 ),( vio_4 ),( doc_4 ))                          |
  −
|                                                                    |
  −
|  (( dog_1 ),( dog_2 ),( dog_3 ),( dog_4 ))                          |
  −
|  (( zeb_1 ),( zeb_2 ),( zeb_3 ),( zeb_4 ))                          |
  −
|  (( fox_1 ),( fox_2 ),( fox_3 ),( fox_4 ))                          |
  −
|  (( sna_1 ),( sna_2 ),( sna_3 ),( sna_4 ))                          |
  −
|                                                                    |
  −
|  (( dog_1 ),( zeb_1 ),( fox_1 ),( sna_1 ))                          |
  −
|  (( dog_2 ),( zeb_2 ),( fox_2 ),( sna_2 ))                          |
  −
|  (( dog_3 ),( zeb_3 ),( fox_3 ),( sna_3 ))                          |
  −
|  (( dog_4 ),( zeb_4 ),( fox_4 ),( sna_4 ))                          |
  −
|                                                                    |
  −
|  ))                                                                |
  −
|                                                                    |
  −
o---------------------------------------------------------------------o
  −
 
  −
Sense Outline:  House^4.Sen
  −
o-----------------------------o
  −
| eng_1                      |
  −
|  doc_2                      |
  −
|  red_3                    |
  −
|    zeb_4                    |
  −
|    dog_1                  |
  −
|      jap_3                  |
  −
|      yel_4                |
  −
|        sna_3                |
  −
|        gre_2              |
  −
|          dip_1              |
  −
|          fox_2            |
  −
|            spa_4            |
  −
|            pai_3          |
  −
|              ita_2          |
  −
|              vio_4        |
  −
|                blu_1        |
  −
o-----------------------------o
  −
 
  −
Table 1.  Solution to the Four Houses Puzzle
  −
o------------o------------o------------o------------o------------o
  −
|            | House 1    | House 2    | House 3    | House 4    |
  −
o------------o------------o------------o------------o------------o
  −
| Nation    | England    | Italy      | Japan      | Spain      |
  −
| Color      | blue      | green      | red        | yellow    |
  −
| Profession | diplomat  | doctor    | painter    | violinist  |
  −
| Animal    | dog        | fox        | snails    | zebra      |
  −
o------------o------------o------------o------------o------------o
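By way of an independent cross-check, here is a minimal Python sketch (an illustration of mine, not part of the Theme One toolkit; the dictionaries are transcribed by hand from Table 1 and the sense outline above) that verifies the outlined features agree with the table and that each attribute value is used by exactly one house:

# Cross-check of the Four Houses solution (illustrative sketch only).
solution = {
    1: dict(nation="eng", color="blu", profession="dip", animal="dog"),
    2: dict(nation="ita", color="gre", profession="doc", animal="fox"),
    3: dict(nation="jap", color="red", profession="pai", animal="sna"),
    4: dict(nation="spa", color="yel", profession="vio", animal="zeb"),
}

# Features read off the sense outline House^4.Sen, e.g. "eng_1", "doc_2", ...
outline = {"eng_1", "doc_2", "red_3", "zeb_4", "dog_1", "jap_3", "yel_4",
           "sna_3", "gre_2", "dip_1", "fox_2", "spa_4", "pai_3", "ita_2",
           "vio_4", "blu_1"}

# Every outlined feature should appear in the table ...
table_features = {f"{value}_{house}"
                  for house, attrs in solution.items()
                  for value in attrs.values()}
assert outline <= table_features

# ... and each attribute value should be used by exactly one house.
for key in ("nation", "color", "profession", "animal"):
    values = [attrs[key] for attrs in solution.values()]
    assert len(set(values)) == 4, f"duplicate {key}"

print("Table 1 is consistent with the sense outline.")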
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
First off, I do not trivialize the "real issues of ontology".  Indeed,
it is precisely my estimate of the non-trivial difficulty of this task,
of formulating the types of "generic ontology" that we propose to do here,
that forces me to choose, and to point out the inescapability of, the approach
that I am currently taking, which is to enter on the necessary preliminary of
building up the logical tools that we need to tackle the ontology task proper.
And I would say, to the contrary, that it is those who think we can arrive at
a working general ontology by sitting on the porch shooting the breeze about
"what it is" until the cows come home -- that is, the method for which it
has become cliché to indict the Ancient Greeks, though, if truth be told,
we'd have to look to the pre-Socratics and the pre-Stoics to find a good
match for the kinds of revelation that are common hereabouts -- I would
say that it is those folks who trivialize the "real issues of ontology".
  −
 
  −
A person, living in our times, who is serious about knowing the being of things,
really only has one choice -- to pick the tiny domain of things he or she just
has to know about the most, thence to hie away to the adept gurus of the matter
in question, forgetting the rest, because "general ontology" is a no-go these days.
It is presently in a state like astronomy before telescopes, which means that it
is not yet entirely able to distinguish itself from astrology and other psychically
projective exercises of wishful and dreadful thinking.
  −
 
  −
So I am busy grinding lenses ...
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
DM = Douglas McDavid
  −
 
  −
DM: Thanks for both the original and additional response.  I'm not trying to
  −
    single you out, as I have been picking  on various postings in a similar
  −
    manner ever since I started contributing to this discussion.  I agree with
  −
    you that the task of this working group is non-trivially difficult.  In fact,
  −
    I believe we are still a long way from a clear and useful agreement about what
  −
    constitutes "upper" ontology, and what it would mean to standardize it.  However,
  −
    I don't agree that the only place to make progress is in tiny domains of things.
  −
    I've contributed the thought that a fundamental, upper-level concept is the
  −
    concept of system, and that that would be a good place to begin.  And I'll
  −
    never be able to refrain from evaluating the content as well as the form
  −
    of any examples presented for consideration here.  Probably should
  −
    accompany these comments with a ;-)
  −
 
  −
There will never be a standard universal ontology
of the absolute, essential, imperturbable, monolithic
variety that some people still dream of in their
fantasies of spectating on and speculating about
a pre-relativistically non-participatory universe
from their singular but isolated god's-eye views.
The bells tolled for that one many years ago,
but some of the more blithe of the blissful
islanders have just not gotten the news yet.
  −
 
  −
But there is still a lot to do that would be useful
  −
under the banner of a "standard upper ontology",
  −
if only we stay loose in our interpretation
  −
of what that implies in practical terms.
  −
 
  −
One likely approach to the problem would be to take
  −
a hint from the afore-allusioned history of physics --
  −
to inquire for whom, else, the bell tolls -- and to
  −
see if there are any bits of wisdom from that prior
  −
round of collective experience that can be adapted
  −
by dint of analogy to our present predicament.
  −
I happen to think that there are.
  −
 
  −
And there the answer was, not to try and force a return,
  −
though lord knows they all gave it their very best shot,
  −
to an absolute and imperturbable framework of existence,
  −
but to see the reciprocal participant relation that all
  −
partakers have to the constitution of that framing, yes,
  −
even unto those who would abdictators and abstainees be.
  −
 
  −
But what does that imply about some shred of a standard?
It means that we are better off seeking, not a standard,
one-size-fits-all ontology, but more standard resources
for interrelating diverse points of view and for
transforming the data that is gathered from one perspective
in ways that allow it to be compared most appropriately with
the data that is gathered from other standpoints on the
splendorous observational scenes and theorematic stages.
  −
 
  −
That is what I am working on.
  −
And it hasn't been merely
  −
for a couple of years.
  −
 
  −
As to this bit:
  −
 
  −
o-------------------------------------------------o
  −
|                                                |
  −
|  ( living_thing , non_living )                |
  −
|                                                |
  −
|  (( animal ),( vegetable ),( mineral ))        |
  −
|                                                |
  −
|  ( living_thing ,( animal ),( vegetable ))    |
  −
|                                                |
  −
|  ( mineral ( non_living ))                    |
  −
|                                                |
  −
o-------------------------------------------------o
  −
 
  −
My 5-dimensional Example, which I borrowed from some indifferent source
of what is commonly recognized as "common sense" -- and I think rather
obviously designed more for the classification of pre-modern species
of whole critters and pure matters of natural substance than the
motley mixture of un/natural and in/organic conglouterites that
we find served up on the menu of modernity -- was not intended
even as a toy ontology, but simply as an expository
example, concocted for the sake of illustrating the sorts
of logical interaction that occur among four different
patterns of logical constraint, all of which types
arise all the time no matter what the domain, and
which I believe my novel forms of expression,
syntactically speaking, express quite succinctly,
especially when you contemplate the complexities
of the computation that may flow and must follow
from even these meagre propositional expressions.
  −
 
  −
Yes, systems -- but -- even here usage differs in significant ways.
  −
I have spent ten years now trying to integrate my earlier efforts
  −
under an explicit systems banner, but even within the bounds of
  −
a systems engineering programme at one site there is a wide
  −
semantic dispersion that issues from this word "system".
  −
I am committed, and in writing, to taking what we so
  −
glibly and prospectively call "intelligent systems"
  −
seriously as dynamical systems.  That has many
  −
consequences, and I have to pick and choose
  −
which of those I may be suited to follow.
  −
 
  −
But that is too long a story for now ...
  −
 
  −
";-)"?
  −
 
  −
Somehow that has always looked like
the Cheshire Cat's grin to me ...
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
By way of catering to popular demand, I have decided to
render this symposium a bit more à la carte, and thus to
serve up, as faster food than heretofore, a choice selection
of the more sumptuous bits that I have in my logical larder,
not yet full fare, by any means, but a sample of what might
one day approach being an abundantly moveable feast of
ontological contents and general metaphysical delights.
I'll leave it to you to name your poison, as it were.
  −
 
  −
Applications of a Propositional Calculator:
  −
Constraint Satisfaction Problems.
  −
Jon Awbrey, April 24, 1995.
  −
 
  −
Fabric Knowledge Base
  −
Based on the example in [MaW, pages 8-16].
  −
 
  −
Logical Input File:  Fab.Log
  −
o---------------------------------------------------------------------o
  −
|                                                                    |
  −
| (has_floats , plain_weave )                                        |
  −
| (has_floats ,(twill_weave ),(satin_weave ))                        |
  −
|                                                                    |
  −
| (plain_weave ,                                                      |
  −
|  (plain_weave  one_color ),                                        |
  −
|  (color_groups  ),                                                  |
  −
|  (grouped_warps ),                                                  |
  −
|  (some_thicker  ),                                                  |
  −
|  (crossed_warps ),                                                  |
  −
|  (loop_threads  ),                                                  |
  −
|  (plain_weave  flannel ))                                          |
  −
|                                                                    |
  −
| (plain_weave  one_color  cotton  balanced  smooth  ,(percale ))    |
  −
| (plain_weave  one_color  cotton            sheer  ,(organdy ))    |
  −
| (plain_weave  one_color  silk              sheer  ,(organza ))    |
  −
|                                                                    |
  −
| (plain_weave  color_groups  warp_stripe  fill_stripe ,(plaid  ))  |
  −
| (plaid        equal_stripe                          ,(gingham ))  |
  −
|                                                                    |
  −
| (plain_weave  grouped_warps ,(basket_weave ))                      |
  −
|                                                                    |
  −
| (basket_weave  typed ,                                              |
  −
|  (type_2_to_1 ),                                                    |
  −
|  (type_2_to_2 ),                                                    |
  −
|  (type_4_to_4 ))                                                    |
  −
|                                                                    |
  −
| (basket_weave  typed  type_2_to_1  thicker_fill  ,(oxford      )) |
  −
| (basket_weave  typed (type_2_to_2 ,                                |
  −
|                      type_4_to_4 ) same_thickness ,(monks_cloth )) |
  −
| (basket_weave (typed )              rough  open    ,(hopsacking  )) |
  −
|                                                                    |
  −
| (typed (basket_weave ))                                            |
  −
|                                                                    |
  −
| (basket_weave ,(oxford ),(monks_cloth ),(hopsacking ))              |
  −
|                                                                    |
  −
| (plain_weave  some_thicker ,(ribbed_weave ))                      |
  −
|                                                                    |
  −
| (ribbed_weave ,(small_rib ),(medium_rib ),(heavy_rib ))            |
  −
| (ribbed_weave ,(flat_rib  ),(round_rib ))                          |
  −
|                                                                    |
  −
| (ribbed_weave  thicker_fill          ,(cross_ribbed ))              |
  −
| (cross_ribbed  small_rib  flat_rib  ,(faille      ))              |
  −
| (cross_ribbed  small_rib  round_rib ,(grosgrain    ))              |
  −
| (cross_ribbed  medium_rib  round_rib ,(bengaline    ))              |
  −
| (cross_ribbed  heavy_rib  round_rib ,(ottoman      ))              |
  −
|                                                                    |
  −
| (cross_ribbed ,(faille ),(grosgrain ),(bengaline ),(ottoman ))      |
  −
|                                                                    |
  −
| (plain_weave  crossed_warps ,(leno_weave  ))                        |
  −
| (leno_weave  open          ,(marquisette ))                        |
  −
| (plain_weave  loop_threads  ,(pile_weave ))                        |
  −
|                                                                    |
  −
| (pile_weave ,(fill_pile ),(warp_pile ))                            |
  −
| (pile_weave ,(cut ),(uncut ))                                      |
  −
|                                                                    |
  −
| (pile_weave  warp_pile  cut                  ,(velvet    ))        |
  −
| (pile_weave  fill_pile  cut    aligned_pile  ,(corduroy  ))        |
  −
| (pile_weave  fill_pile  cut    staggered_pile ,(velveteen ))        |
  −
| (pile_weave  fill_pile  uncut  reversible    ,(terry    ))        |
  −
|                                                                    |
  −
| (pile_weave  fill_pile  cut ( (aligned_pile , staggered_pile ) ))  |
  −
|                                                                    |
  −
| (pile_weave ,(velvet ),(corduroy ),(velveteen ),(terry ))          |
  −
|                                                                    |
  −
| (plain_weave ,                                                      |
  −
|  (percale    ),(organdy    ),(organza    ),(plaid  ),            |
  −
|  (oxford    ),(monks_cloth ),(hopsacking ),                        |
  −
|  (faille    ),(grosgrain  ),(bengaline  ),(ottoman ),            |
  −
|  (leno_weave ),(pile_weave  ),(plain_weave  flannel ))            |
  −
|                                                                    |
  −
| (twill_weave ,                                                      |
  −
|  (warp_faced ),                                                    |
  −
|  (filling_faced ),                                                  |
  −
|  (even_twill ),                                                    |
  −
|  (twill_weave  flannel ))                                          |
  −
|                                                                    |
  −
| (twill_weave  warp_faced  colored_warp  white_fill ,(denim ))      |
  −
| (twill_weave  warp_faced  one_color                ,(drill ))      |
  −
| (twill_weave  even_twill  diagonal_rib            ,(serge ))      |
  −
|                                                                    |
  −
| (twill_weave  warp_faced (                                          |
  −
|  (one_color ,                                                      |
  −
|  ((colored_warp )(white_fill )) )                                  |
  −
| ))                                                                  |
  −
|                                                                    |
  −
| (twill_weave  warp_faced ,(denim ),(drill ))                        |
  −
| (twill_weave  even_twill ,(serge ))                                |
  −
|                                                                    |
  −
| ((                                                                  |
  −
|    (  ((plain_weave )(twill_weave ))                              |
  −
|        ((cotton      )(wool        )) napped ,(flannel ))          |
  −
| ))                                                                  |
  −
|                                                                    |
  −
| (satin_weave ,(warp_floats ),(fill_floats ))                        |
  −
|                                                                    |
  −
| (satin_weave ,(satin_weave smooth ),(satin_weave napped ))          |
  −
| (satin_weave ,(satin_weave cotton ),(satin_weave silk  ))          |
  −
|                                                                    |
  −
| (satin_weave  warp_floats  smooth        ,(satin    ))            |
  −
| (satin_weave  fill_floats  smooth        ,(sateen  ))            |
  −
| (satin_weave              napped  cotton ,(moleskin ))            |
  −
|                                                                    |
  −
| (satin_weave ,(satin ),(sateen ),(moleskin ))                      |
  −
|                                                                    |
  −
o---------------------------------------------------------------------o
  −
 
  −
| Reference [MaW]
  −
|
  −
| Maier, David & Warren, David S.,
  −
|'Computing with Logic:  Logic Programming with Prolog',
  −
| Benjamin/Cummings, Menlo Park, CA, 1988.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
I think that it might be a good idea to go back to a simpler example
  −
of a constraint satisfaction problem, and to discuss the elements of
  −
its expression as a ZOT in a less cluttered setting before advancing
  −
onward once again to problems on the order of the Four Houses Puzzle.
  −
 
  −
| Applications of a Propositional Calculator:
  −
| Constraint Satisfaction Problems.
  −
| Jon Awbrey, April 24, 1995.
  −
 
  −
Graph Coloring
  −
 
  −
Based on the discussion in [Wil, page 196].
  −
 
  −
One is given three colors, say, orange, silver, indigo,
  −
and a graph on four nodes that has the following shape:
  −
 
  −
|          1
  −
|          o
  −
|        / \
  −
|        /  \
  −
|    4 o-----o 2
  −
|        \  /
  −
|        \ /
  −
|          o
  −
|          3
  −
 
  −
The problem is to color the nodes of the graph
in such a way that no two nodes that are
adjacent in the graph, that is, linked by
an edge, get the same color.
  −
 
  −
The objective situation that is to be achieved can be represented
  −
in a so-called "declarative" fashion, in effect, by employing the
  −
cactus language as a very simple sort of declarative programming
  −
language, and depicting the prospective solution to the problem
  −
as a ZOT.
  −
 
  −
To do this, begin by declaring the following set of
  −
twelve boolean variables or "zeroth order features":
  −
 
  −
{1_orange, 1_silver, 1_indigo,
  −
2_orange, 2_silver, 2_indigo,
  −
3_orange, 3_silver, 3_indigo,
  −
4_orange, 4_silver, 4_indigo}
  −
 
  −
The interpretation to keep in mind is that
a feature name of the form "<node i>_<color j>"
says that node i is assigned color j.
  −
 
  −
Logical Input File:  Color.Log
  −
o----------------------------------------------------------------------o
  −
|                                                                      |
  −
|  (( 1_orange ),( 1_silver ),( 1_indigo ))                            |
  −
|  (( 2_orange ),( 2_silver ),( 2_indigo ))                            |
  −
|  (( 3_orange ),( 3_silver ),( 3_indigo ))                            |
  −
|  (( 4_orange ),( 4_silver ),( 4_indigo ))                            |
  −
|                                                                      |
  −
|  ( 1_orange  2_orange )( 1_silver  2_silver )( 1_indigo  2_indigo )  |
  −
|  ( 1_orange  4_orange )( 1_silver  4_silver )( 1_indigo  4_indigo )  |
  −
|  ( 2_orange  3_orange )( 2_silver  3_silver )( 2_indigo  3_indigo )  |
  −
|  ( 2_orange  4_orange )( 2_silver  4_silver )( 2_indigo  4_indigo )  |
  −
|  ( 3_orange  4_orange )( 3_silver  4_silver )( 3_indigo  4_indigo )  |
  −
|                                                                      |
  −
o----------------------------------------------------------------------o
  −
 
  −
The first stanza of verses declares that
every node is assigned exactly one color.

The second stanza of verses declares that
no two adjacent nodes get the very same color.

Each satisfying interpretation of this ZOT,
here doing double duty as a program, corresponds
to what graffitists call a "coloring" of the graph.

Theme One's Model interpreter, when we set
it to work on this ZOT, will array before
our eyes all of the colorings of the graph.
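For readers who prefer a more familiar procedural dress, here is a minimal brute-force sketch in Python (an illustration of mine, not part of Theme One) that enumerates the assignments satisfying the two stanzas above; it prints the same six colorings that the Model interpreter unfolds:

from itertools import product

colors = ("orange", "silver", "indigo")
edges  = [(1, 2), (1, 4), (2, 3), (2, 4), (3, 4)]     # the graph drawn above

# First stanza: every node gets exactly one color (one choice per node).
# Second stanza: no two adjacent nodes get the same color.
colorings = [
    assignment
    for assignment in product(colors, repeat=4)       # nodes 1..4 in order
    if all(assignment[i - 1] != assignment[j - 1] for i, j in edges)
]

for coloring in colorings:
    print({node: color for node, color in enumerate(coloring, start=1)})

print(len(colorings), "colorings")                    # expect 6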
  −
 
  −
Sense Outline:  Color.Sen
  −
o-----------------------------o
  −
| 1_orange                    |
  −
|  2_silver                  |
  −
|  3_orange                  |
  −
|    4_indigo                |
  −
|  2_indigo                  |
  −
|  3_orange                  |
  −
|    4_silver                |
  −
| 1_silver                    |
  −
|  2_orange                  |
  −
|  3_silver                  |
  −
|    4_indigo                |
  −
|  2_indigo                  |
  −
|  3_silver                  |
  −
|    4_orange                |
  −
| 1_indigo                    |
  −
|  2_orange                  |
  −
|  3_indigo                  |
  −
|    4_silver                |
  −
|  2_silver                  |
  −
|  3_indigo                  |
  −
|    4_orange                |
  −
o-----------------------------o
  −
 
  −
| Reference [Wil]
  −
|
  −
| Wilf, Herbert S.,
  −
|'Algorithms and Complexity',
  −
| Prentice-Hall, Englewood Cliffs, NJ, 1986.
  −
|
  −
| Nota Bene.  There is a wrong Figure in some
  −
| printings of the book, that does not match
  −
| the description of the Example that is
  −
| given in the text.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Let us continue to examine the properties of the cactus language
  −
as a minimal style of declarative programming language.  Even in
  −
the likes of this zeroth order microcosm one can observe, and on
  −
a good day still more clearly for the lack of other distractions,
  −
many of the buzz worlds that will spring into full bloom, almost
  −
as if from nowhere, to become the first order of business in the
  −
latter day logical organa, plus combinators, plus lambda calculi.
  −
 
  −
By way of homage to the classics of the art, I can hardly pass
  −
this way without paying my dues to the next sample of examples.
  −
 
  −
N Queens Problem
  −
 
  −
I will give the ZOT that describes the N Queens Problem for N = 5,
  −
since that is the most that I and my old 286 could do when last I
  −
wrote up this Example.
  −
 
  −
The problem is now to write a "zeroth order program" (ZOP) that
  −
describes the following objective:  To place 5 chess queens on
  −
a 5 by 5 chessboard so that no queen attacks any other queen.
  −
 
  −
It is clear that there can be at most one queen on each row
of the board, and so, by dint of regal necessity, exactly one
queen in each row of the desired array.  This gambit allows
us to reduce the problem to one of picking a permutation of
five things in five places, and this affords us sufficient
clue to begin down a likely path toward the intended object,
by recruiting the following phalanx of 25 logical variables:
  −
 
  −
Literal Input File:  Q5.Lit
  −
o---------------------------------------o
  −
|                                      |
  −
|  q1_r1, q1_r2, q1_r3, q1_r4, q1_r5,  |
  −
|  q2_r1, q2_r2, q2_r3, q2_r4, q2_r5,  |
  −
|  q3_r1, q3_r2, q3_r3, q3_r4, q3_r5,  |
  −
|  q4_r1, q4_r2, q4_r3, q4_r4, q4_r5,  |
  −
|  q5_r1, q5_r2, q5_r3, q5_r4, q5_r5.  |
  −
|                                      |
  −
o---------------------------------------o
  −
 
  −
Thus we seek to define a function, of abstract type f : %B%^25 -> %B%,
  −
whose fibre of truth f^(-1)(%1%) is a set of interpretations, each of
  −
whose elements bears the abstract type of a point in the space %B%^25,
  −
and whose reading will inform us of our desired set of configurations.
  −
 
  −
Logical Input File:  Q5.Log
  −
o------------------------------------------------------------o
  −
|                                                            |
  −
|  ((q1_r1 ),(q1_r2 ),(q1_r3 ),(q1_r4 ),(q1_r5 ))            |
  −
|  ((q2_r1 ),(q2_r2 ),(q2_r3 ),(q2_r4 ),(q2_r5 ))            |
  −
|  ((q3_r1 ),(q3_r2 ),(q3_r3 ),(q3_r4 ),(q3_r5 ))            |
  −
|  ((q4_r1 ),(q4_r2 ),(q4_r3 ),(q4_r4 ),(q4_r5 ))            |
  −
|  ((q5_r1 ),(q5_r2 ),(q5_r3 ),(q5_r4 ),(q5_r5 ))            |
  −
|                                                            |
  −
|  ((q1_r1 ),(q2_r1 ),(q3_r1 ),(q4_r1 ),(q5_r1 ))            |
  −
|  ((q1_r2 ),(q2_r2 ),(q3_r2 ),(q4_r2 ),(q5_r2 ))            |
  −
|  ((q1_r3 ),(q2_r3 ),(q3_r3 ),(q4_r3 ),(q5_r3 ))            |
  −
|  ((q1_r4 ),(q2_r4 ),(q3_r4 ),(q4_r4 ),(q5_r4 ))            |
  −
|  ((q1_r5 ),(q2_r5 ),(q3_r5 ),(q4_r5 ),(q5_r5 ))            |
  −
|                                                            |
  −
|  ((                                                        |
  −
|                                                            |
  −
|  (q1_r1 q2_r2 )(q1_r1 q3_r3 )(q1_r1 q4_r4 )(q1_r1 q5_r5 )  |
  −
|                (q2_r2 q3_r3 )(q2_r2 q4_r4 )(q2_r2 q5_r5 )  |
  −
|                              (q3_r3 q4_r4 )(q3_r3 q5_r5 )  |
  −
|                                            (q4_r4 q5_r5 )  |
  −
|                                                            |
  −
|  (q1_r2 q2_r3 )(q1_r2 q3_r4 )(q1_r2 q4_r5 )                |
  −
|                (q2_r3 q3_r4 )(q2_r3 q4_r5 )                |
  −
|                              (q3_r4 q4_r5 )                |
  −
|                                                            |
  −
|  (q1_r3 q2_r4 )(q1_r3 q3_r5 )                              |
  −
|                (q2_r4 q3_r5 )                              |
  −
|                                                            |
  −
|  (q1_r4 q2_r5 )                                            |
  −
|                                                            |
  −
|  (q2_r1 q3_r2 )(q2_r1 q4_r3 )(q2_r1 q5_r4 )                |
  −
|                (q3_r2 q4_r3 )(q3_r2 q5_r4 )                |
  −
|                              (q4_r3 q5_r4 )                |
  −
|                                                            |
  −
|  (q3_r1 q4_r2 )(q3_r1 q5_r3 )                              |
  −
|                (q4_r2 q5_r3 )                              |
  −
|                                                            |
  −
|  (q4_r1 q5_r2 )                                            |
  −
|                                                            |
  −
|  (q1_r5 q2_r4 )(q1_r5 q3_r3 )(q1_r5 q4_r2 )(q1_r5 q5_r1 )  |
  −
|                (q2_r4 q3_r3 )(q2_r4 q4_r2 )(q2_r4 q5_r1 )  |
  −
|                              (q3_r3 q4_r2 )(q3_r3 q5_r1 )  |
  −
|                                            (q4_r2 q5_r1 )  |
  −
|                                                            |
  −
|  (q2_r5 q3_r4 )(q2_r5 q4_r3 )(q2_r5 q5_r2 )                |
  −
|                (q3_r4 q4_r3 )(q3_r4 q5_r2 )                |
  −
|                              (q4_r3 q5_r2 )                |
  −
|                                                            |
  −
|  (q3_r5 q4_r4 )(q3_r5 q5_r3 )                              |
  −
|                (q4_r4 q5_r3 )                              |
  −
|                                                            |
  −
|  (q4_r5 q5_r4 )                                            |
  −
|                                                            |
  −
|  (q1_r4 q2_r3 )(q1_r4 q3_r2 )(q1_r4 q4_r1 )                |
  −
|                (q2_r3 q3_r2 )(q2_r3 q4_r1 )                |
  −
|                              (q3_r2 q4_r1 )                |
  −
|                                                            |
  −
|  (q1_r3 q2_r2 )(q1_r3 q3_r1 )                              |
  −
|                (q2_r2 q3_r1 )                              |
  −
|                                                            |
  −
|  (q1_r2 q2_r1 )                                            |
  −
|                                                            |
  −
|  ))                                                        |
  −
|                                                            |
  −
o------------------------------------------------------------o
  −
 
  −
The vanguard of this logical regiment consists of two
stock'a'block platoons, the pattern of whose features
is the usual sort of array for conveying permutations.
Between the stations of their respective offices they
serve to warrant that all of the interpretations that
are left standing on the field of valor at the end of
the day will be ones that tell of permutations 5 by 5,
that is, they require each queen to stand in exactly
one row and each row to hold exactly one queen.
The rest of the ruck and the runt of the mill in this
regimental logos are there to cover the diagonal bias
against attacking queens that is our protocol to suit,
in other words, to bar any two queens from sharing
a diagonal.
  −
 
  −
And here is the issue of the day:
  −
 
  −
Sense Output:  Q5.Sen
  −
o-------------------o
  −
| q1_r1            |
  −
|  q2_r3            |
  −
|  q3_r5          |
  −
|    q4_r2          |
  −
|    q5_r4        | <1>
  −
|  q2_r4            |
  −
|  q3_r2          |
  −
|    q4_r5          |
  −
|    q5_r3        | <2>
  −
| q1_r2            |
  −
|  q2_r4            |
  −
|  q3_r1          |
  −
|    q4_r3          |
  −
|    q5_r5        | <3>
  −
|  q2_r5            |
  −
|  q3_r3          |
  −
|    q4_r1          |
  −
|    q5_r4        | <4>
  −
| q1_r3            |
  −
|  q2_r1            |
  −
|  q3_r4          |
  −
|    q4_r2          |
  −
|    q5_r5        | <5>
  −
|  q2_r5            |
  −
|  q3_r2          |
  −
|    q4_r4          |
  −
|    q5_r1        | <6>
  −
| q1_r4            |
  −
|  q2_r1            |
  −
|  q3_r3          |
  −
|    q4_r5          |
  −
|    q5_r2        | <7>
  −
|  q2_r2            |
  −
|  q3_r5          |
  −
|    q4_r3          |
  −
|    q5_r1        | <8>
  −
| q1_r5            |
  −
|  q2_r2            |
  −
|  q3_r4          |
  −
|    q4_r1          |
  −
|    q5_r3        | <9>
  −
|  q2_r3            |
  −
|  q3_r1          |
  −
|    q4_r4          |
  −
|    q5_r2        | <A>
  −
o-------------------o
  −
 
  −
The number of solutions at least checks with all of the best authorities,
so I can breathe a sigh of relief on that account.
I am sure that there just has to be a more clever way to do
this, that is to say, within the bounds of ZOT reason alone,
but the above is the best that I could figure out with the
time that I had at the time.
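For a cross-check outside the cactus calculus, here is a minimal Python sketch (mine, not part of Theme One) that follows the same reduction: the rows of the placement form a permutation of the columns, and the remaining clauses amount to a ban on shared diagonals.  It reports 10 solutions for N = 5, matching the ten sense outlines above:

from itertools import permutations

def queens(n):
    """Yield placements as tuples r, where r[i] is the row of the queen in column i."""
    for rows in permutations(range(n)):               # one queen per column and per row
        if all(abs(rows[i] - rows[j]) != j - i        # no two queens share a diagonal
               for i in range(n) for j in range(i + 1, n)):
            yield rows

solutions = list(queens(5))
print(len(solutions), "solutions")                    # expect 10, as in Q5.Sen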
  −
 
  −
References:  [BaC, 166], [VaH, 122], [Wir, 143].
  −
 
  −
[BaC]  Ball, W.W. Rouse, & Coxeter, H.S.M.,
  −
      'Mathematical Recreations and Essays',
  −
      13th ed., Dover, New York, NY, 1987.
  −
 
  −
[VaH]  Van Hentenryck, Pascal,
       'Constraint Satisfaction in Logic Programming',
       MIT Press, Cambridge, MA, 1989.
  −
 
  −
[Wir]  Wirth, Niklaus,
  −
      'Algorithms + Data Structures = Programs',
  −
      Prentice-Hall, Englewood Cliffs, NJ, 1976.
  −
 
  −
http://mathworld.wolfram.com/QueensProblem.html
  −
http://www.research.att.com/cgi-bin/access.cgi/as/njas/sequences/eisA.cgi?Anum=000170
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
I turn now to another golden oldie of a constraint satisfaction problem
  −
that I would like to give here a slightly new spin, but not so much for
  −
the sake of these trifling novelties as from a sense of old time's ache
  −
and a duty to -- well, what's the opposite of novelty?
  −
 
  −
Phobic Apollo
  −
 
  −
| Suppose Peter, Paul, and Jane are musicians.  One of them plays
  −
| saxophone, another plays guitar, and the third plays drums.  As
  −
| it happens, one of them is afraid of things associated with the
  −
| number 13, another of them is afraid of cats, and the third is
  −
| afraid of heights.  You also know that Peter and the guitarist
  −
| skydive, that Paul and the saxophone player enjoy cats, and
  −
| that the drummer lives in apartment 13 on the 13th floor.
  −
|
  −
| Soon we will want to use these facts to reason
  −
| about whether or not certain identity relations
  −
| hold or are excluded.  Assume X(Peter, Guitarist)
  −
| means "the person who is Peter is not the person who
  −
| plays the guitar".  In this notation, the facts become:
  −
|
  −
| 1.  X(Peter, Guitarist)
  −
| 2.  X(Peter, Fears Heights)
  −
| 3.  X(Guitarist, Fears Heights)
  −
| 4.  X(Paul, Fears Cats)
  −
| 5.  X(Paul, Saxophonist)
  −
| 6.  X(Saxophonist, Fears Cats)
  −
| 7.  X(Drummer, Fears 13)
  −
| 8.  X(Drummer, Fears Heights)
  −
|
  −
| Exercise attributed to Kenneth D. Forbus, pages 449-450 in:
  −
| Patrick Henry Winston, 'Artificial Intelligence', 2nd ed.,
  −
| Addison-Wesley, Reading, MA, 1984.
  −
 
  −
Here is one way to represent these facts in the form of a ZOT
  −
and use it as a logical program to draw a succinct conclusion:
  −
 
  −
Logical Input File:  ConSat.Log
  −
o-----------------------------------------------------------------------o
  −
|                                                                      |
  −
|  (( pete_plays_guitar ),( pete_plays_sax ),( pete_plays_drums ))      |
  −
|  (( paul_plays_guitar ),( paul_plays_sax ),( paul_plays_drums ))      |
  −
|  (( jane_plays_guitar ),( jane_plays_sax ),( jane_plays_drums ))      |
  −
|                                                                      |
  −
|  (( pete_plays_guitar ),( paul_plays_guitar ),( jane_plays_guitar ))  |
  −
|  (( pete_plays_sax    ),( paul_plays_sax    ),( jane_plays_sax    ))  |
  −
|  (( pete_plays_drums  ),( paul_plays_drums  ),( jane_plays_drums  ))  |
  −
|                                                                      |
  −
|  (( pete_fears_13 ),( pete_fears_cats ),( pete_fears_height ))        |
  −
|  (( paul_fears_13 ),( paul_fears_cats ),( paul_fears_height ))        |
  −
|  (( jane_fears_13 ),( jane_fears_cats ),( jane_fears_height ))        |
  −
|                                                                      |
  −
|  (( pete_fears_13    ),( paul_fears_13    ),( jane_fears_13    ))  |
  −
|  (( pete_fears_cats  ),( paul_fears_cats  ),( jane_fears_cats  ))  |
  −
|  (( pete_fears_height ),( paul_fears_height ),( jane_fears_height ))  |
  −
|                                                                      |
  −
|  ((                                                                  |
  −
|                                                                      |
  −
|  ( pete_plays_guitar )                                                |
  −
|  ( pete_fears_height )                                                |
  −
|                                                                      |
  −
|  ( pete_plays_guitar  pete_fears_height )                            |
  −
|  ( paul_plays_guitar  paul_fears_height )                            |
  −
|  ( jane_plays_guitar  jane_fears_height )                            |
  −
|                                                                      |
  −
|  ( paul_fears_cats )                                                  |
  −
|  ( paul_plays_sax  )                                                  |
  −
|                                                                      |
  −
|  ( pete_plays_sax  pete_fears_cats )                                  |
  −
|  ( paul_plays_sax  paul_fears_cats )                                  |
  −
|  ( jane_plays_sax  jane_fears_cats )                                  |
  −
|                                                                      |
  −
|  ( pete_plays_drums  pete_fears_13 )                                  |
  −
|  ( paul_plays_drums  paul_fears_13 )                                  |
  −
|  ( jane_plays_drums  jane_fears_13 )                                  |
  −
|                                                                      |
  −
|  ( pete_plays_drums  pete_fears_height )                              |
  −
|  ( paul_plays_drums  paul_fears_height )                              |
  −
|  ( jane_plays_drums  jane_fears_height )                              |
  −
|                                                                      |
  −
|  ))                                                                  |
  −
|                                                                      |
  −
o-----------------------------------------------------------------------o
  −
 
  −
Sense Outline:  ConSat.Sen
  −
o-----------------------------o
  −
| pete_plays_drums            |
  −
|  paul_plays_guitar          |
  −
|  jane_plays_sax            |
  −
|    pete_fears_cats          |
  −
|    paul_fears_13          |
  −
|      jane_fears_height      |
  −
o-----------------------------o
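The same facts can be checked by brute force.  Here is a minimal Python sketch (mine, not the Theme One mechanism) that tries every way of matching the three people to the three instruments and to the three fears, keeps only the assignments consistent with facts 1 through 8, and arrives at the single answer shown in the outline above:

from itertools import permutations

people      = ("pete", "paul", "jane")
instruments = ("guitar", "sax", "drums")
fears       = ("13", "cats", "height")

solutions = []
for plays in permutations(instruments):               # plays[i] is person i's instrument
    for fear in permutations(fears):                  # fear[i] is person i's phobia
        p = dict(zip(people, plays))
        f = dict(zip(people, fear))
        guitarist = next(x for x in people if p[x] == "guitar")
        saxist    = next(x for x in people if p[x] == "sax")
        drummer   = next(x for x in people if p[x] == "drums")
        facts = (p["pete"] != "guitar"                # 1. Peter is not the guitarist
                 and f["pete"] != "height"            # 2. Peter does not fear heights
                 and f[guitarist] != "height"         # 3. the guitarist does not fear heights
                 and f["paul"] != "cats"              # 4. Paul does not fear cats
                 and p["paul"] != "sax"               # 5. Paul is not the saxophonist
                 and f[saxist] != "cats"              # 6. the saxophonist does not fear cats
                 and f[drummer] != "13"               # 7. the drummer does not fear 13
                 and f[drummer] != "height")          # 8. the drummer does not fear heights
        if facts:
            solutions.append((p, f))

for p, f in solutions:                                # expect exactly one line
    print(p, f)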
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Phobic Apollo (cont.)
  −
 
  −
It might be instructive to review various aspects
of how the Theme One Study function actually went
about arriving at its answer to that last problem.
Just to prove that my program and I really did do
our homework on that Phobic Apollo ConSat problem,
and didn't just provoke some Oracle or other database
server to give it away, here is the middling, that is,
intermediate, output of the Model function as run on
ConSat.Log:
  −
 
  −
Model Outline:  ConSat.Mod
  −
o-------------------------------------------------o
  −
| pete_plays_guitar -                            |
  −
| (pete_plays_guitar )                            |
  −
|  pete_plays_sax                                |
  −
|  pete_plays_drums -                            |
  −
|  (pete_plays_drums )                          |
  −
|    paul_plays_sax -                            |
  −
|    (paul_plays_sax )                            |
  −
|    jane_plays_sax -                            |
  −
|    (jane_plays_sax )                          |
  −
|      paul_plays_guitar                          |
  −
|      paul_plays_drums -                        |
  −
|      (paul_plays_drums )                      |
  −
|        jane_plays_guitar -                      |
  −
|        (jane_plays_guitar )                    |
  −
|        jane_plays_drums                        |
  −
|          pete_fears_13                          |
  −
|          pete_fears_cats -                    |
  −
|          (pete_fears_cats )                    |
  −
|            pete_fears_height -                  |
  −
|            (pete_fears_height )                |
  −
|            paul_fears_13 -                    |
  −
|            (paul_fears_13 )                    |
  −
|              jane_fears_13 -                    |
  −
|              (jane_fears_13 )                  |
  −
|              paul_fears_cats -                |
  −
|              (paul_fears_cats )                |
  −
|                paul_fears_height -              |
  −
|                (paul_fears_height ) -          |
  −
|          (pete_fears_13 )                      |
  −
|          pete_fears_cats -                    |
  −
|          (pete_fears_cats )                    |
  −
|            pete_fears_height -                  |
  −
|            (pete_fears_height ) -              |
  −
|        (jane_plays_drums ) -                  |
  −
|      (paul_plays_guitar )                      |
  −
|      paul_plays_drums                          |
  −
|        jane_plays_drums -                      |
  −
|        (jane_plays_drums )                      |
  −
|        jane_plays_guitar                      |
  −
|          pete_fears_13                          |
  −
|          pete_fears_cats -                    |
  −
|          (pete_fears_cats )                    |
  −
|            pete_fears_height -                  |
  −
|            (pete_fears_height )                |
  −
|            paul_fears_13 -                    |
  −
|            (paul_fears_13 )                    |
  −
|              jane_fears_13 -                    |
  −
|              (jane_fears_13 )                  |
  −
|              paul_fears_cats -                |
  −
|              (paul_fears_cats )                |
  −
|                paul_fears_height -              |
  −
|                (paul_fears_height ) -          |
  −
|          (pete_fears_13 )                      |
  −
|          pete_fears_cats -                    |
  −
|          (pete_fears_cats )                    |
  −
|            pete_fears_height -                  |
  −
|            (pete_fears_height ) -              |
  −
|        (jane_plays_guitar ) -                  |
  −
|      (paul_plays_drums ) -                    |
  −
|  (pete_plays_sax )                              |
  −
|  pete_plays_drums                              |
  −
|    paul_plays_drums -                          |
  −
|    (paul_plays_drums )                          |
  −
|    jane_plays_drums -                          |
  −
|    (jane_plays_drums )                        |
  −
|      paul_plays_guitar                          |
  −
|      paul_plays_sax -                          |
  −
|      (paul_plays_sax )                        |
  −
|        jane_plays_guitar -                      |
  −
|        (jane_plays_guitar )                    |
  −
|        jane_plays_sax                          |
  −
|          pete_fears_13 -                        |
  −
|          (pete_fears_13 )                      |
  −
|          pete_fears_cats                      |
  −
|            pete_fears_height -                  |
  −
|            (pete_fears_height )                |
  −
|            paul_fears_cats -                  |
  −
|            (paul_fears_cats )                  |
  −
|              jane_fears_cats -                  |
  −
|              (jane_fears_cats )                |
  −
|              paul_fears_13                    |
  −
|                paul_fears_height -              |
  −
|                (paul_fears_height )            |
  −
|                jane_fears_13 -                |
  −
|                (jane_fears_13 )                |
  −
|                  jane_fears_height *            |
  −
|                  (jane_fears_height ) -        |
  −
|              (paul_fears_13 )                  |
  −
|                paul_fears_height -              |
  −
|                (paul_fears_height ) -          |
  −
|          (pete_fears_cats )                    |
  −
|            pete_fears_height -                  |
  −
|            (pete_fears_height ) -              |
  −
|        (jane_plays_sax ) -                    |
  −
|      (paul_plays_guitar )                      |
  −
|      paul_plays_sax -                          |
  −
|      (paul_plays_sax ) -                      |
  −
|  (pete_plays_drums ) -                        |
  −
o-------------------------------------------------o
  −
 
  −
This is just the traverse of the "arboreal boolean expansion" (ABE) tree
that the Model function germinates from the propositional expression that
we planted in the file Consat.Log, which works to describe the facts of the
situation in question.  Since there are 18 logical feature names in this
propositional expression, we are literally talking about a function that
enjoys the abstract type f : %B%^18 -> %B%.  If I had wanted to evaluate
this function by expressly writing out its truth table, then it would have
required 2^18 = 262144 rows.  Now I didn't bother to count, but I'm sure
that the above output does not have anywhere near that many lines, so it
must be that my program, and maybe even its author, has done a couple of
things along the way that are moderately intelligent.  At least, we hope.
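The point of the arboreal expansion is that the proposition is evaluated by case analysis on one variable at a time, and a whole subtree is cut off the moment the partial assignment already falsifies the formula, so the tree that actually gets visited is typically far smaller than the full truth table.  Here is a minimal sketch of that idea in Python (my own paraphrase of the general technique, not the actual Theme One code), for a proposition given as a conjunction of constraints:

# Each constraint is a pair (variables, test), where test receives the
# values of those variables in order and returns True or False.

def expand(variables, constraints, assignment=None):
    """Yield every total assignment satisfying all constraints, pruning a
    branch as soon as some fully assigned constraint already fails."""
    if assignment is None:
        assignment = {}
    for vs, test in constraints:
        if all(v in assignment for v in vs):
            if not test(*(assignment[v] for v in vs)):
                return                                # prune this whole subtree
    if len(assignment) == len(variables):
        yield dict(assignment)                        # a satisfying interpretation
        return
    v = variables[len(assignment)]                    # split cases on the next variable
    for value in (True, False):
        assignment[v] = value
        yield from expand(variables, constraints, assignment)
        del assignment[v]

# Tiny example:  exactly one of a, b is true, and a implies c.
variables = ["a", "b", "c"]
constraints = [
    (("a", "b"), lambda a, b: a != b),
    (("a", "c"), lambda a, c: (not a) or c),
]
print(list(expand(variables, constraints)))           # three models out of 2**3 cases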
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
AK = Antti Karttunen
  −
JA = Jon Awbrey
  −
 
  −
AK: Am I (and other SeqFanaticians) missing something from this thread?
  −
 
  −
AK: Your previous message on seqfan (headers below) is a bit of the same topic,
  −
    but does it belong to the same thread?  Where I could obtain the other
  −
    messages belonging to those two threads?  (I'm just now starting to
  −
    study "mathematical logic", and its relations to combinatorics are
  −
    very interesting.)  Is this "cactus" language documented anywhere?
  −
 
  −
here i was just following a courtesy of copying people
  −
when i reference their works, in this case neil's site:
  −
 
  −
http://www.research.att.com/cgi-bin/access.cgi/as/njas/sequences/eisA.cgi?Anum=000170
  −
 
  −
but then i thought that the seqfantasians might be amused, too.
  −
 
  −
the bit on higher order propositions, in particular,
  −
those of type h : (B^2 -> B) -> B, i sent because
  −
of the significance that 2^2^2^2 = 65536 took on
  −
for us around that time.  & the ho, ho, ho joke.
  −
 
  −
"zeroth order logic" (zol) is just another name for
  −
the propositional calculus or the sentential logic
  −
that comes before "first order logic" (fol), aka
  −
first intens/tional logic, quantificational logic,
  −
or predicate calculus, depending on who you talk to.
  −
 
  −
the line of work that i have been doing derives from
  −
the ideas of c.s. peirce (1839-1914), who developed
  −
a couple of systems of "logical graphs", actually,
  −
two variant interpretations of the same abstract
  −
structures, called "entitative" and "existential"
  −
graphs.  he organized his system into "alpha",
  −
"beta", and "gamma" layers, roughly equivalent
  −
to our propositional, quantificational, and
  −
modal levels of logic today.
  −
 
  −
on the more contemporary scene, peirce's entitative interpretation
  −
of logical graphs was revived and extended by george spencer brown
  −
in his book 'laws of form', while the existential interpretation
  −
has flourished in the development of "conceptual graphs" by
  −
john f sowa and a community of growing multitudes.
  −
 
  −
a passel of links:
  −
 
  −
http://members.door.net/arisbe/
  −
http://www.enolagaia.com/GSB.html
  −
http://www.cs.uah.edu/~delugach/CG/
  −
http://www.jfsowa.com/
  −
http://www.jfsowa.com/cg/
  −
http://www.jfsowa.com/peirce/ms514w.htm
  −
http://users.bestweb.net/~sowa/
  −
http://users.bestweb.net/~sowa/peirce/ms514.htm
  −
 
  −
i have mostly focused on "alpha" (prop calc or zol) --
  −
though the "func conception of quant logic" thread was
  −
a beginning try at saying how the same line of thought
  −
might be extended to 1st, 2nd, & higher order logics --
  −
and i devised a particular graph & string syntax that
  −
is based on a species of cacti, officially described as
  −
the "reflective extension of logical graphs" (ref log),
  −
but more lately just referred to as "cactus language".
  −
 
  −
it turns out that one can do many interesting things
  −
with prop calc if one has an efficient enough syntax
  −
and a powerful enough interpreter for it, even using
  −
it as a very minimal sort of declarative programming
  −
language, hence, the current thread was directed to
  −
applying "zeroth order theories" (zot's) as brands
  −
of "zeroth order programs" (zop's) to a set of old
  −
constraint satisfaction and knowledge rep examples.
  −
 
  −
more recent expositions of the cactus language have been directed
  −
toward what some people call "ontology engineering" -- it sounds
  −
so much cooler than "taxonomy" -- and so these are found in the
  −
ieee standard upper ontology working group discussion archives.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Let's now pause and reflect on the mix of abstract and concrete material
that we have cobbled together in the spectacle of this "World Of Zero" (WOZ),
since I believe that we may have seen enough, if we look at it right, to
illustrate a few of the more salient phenomena that would normally begin
to weigh in as a major force only on a much larger scale.  Now, it's not
exactly as if this impoverished sample, all by itself, could compel us
to draw just the right generalizations, or force us to see the shape and
flow of its immanent law -- it is much too sparse a scattering of points
to tease out the lines of its up and coming generations quite so clearly --
but it can be seen to exemplify many of the more significant themes that
we know evolve in more substantial environments, that is, On Beyond Zero,
since we have already seen them, "tho' obscur'd", in these higher realms.
  −
 
  −
One of the themes that I want to keep an eye on as this discussion
develops is the subject that might be called "computation as semiosis".
  −
 
  −
In this light, any calculus worth its salt must be capable of helping
  −
us do two things, calculation, of course, but also analysis.  This is
  −
probably one of the reasons why the ordinary sort of differential and
  −
integral calculus over quantitative domains is frequently referred to
  −
as "real analysis", or even just "analysis".  It seems quite clear to
  −
me that any adequate logical calculus, in many ways expected to serve
  −
as a qualitative analogue of analytic geometry in the way that it can
  −
be used to describe configurations in logically circumscribed domains,
  −
ought to qualify in both dimensions, namely, analysis and computation.
  −
 
  −
With all of these various features of the situation in mind, then, we come
  −
to the point of viewing analysis and computation as just so many different
  −
kinds of "sign transformations in respect of pragmata" (STIROP's).  Taking
  −
this insight to heart, let us next work to assemble a comprehension of our
  −
concrete examples, set in the medium of the abstract calculi that allow us
  −
to express their qualitative patterns, that may hope to be an increment or
  −
two less inchoate than we have seen so far, and that may even permit us to
  −
catch the action of these fading fleeting sign transformations on the wing.
  −
 
  −
Here is how I picture our latest round of examples
  −
as filling out the framework of this investigation:
  −
 
  −
o-----------------------------o-----------------------------o
  −
|    Objective Framework    |  Interpretive Framework    |
  −
o-----------------------------o-----------------------------o
  −
|                                                          |
  −
|                              s_1 = Logue(o)      |      |
  −
|                              /                    |      |
  −
|                            /                      |      |
  −
|                            @                      |      |
  −
|                          ·  \                      |      |
  −
|                        ·    \                    |      |
  −
|                      ·        i_1 = Model(o)      v      |
  −
|                    ·          s_2 = Model(o)      |      |
  −
|                  ·          /                    |      |
  −
|                ·            /                      |      |
  −
|    Object = o · · · · · · @                      |      |
  −
|                ·            \                      |      |
  −
|                  ·          \                    |      |
  −
|                    ·          i_2 = Tenor(o)      v      |
  −
|                      ·        s_3 = Tenor(o)      |      |
  −
|                        ·    /                    |      |
  −
|                          ·  /                      |      |
  −
|                            @                      |      |
  −
|                            \                      |      |
  −
|                              \                    |      |
  −
|                              i_3 = Sense(o)      v      |
  −
|                                                          |
  −
o-----------------------------------------------------------o
  −
Figure.  Computation As Semiotic Transformation
  −
 
  −
The Figure shows three distinct sign triples of the form <o, s, i>, where
  −
o = ostensible objective = the observed, indicated, or intended situation.
  −
 
  −
| A.  <o, Logue(o), Model(o)>
  −
|
  −
| B.  <o, Model(o), Tenor(o)>
  −
|
  −
| C.  <o, Tenor(o), Sense(o)>
  −
 
  −
Let us bring these several signs together in one place,
  −
to compare and contrast their common and their diverse
  −
characters, and to think about why we make such a fuss
  −
about passing from one to the other in the first place.
  −
 
  −
1.  Logue(o)  =  Consat.Log
  −
o-----------------------------------------------------------------------o
  −
|                                                                      |
  −
|  (( pete_plays_guitar ),( pete_plays_sax ),( pete_plays_drums ))      |
  −
|  (( paul_plays_guitar ),( paul_plays_sax ),( paul_plays_drums ))      |
  −
|  (( jane_plays_guitar ),( jane_plays_sax ),( jane_plays_drums ))      |
  −
|                                                                      |
  −
|  (( pete_plays_guitar ),( paul_plays_guitar ),( jane_plays_guitar ))  |
  −
|  (( pete_plays_sax    ),( paul_plays_sax    ),( jane_plays_sax    ))  |
  −
|  (( pete_plays_drums  ),( paul_plays_drums  ),( jane_plays_drums  ))  |
  −
|                                                                      |
  −
|  (( pete_fears_13 ),( pete_fears_cats ),( pete_fears_height ))        |
  −
|  (( paul_fears_13 ),( paul_fears_cats ),( paul_fears_height ))        |
  −
|  (( jane_fears_13 ),( jane_fears_cats ),( jane_fears_height ))        |
  −
|                                                                      |
  −
|  (( pete_fears_13    ),( paul_fears_13    ),( jane_fears_13    ))  |
  −
|  (( pete_fears_cats  ),( paul_fears_cats  ),( jane_fears_cats  ))  |
  −
|  (( pete_fears_height ),( paul_fears_height ),( jane_fears_height ))  |
  −
|                                                                      |
  −
|  ((                                                                  |
  −
|                                                                      |
  −
|  ( pete_plays_guitar )                                                |
  −
|  ( pete_fears_height )                                                |
  −
|                                                                      |
  −
|  ( pete_plays_guitar  pete_fears_height )                            |
  −
|  ( paul_plays_guitar  paul_fears_height )                            |
  −
|  ( jane_plays_guitar  jane_fears_height )                            |
  −
|                                                                      |
  −
|  ( paul_fears_cats )                                                  |
  −
|  ( paul_plays_sax  )                                                  |
  −
|                                                                      |
  −
|  ( pete_plays_sax  pete_fears_cats )                                  |
  −
|  ( paul_plays_sax  paul_fears_cats )                                  |
  −
|  ( jane_plays_sax  jane_fears_cats )                                  |
  −
|                                                                      |
  −
|  ( pete_plays_drums  pete_fears_13 )                                  |
  −
|  ( paul_plays_drums  paul_fears_13 )                                  |
  −
|  ( jane_plays_drums  jane_fears_13 )                                  |
  −
|                                                                      |
  −
|  ( pete_plays_drums  pete_fears_height )                              |
  −
|  ( paul_plays_drums  paul_fears_height )                              |
  −
|  ( jane_plays_drums  jane_fears_height )                              |
  −
|                                                                      |
  −
|  ))                                                                  |
  −
|                                                                      |
  −
o-----------------------------------------------------------------------o
  −
 
  −
2.  Model(o)  =  Consat.Mod  ><>  http://suo.ieee.org/ontology/msg03718.html
  −
 
  −
3.  Tenor(o)  =  Consat.Ten (Just The Gist Of It)
  −
o-------------------------------------------------o
  −
| (pete_plays_guitar )                            | <01> -
  −
|  (pete_plays_sax )                              | <02> -
  −
|  pete_plays_drums                              | <03> +
  −
|    (paul_plays_drums )                          | <04> -
  −
|    (jane_plays_drums )                        | <05> -
  −
|      paul_plays_guitar                          | <06> +
  −
|      (paul_plays_sax )                        | <07> -
  −
|        (jane_plays_guitar )                    | <08> -
  −
|        jane_plays_sax                          | <09> +
  −
|          (pete_fears_13 )                      | <10> -
  −
|          pete_fears_cats                      | <11> +
  −
|            (pete_fears_height )                | <12> -
  −
|            (paul_fears_cats )                  | <13> -
  −
|              (jane_fears_cats )                | <14> -
  −
|              paul_fears_13                    | <15> +
  −
|                (paul_fears_height )            | <16> -
  −
|                (jane_fears_13 )                | <17> -
  −
|                  jane_fears_height *            | <18> +
  −
o-------------------------------------------------o
  −
 
  −
4.  Sense(o)  =  Consat.Sen
  −
o-------------------------------------------------o
  −
| pete_plays_drums                                | <03>
  −
|  paul_plays_guitar                              | <06>
  −
|  jane_plays_sax                                | <09>
  −
|    pete_fears_cats                              | <11>
  −
|    paul_fears_13                              | <15>
  −
|      jane_fears_height                          | <18>
  −
o-------------------------------------------------o
  −
 
  −
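Aside.  Readers who want to check the above computation independently can
brute-force the same puzzle in a few lines of Python.  This is only a sketch
written for this note, not the Theme One program itself, and the person,
instrument, and fear names are simply the ones used in the Logue file above.
Treating the "plays" and "fears" assignments as permutations compactly
enforces the "exactly one true in each row and column" clauses, and the
remaining clauses of the Logue are filtered explicitly.  The single model
that survives is the one recorded in the Sense file.

    from itertools import permutations

    people      = ["pete", "paul", "jane"]
    instruments = ["guitar", "sax", "drums"]
    fears       = ["13", "cats", "height"]

    def consat_models():
        # "plays" and "fears" as permutations: each person gets exactly one
        # instrument and one fear, and no two people share either.
        for plays in permutations(instruments):
            for fear in permutations(fears):
                P = dict(zip(people, plays))    # person -> instrument
                F = dict(zip(people, fear))     # person -> fear
                if P["pete"] == "guitar":   continue   # ( pete_plays_guitar )
                if F["pete"] == "height":   continue   # ( pete_fears_height )
                if F["paul"] == "cats":     continue   # ( paul_fears_cats )
                if P["paul"] == "sax":      continue   # ( paul_plays_sax )
                # ( x_plays_guitar x_fears_height ), ( x_plays_sax x_fears_cats ),
                # ( x_plays_drums x_fears_13 ), ( x_plays_drums x_fears_height )
                if any(P[x] == "guitar" and F[x] == "height" for x in people):  continue
                if any(P[x] == "sax"    and F[x] == "cats"   for x in people):  continue
                if any(P[x] == "drums"  and F[x] in ("13", "height") for x in people):  continue
                yield P, F

    for P, F in consat_models():
        print(P, F)    # expect exactly one model, matching the Sense file
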
As one proceeds through the subsessions of the Theme One Study session,
  −
the computation transforms its larger "signs", in this case text files,
  −
from one to the next, in the sequence:  Logue, Model, Tenor, and Sense.
  −
 
  −
Let us see if we can pin down, on sign-theoretic grounds,
  −
why this very sort of exercise is so routinely necessary.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
We were in the middle of pursuing several questions about
  −
sign relational transformations in general, in particular,
  −
the following Example of a sign transformation that arose
  −
in the process of setting up and solving a classical sort
  −
of constraint satisfaction problem.
  −
 
  −
o-----------------------------o-----------------------------o
  −
|    Objective Framework    |  Interpretive Framework    |
  −
o-----------------------------o-----------------------------o
  −
|                                                          |
  −
|                              s_1 = Logue(o)      |      |
  −
|                              /                    |      |
  −
|                            /                      |      |
  −
|                            @                      |      |
  −
|                          ·  \                      |      |
  −
|                        ·    \                    |      |
  −
|                      ·        i_1 = Model(o)      v      |
  −
|                    ·          s_2 = Model(o)      |      |
  −
|                  ·          /                    |      |
  −
|                ·            /                      |      |
  −
|    Object = o · · · · · · @                      |      |
  −
|                ·            \                      |      |
  −
|                  ·          \                    |      |
  −
|                    ·          i_2 = Tenor(o)      v      |
  −
|                      ·        s_3 = Tenor(o)      |      |
  −
|                        ·    /                    |      |
  −
|                          ·  /                      |      |
  −
|                            @                      |      |
  −
|                            \                      |      |
  −
|                              \                    |      |
  −
|                              i_3 = Sense(o)      v      |
  −
|                                                          |
  −
o-----------------------------------------------------------o
  −
Figure.  Computation As Semiotic Transformation
  −
 
  −
1.  Logue(o)  =  Consat.Log
  −
o-----------------------------------------------------------------------o
  −
|                                                                      |
  −
|  (( pete_plays_guitar ),( pete_plays_sax ),( pete_plays_drums ))      |
  −
|  (( paul_plays_guitar ),( paul_plays_sax ),( paul_plays_drums ))      |
  −
|  (( jane_plays_guitar ),( jane_plays_sax ),( jane_plays_drums ))      |
  −
|                                                                      |
  −
|  (( pete_plays_guitar ),( paul_plays_guitar ),( jane_plays_guitar ))  |
  −
|  (( pete_plays_sax    ),( paul_plays_sax    ),( jane_plays_sax    ))  |
  −
|  (( pete_plays_drums  ),( paul_plays_drums  ),( jane_plays_drums  ))  |
  −
|                                                                      |
  −
|  (( pete_fears_13 ),( pete_fears_cats ),( pete_fears_height ))        |
  −
|  (( paul_fears_13 ),( paul_fears_cats ),( paul_fears_height ))        |
  −
|  (( jane_fears_13 ),( jane_fears_cats ),( jane_fears_height ))        |
  −
|                                                                      |
  −
|  (( pete_fears_13    ),( paul_fears_13    ),( jane_fears_13    ))  |
  −
|  (( pete_fears_cats  ),( paul_fears_cats  ),( jane_fears_cats  ))  |
  −
|  (( pete_fears_height ),( paul_fears_height ),( jane_fears_height ))  |
  −
|                                                                      |
  −
|  ((                                                                  |
  −
|                                                                      |
  −
|  ( pete_plays_guitar )                                                |
  −
|  ( pete_fears_height )                                                |
  −
|                                                                      |
  −
|  ( pete_plays_guitar  pete_fears_height )                            |
  −
|  ( paul_plays_guitar  paul_fears_height )                            |
  −
|  ( jane_plays_guitar  jane_fears_height )                            |
  −
|                                                                      |
  −
|  ( paul_fears_cats )                                                  |
  −
|  ( paul_plays_sax  )                                                  |
  −
|                                                                      |
  −
|  ( pete_plays_sax  pete_fears_cats )                                  |
  −
|  ( paul_plays_sax  paul_fears_cats )                                  |
  −
|  ( jane_plays_sax  jane_fears_cats )                                  |
  −
|                                                                      |
  −
|  ( pete_plays_drums  pete_fears_13 )                                  |
  −
|  ( paul_plays_drums  paul_fears_13 )                                  |
  −
|  ( jane_plays_drums  jane_fears_13 )                                  |
  −
|                                                                      |
  −
|  ( pete_plays_drums  pete_fears_height )                              |
  −
|  ( paul_plays_drums  paul_fears_height )                              |
  −
|  ( jane_plays_drums  jane_fears_height )                              |
  −
|                                                                      |
  −
|  ))                                                                  |
  −
|                                                                      |
  −
o-----------------------------------------------------------------------o
  −
 
  −
2.  Model(o)  =  Consat.Mod  ><>  http://suo.ieee.org/ontology/msg03718.html
  −
 
  −
3.  Tenor(o)  =  Consat.Ten (Just The Gist Of It)
  −
o-------------------------------------------------o
  −
| (pete_plays_guitar )                            | <01> -
  −
|  (pete_plays_sax )                              | <02> -
  −
|  pete_plays_drums                              | <03> +
  −
|    (paul_plays_drums )                          | <04> -
  −
|    (jane_plays_drums )                        | <05> -
  −
|      paul_plays_guitar                          | <06> +
  −
|      (paul_plays_sax )                        | <07> -
  −
|        (jane_plays_guitar )                    | <08> -
  −
|        jane_plays_sax                          | <09> +
  −
|          (pete_fears_13 )                      | <10> -
  −
|          pete_fears_cats                      | <11> +
  −
|            (pete_fears_height )                | <12> -
  −
|            (paul_fears_cats )                  | <13> -
  −
|              (jane_fears_cats )                | <14> -
  −
|              paul_fears_13                    | <15> +
  −
|                (paul_fears_height )            | <16> -
  −
|                (jane_fears_13 )                | <17> -
  −
|                  jane_fears_height *            | <18> +
  −
o-------------------------------------------------o
  −
 
  −
4.  Sense(o)  =  Consat.Sen
  −
o-------------------------------------------------o
  −
| pete_plays_drums                                | <03>
  −
|  paul_plays_guitar                              | <06>
  −
|  jane_plays_sax                                | <09>
  −
|    pete_fears_cats                              | <11>
  −
|    paul_fears_13                              | <15>
  −
|      jane_fears_height                          | <18>
  −
o-------------------------------------------------o
  −
 
  −
We can worry later about the proper use of quotation marks
  −
in discussing such a case, where the file name "Yada.Yak"
  −
denotes a piece of text that expresses a proposition that
  −
describes an objective situation or an intentional object,
  −
but whatever the case it is clear that we are knee & neck
  −
deep in a sign relational situation of a modest complexity.
  −
 
  −
I think that the right sort of analogy might help us
  −
to sort it out, or at least to tell what's important
  −
from the things that are less so.  The paradigm that
  −
comes to mind for me is the type of context in maths
  −
where we talk about the "locus" or the "solution set"
  −
of an equation, and here we think of the equation as
  −
denoting its solution set or describing a locus, say,
  −
a point or a curve or a surface or so on up the scale.
  −
 
  −
In this figure of speech, we might say for instance:
  −
 
  −
| o is
  −
| what "x^3 - 3x^2 + 3x - 1 = 0" denotes is
  −
| what "(x-1)^3 = 0" denotes is
  −
| what "1" denotes
  −
| is 1.
  −
 
  −
Making explicit the assumptive interpretations
  −
that the context probably enfolds in this case,
  −
we assume this description of the solution set:
  −
 
  −
{x in the Reals : x^3 - 3x^2 + 3x - 1 = 0} = {1}.
  −
 
  −
In sign relational terms, we have the 3-tuples:
  −
 
  −
| <o, "x^3 - 3x^2 + 3x - 1 = 0", "(x-1)^3 = 0">
  −
|
  −
| <o, "(x-1)^3 = 0", "1">
  −
|
  −
| <o, "1", "1">
  −
 
  −
As it turns out we discover that the
  −
object o was really just 1 all along.
  −
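
A quick machine check of that chain, for what it is worth:  the sketch below
uses the sympy library (an assumption of this aside, not anything invoked in
the original discussion) to factor the cubic and to solve it over the reals,
confirming that the locus is the single point 1.

    from sympy import symbols, factor, solveset, Eq, S

    x = symbols('x', real=True)
    print(factor(x**3 - 3*x**2 + 3*x - 1))                       # (x - 1)**3
    print(solveset(Eq(x**3 - 3*x**2 + 3*x - 1, 0), x, S.Reals))  # {1}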
 
  −
But why do we put ourselves through the rigors of these
  −
transformations at all?  If 1 is what we mean, why not
  −
just say "1" in the first place and be done with it?
  −
A person who asks a question like that has forgotten
  −
how we keep getting ourselves into these quandaries,
  −
and who it is that assigns the problems, for it is
  −
Nature herself who is the taskmistress here and the
  −
problems are set in the manner that she determines,
  −
not in the style to which we would like to become
  −
accustomed.  The best that we can demand of our
  −
various and sundry calculi is that they afford
  −
us with the nets and the snares more readily
  −
to catch the shape of the problematic game
  −
as it flies up before us on its own wings,
  −
and only then to tame it to the amenable
  −
demeanors that we find to our liking.
  −
 
  −
In sum, the first place is not ours to take.
  −
We are but poor second players in this game.
  −
 
  −
That understood, I can now lay out our present Example
  −
along the lines of this familiar mathematical exercise.
  −
 
  −
| o is
  −
| what Consat.Log denotes is
  −
| what Consat.Mod denotes is
  −
| what Consat.Ten denotes is
  −
| what Consat.Sen denotes.
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
It will be good to keep this picture before us a while longer:
  −
 
  −
o-----------------------------o-----------------------------o
  −
|    Objective Framework    |  Interpretive Framework    |
  −
o-----------------------------o-----------------------------o
  −
|                                                          |
  −
|                              s_1  =  Logue(o)    |      |
  −
|                              /                    |      |
  −
|                            /                      |      |
  −
|                            @                      |      |
  −
|                          ·  \                      |      |
  −
|                        ·    \                    |      |
  −
|                      ·        i_1  =  Model(o)    v      |
  −
|                    ·          s_2  =  Model(o)    |      |
  −
|                  ·          /                    |      |
  −
|                ·            /                      |      |
  −
|  Object  =  o · · · · · · @                      |      |
  −
|                ·            \                      |      |
  −
|                  ·          \                    |      |
  −
|                    ·          i_2  =  Tenor(o)    v      |
  −
|                      ·        s_3  =  Tenor(o)    |      |
  −
|                        ·    /                    |      |
  −
|                          ·  /                      |      |
  −
|                            @                      |      |
  −
|                            \                      |      |
  −
|                              \                    |      |
  −
|                              i_3  =  Sense(o)    v      |
  −
|                                                          |
  −
o-----------------------------------------------------------o
  −
Figure.  Computation As Semiotic Transformation
  −
 
  −
The labels that decorate the syntactic plane and indicate
  −
the semiotic transitions in the interpretive panel of the
  −
framework point us to text files whose contents rest here:
  −
 
  −
http://suo.ieee.org/ontology/msg03722.html
  −
 
  −
The reason that I am troubling myself -- and no doubt you --
  −
with the details of this Example is because it highlights
  −
a number of the thistles that we will have to grasp if we
  −
ever want to escape from the traps of YARNBOL and YARWARS
  −
in which so many of our fairweather fiends are seeking to
  −
ensnare us, and not just us -- the whole web of the world.
  −
 
  −
YARNBOL  =  Yet Another Roman Numeral Based Ontology Language.
  −
YARWARS  =  Yet Another Representation Without A Reasoning System.
  −
 
  −
In order to avoid this, or to reverse the trend once it gets started,
  −
we just have to remember what a dynamic living process a computation
  −
really is, precisely because it is meant to serve as an iconic image
  −
of dynamic, deliberate, purposeful transformations that we are bound
  −
to go through and to carry out in a hopeful pursuit of the solutions
  −
to the many real live problems that life and society place before us.
  −
So I take it rather seriously.
  −
 
  −
Okay, back to the grindstone.
  −
 
  −
The question is:  "Why are these trips necessary?"
  −
 
  −
How come we don't just have one proper expression
  −
for each situation under the sun, or all possible
  −
suns, I guess, for some, and just use that on any
  −
appearance, instance, occasion of that situation?
  −
 
  −
Why is it ever necessary to begin with an obscure description
  −
of a situation? -- for that is exactly what the propositional
  −
expression called "Logue(o)", for example, the Consat.Log file,
  −
really is.
     −
Maybe I need to explain that first.
+
===Aug 2000 &bull; Extensions Of Logical Graphs===
   −
The first three items of syntax -- Logue(o), Model(o), Tenor(o) --
+
====CG List &bull; Lost Links====
are all just so many different propositional expressions that
  −
denote one and the same logical-valued function p : X -> %B%,
  −
and one whose abstract image we may well enough describe as
  −
a boolean function of the abstract type q : %B%^k -> %B%,
  −
where k happens to be 18 in the present Consat Example.
     −
If we were to write out the truth table for q : %B%^18 -> %B%
+
# http://www.virtual-earth.de/CG/cg-list/old/msg03351.html
it would take 2^18 = 262144 rows. Using the bold letter #x#
+
# http://www.virtual-earth.de/CG/cg-list/old/msg03352.html
for a coordinate tuple, writing #x# = <x_1, ..., x_18>, each
+
# http://www.virtual-earth.de/CG/cg-list/old/msg03353.html
row of the table would have the form <x_1, ..., x_18, q(#x#)>.
+
# http://www.virtual-earth.de/CG/cg-list/old/msg03354.html
And the function q is such that all rows evaluate to %0% save one.
+
# http://www.virtual-earth.de/CG/cg-list/old/msg03376.html
 +
# http://www.virtual-earth.de/CG/cg-list/old/msg03379.html
 +
# http://www.virtual-earth.de/CG/cg-list/old/msg03381.html
   −
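Aside.  The shape of that truth table is easy to exhibit by brute force.
The sketch below, written in plain Python just for this note, re-encodes the
clauses of the Logue over the 18 person/instrument and person/fear variables
and then scans all 2^18 = 262144 rows, counting how many evaluate to 1.

    from itertools import product

    people      = ["pete", "paul", "jane"]
    instruments = ["guitar", "sax", "drums"]
    fears       = ["13", "cats", "height"]
    variables   = ([(p, i) for p in people for i in instruments] +
                   [(p, f) for p in people for f in fears])     # 18 in all

    def q(row):
        v = dict(zip(variables, row))                           # variable -> 0 or 1
        one = lambda bits: sum(bits) == 1                       # "just one true"
        if not all(one([v[(p, i)] for i in instruments]) for p in people):  return 0
        if not all(one([v[(p, i)] for p in people]) for i in instruments):  return 0
        if not all(one([v[(p, f)] for f in fears])   for p in people):      return 0
        if not all(one([v[(p, f)] for p in people])  for f in fears):       return 0
        if v[("pete", "guitar")] or v[("pete", "height")]:  return 0
        if v[("paul", "cats")]   or v[("paul", "sax")]:      return 0
        for p in people:
            if v[(p, "guitar")] and v[(p, "height")]:                    return 0
            if v[(p, "sax")]    and v[(p, "cats")]:                      return 0
            if v[(p, "drums")]  and (v[(p, "13")] or v[(p, "height")]):  return 0
        return 1

    rows = satisfied = 0
    for row in product((0, 1), repeat=18):
        rows += 1
        satisfied += q(row)
    print(rows, satisfied)    # 262144 1
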
Each of the four different formats expresses this fact about q
+
====CG List &bull; New Archive====
in its own way.  The first three are logically equivalent, and
  −
the last one is the maximally determinate positive implication
  −
of what the others all say.
     −
From this point of view, the logical computation that we went through,
+
* http://web.archive.org/web/20031104183832/http://mars.virtual-earth.de/pipermail/cg/2000q3/thread.html#3592
in the sequence Logue, Model, Tenor, Sense, was a process of changing
+
# http://web.archive.org/web/20030723202219/http://mars.virtual-earth.de/pipermail/cg/2000q3/003592.html
from an obscure sign of the objective proposition to a more organized
+
# http://web.archive.org/web/20030723202341/http://mars.virtual-earth.de/pipermail/cg/2000q3/003593.html
arrangement of its satisfying or unsatisfying interpretations, to the
+
# &bull;
most succinct possible expression of the same meaning, to an adequate
+
# http://web.archive.org/web/20030723202516/http://mars.virtual-earth.de/pipermail/cg/2000q3/003595.html
positive projection of it that is useful enough in the proper context.
+
# &bull;
 +
# &bull;
 +
# &bull;
   −
This is the sort of mill -- it's called "computation" -- that we have
+
====CG List &bull; Old Archive====
to be able to put our representations through on a recurrent, regular,
  −
routine basis, that is, if we expect them to have any utility at all.
  −
And it is only when we have started to do that in genuinely effective
  −
and efficient ways, that we can even begin to think about facilitating
  −
any bit of qualitative conceptual analysis through computational means.
     −
And as far as the qualitative side of logical computation
+
# &bull;
and conceptual analysis goes, we have barely even started.
+
# http://web.archive.org/web/20020321115639/http://www.virtual-earth.de/CG/cg-list/msg03352.html
 +
# &bull;
 +
# http://web.archive.org/web/20020321120331/http://www.virtual-earth.de/CG/cg-list/msg03354.html
 +
# http://web.archive.org/web/20020321223131/http://www.virtual-earth.de/CG/cg-list/msg03376.html
 +
# &bull;
 +
# http://web.archive.org/web/20020129134132/http://www.virtual-earth.de/CG/cg-list/msg03381.html
   −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
===Sep 2000 &bull; Zeroth Order Logic===
   −
We are contemplating the sequence of initial and normal forms
+
* http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/thrd241.html#01246
for the Consat problem and we have noted the following system
+
* http://web.archive.org/web/20130306202443/http://suo.ieee.org/email/thrd242.html#01406
of logical relations, taking the enchained expressions of the
  −
objective situation o in a pairwise associated way, of course:
     −
Logue(o)  <=>  Model(o)  <=>  Tenor(o)  =>  Sense(o).
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg01246.html
 +
# http://web.archive.org/web/20080905054059/http://suo.ieee.org/email/msg01251.html
 +
# http://web.archive.org/web/20070223033521/http://suo.ieee.org/email/msg01380.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg01406.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg01546.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg01561.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg01670.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg01966.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg01985.html
 +
# http://web.archive.org/web/20070401102902/http://suo.ieee.org/email/msg01988.html
   −
The specifics of the propositional expressions are cited here:
+
===Oct 2000 &bull; All Liar, No Paradox===
   −
http://suo.ieee.org/ontology/msg03722.html
+
* http://web.archive.org/web/20130306202504/http://suo.ieee.org/email/thrd236.html#01739
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg01739.html
   −
If we continue to pursue the analogy that we made with the form
+
===Nov 2000 &bull; Sowa's Top Level Categories===
of mathematical activity commonly known as "solving equations",
  −
then there are many salient features of this type of logical
  −
problem solving endeavor that suddenly leap into the light.
     −
First of all, we notice the importance of "equational reasoning"
+
====What Language To Use====
in mathematics, by which I mean, not just the quantitative type
  −
of equation that forms the matter of the process, but also the
  −
qualitative type of equation, or the "logical equivalence",
  −
that connects each expression along the way, right up to
  −
the penultimate stage, when we are satisfied in a given
  −
context to take a projective implication of the total
  −
knowledge of the situation that we have been taking
  −
some pains to preserve at every intermediate stage
  −
of the game.
     −
This general pattern or strategy of inference, working its way through
+
* http://web.archive.org/web/20070218222218/http://suo.ieee.org/email/threads.html#01956
phases of "equational" or "total information preserving" inference and
+
# http://web.archive.org/web/20070320012929/http://suo.ieee.org/email/msg01956.html
phases of "implicational" or "selective information losing" inference,
  −
is actually very common throughout mathematics, and I have in mind to
  −
examine its character in greater detail and in a more general setting.
     −
Just as the barest hint of things to come along these lines, you might
+
====Zeroth Order Logic====
consider the question of what would constitute the equational analogue
  −
of modus ponens, in other words the scheme of inference that goes from
  −
x and x=>y to y.  Well the answer is a scheme of inference that passes
  −
from x and x=>y to x&y, and then being reversible, back again.  I will
  −
explore the rationale and the utility of this gambit in future reports.
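
As a tiny check of that claim, the following lines, written in plain Python
for this note, verify over all truth assignments that x & (x => y) is
logically equivalent to x & y, which is what licenses running the inference
in both directions; ordinary modus ponens can then be seen as the one-way,
information-losing step from x & y down to y.

    from itertools import product

    for x, y in product((False, True), repeat=2):
        lhs = x and ((not x) or y)     # x & (x => y)
        rhs = x and y                  # x & y
        assert lhs == rhs              # equal in every row of the truth table
    print("x & (x => y)  <=>  x & y")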
     −
One observation that we can make already at this point,
+
* http://web.archive.org/web/20070218222218/http://suo.ieee.org/email/threads.html#01966
however, is that these schemes of equational reasoning,
+
# http://web.archive.org/web/20070320012940/http://suo.ieee.org/email/msg01966.html
or reversible inference, remain poorly developed among
  −
our currently prevailing styles of inference in logic,
  −
their potentials for applied logical software hardly
  −
being broached in our presently available systems.
     −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
====TLC In KIF====
   −
Extra Examples
+
* http://web.archive.org/web/20130304163442/http://suo.ieee.org/ontology/thrd110.html#00048
 +
# http://web.archive.org/web/20081204195421/http://suo.ieee.org/ontology/msg00048.html
 +
# http://web.archive.org/web/20070320014557/http://suo.ieee.org/ontology/msg00051.html
   −
1.  Propositional logic example.
+
===Dec 2000 &bull; Sequential Interactions Generating Hypotheses===
Files:  Alpha.lex + Prop.log
  −
Ref:    [Cha, 20, Example 2.12]
     −
2. Chemical synthesis problem.
+
* http://web.archive.org/web/20130306202621/http://suo.ieee.org/email/thrd217.html#02607
Files: Chem.*
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg02607.html
Ref:   [Cha, 21, Example 2.13]
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg02608.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg03183.html
   −
3.  N Queens problem.
+
===Jan 2001 &bull; Differential Analytic Turing Automata===
Files:  Queen*.*,  Q8.*,  Q5.*
  −
Refs:  [BaC, 166], [VaH, 122], [Wir, 143].
  −
Notes:  Only the 5 Queens example will run in 640K memory.
  −
        Use the "Queen.lex" file to load the "Q5.eg*" log files.
     −
4.  Five Houses puzzle.
+
====DATA &bull; Arisbe List====
Files:  House.*
  −
Ref:    [VaH, 132].
  −
Notes:  Will not run in 640K memory.
     −
5. Graph coloring example.
+
* http://web.archive.org/web/20150107163000/http://stderr.org/pipermail/arisbe/2001-January/thread.html#182
Files: Color.*
+
# http://web.archive.org/web/20061013224128/http://stderr.org/pipermail/arisbe/2001-January/000182.html
Ref:   [Wil, 196].
+
# http://web.archive.org/web/20061013224814/http://stderr.org/pipermail/arisbe/2001-January/000200.html
   −
6.  Examples of Cook's Theorem in computational complexity,
+
====DATA &bull; Ontology List====
    that propositional satisfiability is NP-complete.
     −
Files: StiltN.* = "Space and Time Limited Turing Machine",
+
* http://web.archive.org/web/20130304165332/http://suo.ieee.org/ontology/thrd95.html#00596
        with N units of space and N units of time.
+
# http://web.archive.org/web/20041021223934/http://suo.ieee.org/ontology/msg00596.html
        StuntN.* = "Space and Time Limited Turing Machine",
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg00618.html
        for computing the parity of a bit string,
  −
        with Number of Tape cells of input equal to N.
  −
Ref:   [Wil, 188-201].
  −
Notes: Can only run Turing machine example for input of size 2.
  −
        Since the last tape cell is used for an end-of-file marker,
  −
        this amounts to only one significant digit of computation.
  −
        Use the "Stilt3.lex" file  to load the "Stunt2.egN" files.
  −
        Their Sense file outputs appear on the "Stunt2.seN" files.
     −
7.  Fabric knowledge base.
+
===Mar 2001 &bull; Propositional Equation Reasoning Systems===
Files:  Fabric.*, Fab.*
  −
Ref:    [MaW, 8-16].
     −
8.  Constraint Satisfaction example.
+
====PERS &bull; Arisbe List====
Files:  Consat1.*, Consat2.*
  −
Ref:    [Win, 449, Exercise 3-9].
  −
Notes:  Attributed to Kenneth D. Forbus.
     −
References
+
* http://web.archive.org/web/20150107210802/http://stderr.org/pipermail/arisbe/2001-March/thread.html#380
 +
* http://web.archive.org/web/20150107212028/http://stderr.org/pipermail/arisbe/2001-April/thread.html#407
   −
| Angluin, Dana,
+
# http://web.archive.org/web/20150107210011/http://stderr.org/pipermail/arisbe/2001-March/000380.html
|"Learning with Hints", in
+
# http://web.archive.org/web/20050920031758/http://stderr.org/pipermail/arisbe/2001-April/000407.html
|'Proceedings of the 1988 Workshop on Computational Learning Theory',
+
# http://web.archive.org/web/20051202010243/http://stderr.org/pipermail/arisbe/2001-April/000409.html
| edited by D. Haussler & L. Pitt, Morgan Kaufmann, San Mateo, CA, 1989.
+
# http://web.archive.org/web/20051202074355/http://stderr.org/pipermail/arisbe/2001-April/000411.html
 +
# http://web.archive.org/web/20051202021217/http://stderr.org/pipermail/arisbe/2001-April/000412.html
 +
# http://web.archive.org/web/20051201225716/http://stderr.org/pipermail/arisbe/2001-April/000413.html
 +
# http://web.archive.org/web/20051202001736/http://stderr.org/pipermail/arisbe/2001-April/000416.html
 +
# http://web.archive.org/web/20051202053817/http://stderr.org/pipermail/arisbe/2001-April/000417.html
 +
# http://web.archive.org/web/20051202013458/http://stderr.org/pipermail/arisbe/2001-April/000421.html
 +
# http://web.archive.org/web/20051202013024/http://stderr.org/pipermail/arisbe/2001-April/000427.html
 +
# http://web.archive.org/web/20051202032812/http://stderr.org/pipermail/arisbe/2001-April/000428.html
 +
# http://web.archive.org/web/20051201225109/http://stderr.org/pipermail/arisbe/2001-April/000430.html
 +
# http://web.archive.org/web/20050908023250/http://stderr.org/pipermail/arisbe/2001-April/000432.html
 +
# http://web.archive.org/web/20051202002952/http://stderr.org/pipermail/arisbe/2001-April/000433.html
 +
# http://web.archive.org/web/20051201220336/http://stderr.org/pipermail/arisbe/2001-April/000434.html
 +
# http://web.archive.org/web/20050906215058/http://stderr.org/pipermail/arisbe/2001-April/000435.html
   −
| Ball, W.W. Rouse, & Coxeter, H.S.M.,
+
====PERS &bull; Arisbe List &bull; Discussion====
|'Mathematical Recreations and Essays', 13th ed.,
  −
| Dover, New York, NY, 1987.
     −
| Chang, Chin-Liang & Lee, Richard Char-Tung,
+
* http://web.archive.org/web/20150107212028/http://stderr.org/pipermail/arisbe/2001-April/thread.html#397
|'Symbolic Logic and Mechanical Theorem Proving',
+
# http://web.archive.org/web/20150107212003/http://stderr.org/pipermail/arisbe/2001-April/000397.html
| Academic Press, New York, NY, 1973.
     −
| Denning, Peter J., Dennis, Jack B., and Qualitz, Joseph E.,
+
====PERS &bull; Ontology List====
|'Machines, Languages, and Computation',
  −
| Prentice-Hall, Englewood Cliffs, NJ, 1978.
     −
| Edelman, Gerald M.,
+
* http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/thrd74.html#01779
|'Topobiology: An Introduction to Molecular Embryology',
+
# http://web.archive.org/web/20070326233418/http://suo.ieee.org/ontology/msg01779.html
| Basic Books, New York, NY, 1988.
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg01897.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02005.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02011.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02014.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02015.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02024.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02046.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02047.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02068.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02102.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02109.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02117.html
 +
# http://web.archive.org/web/20040116001230/http://suo.ieee.org/ontology/msg02125.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02128.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02134.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg02138.html
   −
| Lloyd, J.W.,
+
====PERS &bull; SUO List====
|'Foundations of Logic Programming',
  −
| Springer-Verlag, Berlin, 1984.
     −
| Maier, David & Warren, David S.,
+
* http://web.archive.org/web/20130109194711/http://suo.ieee.org/email/thrd187.html#04187
|'Computing with Logic: Logic Programming with Prolog',
+
# http://web.archive.org/web/20140423181000/http://suo.ieee.org/email/msg04187.html
| Benjamin/Cummings, Menlo Park, CA, 1988.
+
# http://web.archive.org/web/20070922193822/http://suo.ieee.org/email/msg04305.html
 +
# http://web.archive.org/web/20071007170752/http://suo.ieee.org/email/msg04413.html
 +
# http://web.archive.org/web/20070121063018/http://suo.ieee.org/email/msg04419.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg04422.html
 +
# http://web.archive.org/web/20070305132316/http://suo.ieee.org/email/msg04423.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg04432.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg04454.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg04455.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg04476.html
 +
# http://web.archive.org/web/20060718091105/http://suo.ieee.org/email/msg04510.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg04517.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg04525.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg04533.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg04536.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg04542.html
 +
# http://web.archive.org/web/20050824231950/http://suo.ieee.org/email/msg04546.html
   −
| McClelland, James L. and Rumelhart, David E.,
+
===Jul 2001 &bull; Reflective Extension Of Logical Graphs===
|'Explorations in Parallel Distributed Processing:
  −
| A Handbook of Models, Programs, and Exercises',
  −
| MIT Press, Cambridge, MA, 1988.
     −
| Peirce, Charles Sanders,
+
====RefLog &bull; Arisbe List====
|'Collected Papers of Charles Sanders Peirce',
  −
| edited by Charles Hartshorne, Paul Weiss, & Arthur W. Burks,
  −
| Harvard University Press, Cambridge, MA, 1931-1960.
     −
| Peirce, Charles Sanders,
+
* http://web.archive.org/web/20150109141200/http://stderr.org/pipermail/arisbe/2001-July/thread.html#711
|'The New Elements of Mathematics',
+
# http://web.archive.org/web/20150109141000/http://stderr.org/pipermail/arisbe/2001-July/000711.html
| edited by Carolyn Eisele, Mouton, The Hague, 1976.
     −
|'Charles S. Peirce:  Selected Writings; Values in a Universe of Chance',
+
====RefLog &bull; SUO List====
| edited by Philip P. Wiener, Dover, New York, NY, 1966.
     −
| Spencer Brown, George,
+
* http://web.archive.org/web/20070302133623/http://suo.ieee.org/email/thrd154.html#05694
|'Laws of Form',
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/email/msg05694.html
| George Allen & Unwin, London, UK, 1969.
     −
| Van Hentenryck, Pascal,
+
===Dec 2001 &bull; Functional Conception Of Quantificational Logic===
|'Constraint Satisfaction in Logic Programming',
  −
| MIT Press, Cambridge, MA, 1989.
     −
| Wilf, Herbert S.,
+
====FunLog &bull; Arisbe List====
|'Algorithms and Complexity',
  −
| Prentice-Hall, Englewood Cliffs, NJ, 1986.
     −
| Winston, Patrick Henry,
+
* http://web.archive.org/web/20141005034441/http://stderr.org/pipermail/arisbe/2001-December/thread.html#1212
|'Artificial Intelligence', 2nd ed.,
+
# http://web.archive.org/web/20141005034614/http://stderr.org/pipermail/arisbe/2001-December/001212.html
| Addison-Wesley, Reading, MA, 1984.
+
# http://web.archive.org/web/20141005034615/http://stderr.org/pipermail/arisbe/2001-December/001213.html
 +
# http://web.archive.org/web/20051202034557/http://stderr.org/pipermail/arisbe/2001-December/001216.html
 +
# http://web.archive.org/web/20051202074331/http://stderr.org/pipermail/arisbe/2001-December/001221.html
 +
# http://web.archive.org/web/20051201235028/http://stderr.org/pipermail/arisbe/2001-December/001222.html
 +
# http://web.archive.org/web/20051202052037/http://stderr.org/pipermail/arisbe/2001-December/001223.html
 +
# http://web.archive.org/web/20050827214411/http://stderr.org/pipermail/arisbe/2001-December/001224.html
 +
# http://web.archive.org/web/20051202092500/http://stderr.org/pipermail/arisbe/2001-December/001225.html
 +
# http://web.archive.org/web/20051202051942/http://stderr.org/pipermail/arisbe/2001-December/001226.html
 +
# http://web.archive.org/web/20050425140213/http://stderr.org/pipermail/arisbe/2001-December/001227.html
   −
| Wirth, Niklaus,
+
====FunLog &bull; Ontology List====
|'Algorithms + Data Structures = Programs',
  −
| Prentice-Hall, Englewood Cliffs, NJ, 1976.
     −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
* http://web.archive.org/web/20120222171225/http://suo.ieee.org/ontology/thrd38.html#03562
 +
# http://web.archive.org/web/20110608022546/http://suo.ieee.org/ontology/msg03562.html
 +
# http://web.archive.org/web/20110608022712/http://suo.ieee.org/ontology/msg03563.html
 +
# http://web.archive.org/web/20110608023312/http://suo.ieee.org/ontology/msg03564.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03565.html
 +
# http://web.archive.org/web/20070812011325/http://suo.ieee.org/ontology/msg03569.html
 +
# http://web.archive.org/web/20110608023228/http://suo.ieee.org/ontology/msg03570.html
 +
# http://web.archive.org/web/20110608022616/http://suo.ieee.org/ontology/msg03568.html
 +
# http://web.archive.org/web/20110608023557/http://suo.ieee.org/ontology/msg03572.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03577.html
 +
# http://web.archive.org/web/20070317021141/http://suo.ieee.org/ontology/msg03578.html
 +
# http://web.archive.org/web/20110608021549/http://suo.ieee.org/ontology/msg03579.html
 +
# http://web.archive.org/web/20110608021332/http://suo.ieee.org/ontology/msg03580.html
 +
# http://web.archive.org/web/20110608020250/http://suo.ieee.org/ontology/msg03581.html
 +
# http://web.archive.org/web/20110608021344/http://suo.ieee.org/ontology/msg03582.html
 +
# http://web.archive.org/web/20110608021557/http://suo.ieee.org/ontology/msg03583.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg04247.html
   −
Cactus Town Cartoons
+
===Dec 2001 &bull; Cactus Language===
   −
01.  http://suo.ieee.org/ontology/msg03567.html
+
====Cactus Town Cartoons &bull; Arisbe List====
02.  http://suo.ieee.org/ontology/msg03571.html
     −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
* http://web.archive.org/web/20141005034441/http://stderr.org/pipermail/arisbe/2001-December/thread.html#1214
 +
# http://web.archive.org/web/20050825005438/http://stderr.org/pipermail/arisbe/2001-December/001214.html
 +
# http://web.archive.org/web/20051202101235/http://stderr.org/pipermail/arisbe/2001-December/001217.html
   −
Differential Analytic Turing Automata (DATA)
+
====Cactus Town Cartoons &bull; Ontology List====
   −
01. http://suo.ieee.org/ontology/msg00596.html
+
* http://web.archive.org/web/20120222171225/http://suo.ieee.org/ontology/thrd38.html#03567
02. http://suo.ieee.org/ontology/msg00618.html
+
# http://web.archive.org/web/20110608023426/http://suo.ieee.org/ontology/msg03567.html
 +
# http://web.archive.org/web/20110608024449/http://suo.ieee.org/ontology/msg03571.html
   −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
===Jan 2002 &bull; Zeroth Order Theories===
   −
Differential Logic
+
====ZOT &bull; Arisbe List====
   −
01. http://suo.ieee.org/ontology/msg04040.html
+
* http://web.archive.org/web/20150109041904/http://stderr.org/pipermail/arisbe/2002-January/thread.html#1293
02. http://suo.ieee.org/ontology/msg04041.html
+
# http://web.archive.org/web/20150109042401/http://stderr.org/pipermail/arisbe/2002-January/001293.html
03. http://suo.ieee.org/ontology/msg04045.html
+
# http://web.archive.org/web/20150109042402/http://stderr.org/pipermail/arisbe/2002-January/001294.html
04. http://suo.ieee.org/ontology/msg04046.html
+
# http://web.archive.org/web/20050503213326/http://stderr.org/pipermail/arisbe/2002-January/001295.html
05. http://suo.ieee.org/ontology/msg04047.html
+
# http://web.archive.org/web/20050503213330/http://stderr.org/pipermail/arisbe/2002-January/001296.html
06. http://suo.ieee.org/ontology/msg04048.html
+
# http://web.archive.org/web/20050504070444/http://stderr.org/pipermail/arisbe/2002-January/001299.html
07. http://suo.ieee.org/ontology/msg04052.html
+
# http://web.archive.org/web/20050504070430/http://stderr.org/pipermail/arisbe/2002-January/001300.html
08. http://suo.ieee.org/ontology/msg04054.html
+
# http://web.archive.org/web/20050504070700/http://stderr.org/pipermail/arisbe/2002-January/001301.html
09. http://suo.ieee.org/ontology/msg04055.html
+
# http://web.archive.org/web/20050504070704/http://stderr.org/pipermail/arisbe/2002-January/001302.html
10. http://suo.ieee.org/ontology/msg04067.html
+
# http://web.archive.org/web/20050504070712/http://stderr.org/pipermail/arisbe/2002-January/001304.html
11. http://suo.ieee.org/ontology/msg04068.html
+
# http://web.archive.org/web/20050504070717/http://stderr.org/pipermail/arisbe/2002-January/001305.html
12. http://suo.ieee.org/ontology/msg04069.html
+
# http://web.archive.org/web/20050504070722/http://stderr.org/pipermail/arisbe/2002-January/001306.html
13. http://suo.ieee.org/ontology/msg04070.html
+
# http://web.archive.org/web/20050504070726/http://stderr.org/pipermail/arisbe/2002-January/001308.html
14. http://suo.ieee.org/ontology/msg04072.html
+
# http://web.archive.org/web/20050504070730/http://stderr.org/pipermail/arisbe/2002-January/001309.html
15. http://suo.ieee.org/ontology/msg04073.html
+
# http://web.archive.org/web/20050504070434/http://stderr.org/pipermail/arisbe/2002-January/001310.html
16. http://suo.ieee.org/ontology/msg04074.html
+
# http://web.archive.org/web/20050504070742/http://stderr.org/pipermail/arisbe/2002-January/001313.html
17.  http://suo.ieee.org/ontology/msg04077.html
+
# http://web.archive.org/web/20050504070746/http://stderr.org/pipermail/arisbe/2002-January/001314.html
18. http://suo.ieee.org/ontology/msg04079.html
+
# http://web.archive.org/web/20050504070438/http://stderr.org/pipermail/arisbe/2002-January/001315.html
19. http://suo.ieee.org/ontology/msg04080.html
+
# http://web.archive.org/web/20050504070540/http://stderr.org/pipermail/arisbe/2002-January/001316.html
20.
+
# http://web.archive.org/web/20050504070750/http://stderr.org/pipermail/arisbe/2002-January/001317.html
   −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
====ZOT &bull; Arisbe List &bull; Discussion====
   −
Extensions Of Logical Graphs
+
* http://web.archive.org/web/20150109041904/http://stderr.org/pipermail/arisbe/2002-January/thread.html#1293
 +
# http://web.archive.org/web/20050503213334/http://stderr.org/pipermail/arisbe/2002-January/001297.html
 +
# http://web.archive.org/web/20050504070656/http://stderr.org/pipermail/arisbe/2002-January/001298.html
 +
# http://web.archive.org/web/20050504070708/http://stderr.org/pipermail/arisbe/2002-January/001303.html
 +
# http://web.archive.org/web/20050504070544/http://stderr.org/pipermail/arisbe/2002-January/001307.html
 +
# http://web.archive.org/web/20050504070734/http://stderr.org/pipermail/arisbe/2002-January/001311.html
 +
# http://web.archive.org/web/20050504070738/http://stderr.org/pipermail/arisbe/2002-January/001312.html
 +
# http://web.archive.org/web/20050504070755/http://stderr.org/pipermail/arisbe/2002-January/001318.html
   −
01.  http://www.virtual-earth.de/CG/cg-list/old/msg03351.html
+
====ZOT &bull; Ontology List====
02.  http://www.virtual-earth.de/CG/cg-list/old/msg03352.html
  −
03.  http://www.virtual-earth.de/CG/cg-list/old/msg03353.html
  −
04.  http://www.virtual-earth.de/CG/cg-list/old/msg03354.html
  −
05.  http://www.virtual-earth.de/CG/cg-list/old/msg03376.html
  −
06.  http://www.virtual-earth.de/CG/cg-list/old/msg03379.html
  −
07.  http://www.virtual-earth.de/CG/cg-list/old/msg03381.html
     −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
* http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/thrd35.html#03680
 +
# http://web.archive.org/web/20070323210742/http://suo.ieee.org/ontology/msg03680.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03681.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03682.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03683.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03691.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03693.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03695.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03696.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03701.html
 +
# http://web.archive.org/web/20070329211521/http://suo.ieee.org/ontology/msg03702.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03703.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03706.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03707.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03708.html
 +
# http://web.archive.org/web/20080620074722/http://suo.ieee.org/ontology/msg03712.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03715.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03716.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03717.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03718.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03721.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03722.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03723.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03724.html
   −
Functional Conception Of Quantificational Logic
+
====ZOT &bull; Ontology List &bull; Discussion====
   −
01. http://suo.ieee.org/ontology/msg03562.html
+
* http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/thrd35.html#03680
02. http://suo.ieee.org/ontology/msg03563.html
+
* http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/thrd35.html#03697
03. http://suo.ieee.org/ontology/msg03577.html
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03684.html
04. http://suo.ieee.org/ontology/msg03578.html
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03685.html
05. http://suo.ieee.org/ontology/msg03579.html
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03686.html
06. http://suo.ieee.org/ontology/msg03580.html
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03687.html
07. http://suo.ieee.org/ontology/msg03581.html
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03689.html
08. http://suo.ieee.org/ontology/msg03582.html
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03690.html
09. http://suo.ieee.org/ontology/msg03583.html
+
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03694.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03697.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03698.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03699.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03700.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03704.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03705.html
 +
# http://web.archive.org/web/20070330093628/http://suo.ieee.org/ontology/msg03709.html
 +
# http://web.archive.org/web/20080705071714/http://suo.ieee.org/ontology/msg03710.html
 +
# http://web.archive.org/web/20080620010020/http://suo.ieee.org/ontology/msg03711.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03713.html
 +
# http://web.archive.org/web/20080620074749/http://suo.ieee.org/ontology/msg03714.html
 +
# http://web.archive.org/web/20061005100254/http://suo.ieee.org/ontology/msg03719.html
 +
# http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg03720.html
   −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
===Mar 2003 &bull; Theme One Program &bull; Logical Cacti===
   −
Propositional Equation Reasoning Systems (PERS)
+
* http://web.archive.org/web/20150224210000/http://stderr.org/pipermail/inquiry/2003-March/thread.html#102
 +
* http://web.archive.org/web/20150224210000/http://stderr.org/pipermail/inquiry/2003-March/thread.html#114
 +
# http://web.archive.org/web/20081007043317/http://stderr.org/pipermail/inquiry/2003-March/000114.html
 +
# http://web.archive.org/web/20080908075558/http://stderr.org/pipermail/inquiry/2003-March/000115.html
 +
# http://web.archive.org/web/20080908080336/http://stderr.org/pipermail/inquiry/2003-March/000116.html
   −
01.  http://suo.ieee.org/email/msg04187.html
+
===Feb 2005 &bull; Theme One Program &bull; Logical Cacti===
02.  http://suo.ieee.org/email/msg04305.html
  −
03.  http://suo.ieee.org/email/msg04413.html
  −
04.  http://suo.ieee.org/email/msg04419.html
  −
05.  http://suo.ieee.org/email/msg04422.html
  −
06.  http://suo.ieee.org/email/msg04423.html
  −
07.  http://suo.ieee.org/email/msg04432.html
  −
08.  http://suo.ieee.org/email/msg04454.html
  −
09.  http://suo.ieee.org/email/msg04455.html
  −
10.  http://suo.ieee.org/email/msg04476.html
  −
11.  http://suo.ieee.org/email/msg04510.html
  −
12.  http://suo.ieee.org/email/msg04517.html
  −
13.  http://suo.ieee.org/email/msg04525.html
  −
14.  http://suo.ieee.org/email/msg04533.html
  −
15.  http://suo.ieee.org/email/msg04536.html
  −
16.  http://suo.ieee.org/email/msg04542.html
  −
17.  http://suo.ieee.org/email/msg04546.html
     −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
* http://web.archive.org/web/20150109155110/http://stderr.org/pipermail/inquiry/2005-February/thread.html#2348
 +
* http://web.archive.org/web/20150109155110/http://stderr.org/pipermail/inquiry/2005-February/thread.html#2360
 +
# http://web.archive.org/web/20150109152359/http://stderr.org/pipermail/inquiry/2005-February/002360.html
 +
# http://web.archive.org/web/20150109152401/http://stderr.org/pipermail/inquiry/2005-February/002361.html
 +
# http://web.archive.org/web/20061013233259/http://stderr.org/pipermail/inquiry/2005-February/002362.html
 +
# http://web.archive.org/web/20081121103109/http://stderr.org/pipermail/inquiry/2005-February/002363.html
   −
Reflective Extension Of Logical Graphs (RefLog)
+
[[Category:Artificial Intelligence]]
 
+
[[Category:Charles Sanders Peirce]]
01.  http://suo.ieee.org/email/msg05694.html
+
[[Category:Combinatorics]]
 
+
[[Category:Computer Science]]
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
[[Category:Cybernetics]]
 
+
[[Category:Equational Reasoning]]
Sequential Interactions Generating Hypotheses
+
[[Category:Formal Languages]]
 
+
[[Category:Formal Systems]]
01.  http://suo.ieee.org/email/msg02607.html
+
[[Category:Graph Theory]]
02.  http://suo.ieee.org/email/msg02608.html
+
[[Category:Knowledge Representation]]
03.  http://suo.ieee.org/email/msg03183.html
+
[[Category:Logic]]
 
+
[[Category:Logical Graphs]]
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
+
[[Category:Mathematics]]
 
+
[[Category:Philosophy]]
Sowa's Top Level Categories
+
[[Category:Semiotics]]
 
+
[[Category:Visualization]]
01.  http://suo.ieee.org/email/msg01949.html
  −
02.  http://suo.ieee.org/email/msg01956.html
  −
03.  http://suo.ieee.org/email/msg01966.html
  −
 
  −
04.  http://suo.ieee.org/ontology/msg00048.html
  −
05.  http://suo.ieee.org/ontology/msg00051.html
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Zeroth Order Logic (ZOL)
  −
 
  −
01.  http://suo.ieee.org/email/msg01246.html
  −
02.  http://suo.ieee.org/email/msg01406.html
  −
03.  http://suo.ieee.org/email/msg01546.html
  −
04.  http://suo.ieee.org/email/msg01561.html
  −
05.  http://suo.ieee.org/email/msg01670.html
  −
06.  http://suo.ieee.org/email/msg01739.html
  −
07.  http://suo.ieee.org/email/msg01966.html
  −
08.  http://suo.ieee.org/email/msg01985.html
  −
09.  http://suo.ieee.org/email/msg01988.html
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
 
  −
Zeroth Order Theories (ZOT's)
  −
 
  −
01.  http://suo.ieee.org/ontology/msg03680.html
  −
02.  http://suo.ieee.org/ontology/msg03681.html
  −
03.  http://suo.ieee.org/ontology/msg03682.html
  −
04.  http://suo.ieee.org/ontology/msg03683.html
  −
05.  http://suo.ieee.org/ontology/msg03685.html
  −
06.  http://suo.ieee.org/ontology/msg03687.html
  −
07.  http://suo.ieee.org/ontology/msg03689.html
  −
08.  http://suo.ieee.org/ontology/msg03691.html
  −
09.  http://suo.ieee.org/ontology/msg03693.html
  −
10.  http://suo.ieee.org/ontology/msg03694.html
  −
11.  http://suo.ieee.org/ontology/msg03695.html
  −
12.  http://suo.ieee.org/ontology/msg03696.html
  −
13.  http://suo.ieee.org/ontology/msg03700.html
  −
14.  http://suo.ieee.org/ontology/msg03701.html
  −
15.  http://suo.ieee.org/ontology/msg03702.html
  −
16.  http://suo.ieee.org/ontology/msg03703.html
  −
17.  http://suo.ieee.org/ontology/msg03705.html
  −
18.  http://suo.ieee.org/ontology/msg03706.html
  −
19.  http://suo.ieee.org/ontology/msg03707.html
  −
20.  http://suo.ieee.org/ontology/msg03708.html
  −
21.  http://suo.ieee.org/ontology/msg03709.html
  −
22.  http://suo.ieee.org/ontology/msg03711.html
  −
23.  http://suo.ieee.org/ontology/msg03712.html
  −
24.  http://suo.ieee.org/ontology/msg03715.html
  −
25.  http://suo.ieee.org/ontology/msg03716.html
  −
26.  http://suo.ieee.org/ontology/msg03717.html
  −
27.  http://suo.ieee.org/ontology/msg03718.html
  −
28.  http://suo.ieee.org/ontology/msg03720.html
  −
29.  http://suo.ieee.org/ontology/msg03721.html
  −
30.  http://suo.ieee.org/ontology/msg03722.html
  −
31.  http://suo.ieee.org/ontology/msg03723.html
  −
32.  http://suo.ieee.org/ontology/msg03724.html
  −
 
  −
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
  −
</pre>
 