===Computation Summary for Logical Equality===

Figure&nbsp;2.1 shows the expansion of <math>g = \texttt{((} u \texttt{,~} v \texttt{))}\!</math> over <math>[u, v]\!</math> to produce the expression:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{matrix}
uv & + & \texttt{(} u \texttt{)(} v \texttt{)}
\end{matrix}</math>
|}
Figure&nbsp;2.2 shows the expansion of <math>\mathrm{E}g = \texttt{((} u + \mathrm{d}u \texttt{,~} v + \mathrm{d}v \texttt{))}\!</math> over <math>[u, v]\!</math> to produce the expression:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{matrix}
uv \cdot \texttt{((} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{))} & + &
u \texttt{(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} & + &
\texttt{(} u \texttt{)} v \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} & + &
\texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{((} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{))}
\end{matrix}</math>
|}

In general, <math>\mathrm{E}g\!</math> tells you what you would have to do, from wherever you are in the universe <math>[u, v],\!</math> if you want to end up in a place where <math>g\!</math> is true.  In this case, where the prevailing proposition <math>g\!</math> is <math>\texttt{((} u \texttt{,~} v \texttt{))},\!</math> the component <math>uv \cdot \texttt{((} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{))}\!</math> of <math>\mathrm{E}g\!</math> tells you this:  If <math>u\!</math> and <math>v\!</math> are both true where you are, then change either both or neither of <math>u\!</math> and <math>v\!</math> at the same time, and you will attain a place where <math>\texttt{((} u \texttt{,~} v \texttt{))}\!</math> is true.
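The tabulated coefficients are easy to verify by brute force.  The following sketch is only an informal cross-check, not part of the formal development:  it reads <math>\texttt{((} u \texttt{,~} v \texttt{))}\!</math> as a test of equality over <math>\{ 0, 1 \},\!</math> reads the sum <math>u + \mathrm{d}u\!</math> as exclusive-or, and the helper names are ad hoc shorthand.

<pre>
# Sketch only:  brute-force check of the expansion of Eg shown in Figure 2.2.

def g(u, v):
    return int(u == v)            # g = ((u, v)), logical equality over {0, 1}

def Eg(u, v, du, dv):
    return g(u ^ du, v ^ dv)      # enlargement:  evaluate g at the displaced point

for u in (0, 1):
    for v in (0, 1):
        events = [(du, dv) for du in (0, 1) for dv in (0, 1) if Eg(u, v, du, dv)]
        print((u, v), events)

# Cells (1, 1) and (0, 0) list (0, 0) and (1, 1), i.e. ((du, dv));
# cells (1, 0) and (0, 1) list (0, 1) and (1, 0), i.e. (du, dv).
</pre>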
Figure&nbsp;2.3 shows the expansion of <math>\mathrm{D}g\!</math> over <math>[u, v]\!</math> to produce the expression:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{matrix}
uv \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} & + &
u \texttt{(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} & + &
\texttt{(} u \texttt{)} v \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} & + &
\texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)}
\end{matrix}</math>
|}

In general, <math>\mathrm{D}g\!</math> tells you what you would have to do, from wherever you are in the universe <math>[u, v],\!</math> if you want to bring about a change in the value of <math>g,\!</math> that is, if you want to get to a place where the value of <math>g\!</math> is different from what it is where you are.  In the present case, where the ruling proposition <math>g\!</math> is <math>\texttt{((} u \texttt{,~} v \texttt{))},\!</math> the term <math>uv \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)}\!</math> of <math>\mathrm{D}g\!</math> tells you this:  If <math>u\!</math> and <math>v\!</math> are both true where you are, then you would have to change one or the other but not both <math>u\!</math> and <math>v\!</math> in order to reach a place where the value of <math>g\!</math> is different from what it is where you are.
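The same check can be run on the difference map.  In the sketch below, again an informal illustration, the difference is taken as the exclusive-or of <math>\mathrm{E}g\!</math> and <math>g,\!</math> and every cell of <math>[u, v]\!</math> turns out to contribute the same coefficient <math>\texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)},\!</math> just as Figure&nbsp;2.3 indicates.

<pre>
# Sketch only:  Dg(u, v, du, dv) = Eg(u, v, du, dv) + g(u, v), with "+" as exclusive-or.

def g(u, v):
    return int(u == v)                    # g = ((u, v))

def Dg(u, v, du, dv):
    return g(u ^ du, v ^ dv) ^ g(u, v)    # difference map

for u in (0, 1):
    for v in (0, 1):
        events = [(du, dv) for du in (0, 1) for dv in (0, 1) if Dg(u, v, du, dv)]
        print((u, v), events)             # every cell prints [(0, 1), (1, 0)], i.e. (du, dv)
</pre>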
Figure&nbsp;2.4 approximates <math>\mathrm{D}g\!</math> by the linear form <math>{\mathrm{d}g}\!</math> that expands over <math>[u, v]\!</math> as follows:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{array}{*{9}{l}}
\mathrm{d}g
& = & uv \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)}
& + & u \texttt{(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)}
& + & \texttt{(} u \texttt{)} v \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)}
& + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)}
\\[8pt]
& = & \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)}
\end{array}</math>
|}
Figure&nbsp;2.5 shows what remains of the difference map <math>\mathrm{D}g\!</math> when the first order linear contribution <math>{\mathrm{d}g}\!</math> is removed, namely:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{array}{*{9}{l}}
\mathrm{r}g
& = & uv \cdot 0
& + & u \texttt{(} v \texttt{)} \cdot 0
& + & \texttt{(} u \texttt{)} v \cdot 0
& + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot 0
\\[8pt]
& = & 0
\end{array}</math>
|}
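The vanishing of the remainder can also be read off without any tabulation.  Taking <math>\mathrm{r}g\!</math> to be what is left of <math>\mathrm{D}g\!</math> once <math>{\mathrm{d}g}\!</math> is subtracted, and noting that subtraction over a boolean domain amounts to exclusive disjunction, the two identical expansions cancel term by term:

{| align="center" cellpadding="8" width="90%"
|
<math>\begin{matrix}
\mathrm{r}g & = & \mathrm{D}g + \mathrm{d}g & = & \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} + \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} & = & 0
\end{matrix}</math>
|}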
==Visualization==

In my work on [[Differential Logic and Dynamic Systems 2.0|Differential Logic and Dynamic Systems]], I found it useful to develop several different ways of visualizing logical transformations; indeed, I devised four distinct styles of picture for the job.  Thus far in our work on the mapping <math>F : [u, v] \to [u, v],\!</math> we've been making use of what I call the ''areal view'' of the extended universe of discourse, <math>[u, v, \mathrm{d}u, \mathrm{d}v],\!</math> but as the number of dimensions climbs beyond four, it's time to bid this genre adieu and look for a style that can scale a little better.  At any rate, before we proceed any further, let's first assemble the information that we have gathered about <math>F\!</math> from several different angles, and see if it can be fitted into a coherent picture of the transformation <math>F : (u, v) \mapsto ( ~ \texttt{((} u \texttt{)(} v \texttt{))} ~,~ \texttt{((} u \texttt{,~} v \texttt{))} ~ ).\!</math>

In our first crack at the transformation <math>F,\!</math> we simply plotted the state transitions and applied the utterly stock technique of calculating the finite differences.

{| align="center" cellpadding="8" style="text-align:center"
|
<math>\begin{array}{c|cc|cc|}
t & u & v & \mathrm{d}u & \mathrm{d}v \\[8pt]
0 & 1 & 1 & 0 & 0 \\
1 & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel
\end{array}</math>
|}

A quick inspection of the first Table suggests a rule to cover the case when <math>u = v = 1,\!</math> namely, <math>\mathrm{d}u = \mathrm{d}v = 0.\!</math>  To put it another way, the Table characterizes Orbit&nbsp;1 by means of the data:  <math>(u, v, \mathrm{d}u, \mathrm{d}v) = (1, 1, 0, 0).\!</math>  Another way to convey the same information is by means of the extended proposition:  <math>u v \texttt{(} \mathrm{d}u \texttt{)(} \mathrm{d}v \texttt{)}.\!</math>
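As a quick check on that reading, the extended proposition <math>u v \texttt{(} \mathrm{d}u \texttt{)(} \mathrm{d}v \texttt{)}\!</math> should single out exactly the point <math>(1, 1, 0, 0)\!</math> of the extended universe <math>[u, v, \mathrm{d}u, \mathrm{d}v].\!</math>  A minimal sketch, with the proposition hand-translated into boolean operators:

<pre>
# Sketch only:  u v (du)(dv)  translates to  u and v and (not du) and (not dv).
from itertools import product

models = [p for p in product((0, 1), repeat=4)
          if p[0] and p[1] and not p[2] and not p[3]]
print(models)     # [(1, 1, 0, 0)]  --  the data characterizing Orbit 1
</pre>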
    
{| align="center" cellpadding="8" style="text-align:center"
 
{| align="center" cellpadding="8" style="text-align:center"
Line 1,929: Line 1,942:  
|
 
|
 
<math>\begin{array}{c|cc|cc|cc|}
 
<math>\begin{array}{c|cc|cc|cc|}
t & u & v & du & dv & d^2 u & d^2 v \\[8pt]
+
t & u & v & \mathrm{d}u & \mathrm{d}v & \mathrm{d}^2 u & \mathrm{d}^2 v \\[8pt]
0 & 0 & 0 & 0 & 1 &     1 &     0 \\
+
0 & 0 & 0 & 0 & 1 & 1 & 0 \\
1 & 0 & 1 & 1 & 1 &     1 &     1 \\
+
1 & 0 & 1 & 1 & 1 & 1 & 1 \\
2 & 1 & 0 & 0 & 0 &     0 &     0 \\
+
2 & 1 & 0 & 0 & 0 & 0 & 0 \\
3 & '' & '' & '' & '' &   '' &   '' \\
+
3 & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel
 
\end{array}</math>
 
\end{array}</math>
 
|}
 
|}
   −
A more fine combing of the second Table brings to mind a rule that partly covers the remaining cases, that is, <math>\texttt{du~=~v}, ~\texttt{dv~=~(u)}.</math>  This much information about Orbit&nbsp;2 is also encapsulated by the extended proposition, <math>\texttt{(uv)((du, v))(dv, u)},</math> which says that <math>u\!</math> and <math>v\!</math> are not both true at the same time, while <math>du\!</math> is equal in value to <math>v\!</math> and <math>dv\!</math> is opposite in value to <math>u.\!</math>
+
A more fine combing of the second Table brings to mind a rule that partly covers the remaining cases, that is, <math>\mathrm{d}u = v, ~ \mathrm{d}v = \texttt{(} u \texttt{)}.\!</math>  This much information about Orbit&nbsp;2 is also encapsulated by the extended proposition <math>\texttt{(} uv \texttt{)((} \mathrm{d}u \texttt{,} v \texttt{))(} \mathrm{d}v, u \texttt{)},\!</math> which says that <math>u\!</math> and <math>v\!</math> are not both true at the same time, while <math>\mathrm{d}u\!</math> is equal in value to <math>v\!</math> and <math>\mathrm{d}v\!</math> is opposite in value to <math>u.\!</math>
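The rules gleaned from the two Tables are easy to confirm against the map itself.  The sketch below is an informal check, reading <math>\texttt{((} u \texttt{)(} v \texttt{))}\!</math> as inclusive disjunction and <math>\texttt{((} u \texttt{,~} v \texttt{))}\!</math> as logical equality:  it iterates <math>F\!</math> from the initial state <math>(0, 0),\!</math> recomputes the first differences, and verifies that <math>\mathrm{d}u = v\!</math> and <math>\mathrm{d}v = \texttt{(} u \texttt{)}\!</math> hold along Orbit&nbsp;2.

<pre>
# Sketch only:  F(u, v) = ( ((u)(v)), ((u, v)) ) = (u or v, u == v).
# First differences are taken mod 2:  du = u' + u,  dv = v' + v.

def F(u, v):
    return int(u or v), int(u == v)

u, v = 0, 0                                  # initial state of Orbit 2
for t in range(4):
    nu, nv = F(u, v)
    du, dv = u ^ nu, v ^ nv
    assert (du, dv) == (v, 1 - u)            # du = v  and  dv = (u)
    print(t, (u, v), (du, dv))
    u, v = nu, nv
</pre>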
    
==Turing Machine Example==

<font size="3">&#9758;</font> See [[Theme One Program]] for documentation of the cactus graph syntax and the propositional modeling program used below.

By way of providing a simple illustration of Cook's Theorem, namely, that &ldquo;Propositional Satisfiability is NP-Complete&rdquo;, I will describe one way to translate finite approximations of Turing machines into propositional expressions, using the cactus language syntax for propositional calculus that I will describe in more detail as we proceed.

A Turing machine for computing the parity of a bit string is described by means of the following Figure and Table.
{| align="center" border="0" cellspacing="10" style="text-align:center; width:100%"
| [[Image:Parity_Machine.jpg|400px]]
|-
| height="20px" valign="top" | <math>\text{Figure 3.} ~~ \text{Parity Machine}\!</math>
|}
{| align="center" border="0" cellpadding="10"
|
<pre>
Table 4.  Parity Machine
o-------o--------o-------------o---------o------------o
| State | Symbol | Next Symbol | Ratchet | Next State |
o-------o--------o-------------o---------o------------o
|   0   |   0    |      0      |   +1    |     0      |
|   0   |   1    |      1      |   +1    |     1      |
|   0   |   #    |      #      |   -1    |     #      |
|   1   |   0    |      0      |   +1    |     1      |
|   1   |   1    |      1      |   +1    |     0      |
|   1   |   #    |      #      |   -1    |     *      |
o-------o--------o-------------o---------o------------o
</pre>
|}

The output of <math>\mathrm{Stunt}(2)\!</math> being the symbol that rests under the tape head <math>\mathrm{H}\!</math> when and if the machine <math>\mathrm{M}\!</math> reaches one of its resting states, we get the result that <math>\mathrm{Parity}(1) = 1.\!</math>
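Before turning to the propositional encoding, it may help to see the machine run directly.  The following sketch is only an informal simulation, not the propositional translation itself:  the transition rules are transcribed from the Figure and Table above, the resting states are taken to be the states labeled &ldquo;#&rdquo; and &ldquo;*&rdquo;, and the starting configuration (state 0, head on the first input bit, blank cells marked &ldquo;#&rdquo; on either side) is an assumption about the intended setup.  Run on the one-bit input 1, it returns the symbol under the head at rest, agreeing with <math>\mathrm{Parity}(1) = 1.\!</math>

<pre>
# Sketch only:  direct simulation of the parity machine M described above.

RULES = {                         # (state, symbol) -> (next symbol, ratchet, next state)
    ('0', '0'): ('0', +1, '0'),
    ('0', '1'): ('1', +1, '1'),
    ('0', '#'): ('#', -1, '#'),
    ('1', '0'): ('0', +1, '1'),
    ('1', '1'): ('1', +1, '0'),
    ('1', '#'): ('#', -1, '*'),
}

def run(bits, steps=20):
    tape = list('#' + bits + '#')         # input padded with blank '#' cells
    head, state = 1, '0'                  # start in state 0 on the first input bit
    for _ in range(steps):
        if state in ('#', '*'):           # resting state reached
            return tape[head]             # output convention:  symbol under the head
        symbol, move, state = RULES[(state, tape[head])]
        tape[head] = symbol
        head += move
    return None                           # no resting state within the step bound

print(run('1'))                           # prints 1, so Parity(1) = 1
</pre>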
==Document History==

* http://forum.wolframscience.com/archive/topic/228-1.html
* http://forum.wolframscience.com/showthread.php?threadid=228
* http://forum.wolframscience.com/printthread.php?threadid=228&perpage=50

# http://forum.wolframscience.com/showthread.php?postid=664#post664
# http://forum.wolframscience.com/showthread.php?postid=666#post666