Line 117: |
Line 117: |
| 3 & 1 & 0 & 0 \\ | | 3 & 1 & 0 & 0 \\ |
| 4 & 1 & 0 & 0 \\ | | 4 & 1 & 0 & 0 \\ |
− | 5 & {}^{\shortparallel} & {}^{\shortparallel} & {}^{\shortparallel} | + | 5 & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel |
| \end{array}</math> | | \end{array}</math> |
| |} | | |} |
Line 134: |
Line 134: |
| 3 & 1 & 0 & 0 \\ | | 3 & 1 & 0 & 0 \\ |
| 4 & 1 & 0 & 0 \\ | | 4 & 1 & 0 & 0 \\ |
− | 5 & {}^{\shortparallel} & {}^{\shortparallel} & {}^{\shortparallel} | + | 5 & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel |
| \end{array}</math> | | \end{array}</math> |
| |} | | |} |
Line 247: |
Line 247: |
| |+ style="height:30px" | <math>\text{Table 1.} ~~ \text{Syntax and Semantics of a Calculus for Propositional Logic}\!</math> | | |+ style="height:30px" | <math>\text{Table 1.} ~~ \text{Syntax and Semantics of a Calculus for Propositional Logic}\!</math> |
| |- style="height:40px; background:ghostwhite" | | |- style="height:40px; background:ghostwhite" |
| + | | <math>\text{Graph}\!</math> |
| | <math>\text{Expression}~\!</math> | | | <math>\text{Expression}~\!</math> |
| | <math>\text{Interpretation}\!</math> | | | <math>\text{Interpretation}\!</math> |
| | <math>\text{Other Notations}\!</math> | | | <math>\text{Other Notations}\!</math> |
| |- | | |- |
− | | | + | | height="100px" | [[Image:Cactus Node Big Fat.jpg|20px]] |
− | | <math>\text{True}\!</math> | + | | <math>~</math> |
| + | | <math>\operatorname{true}</math> |
| | <math>1\!</math> | | | <math>1\!</math> |
| |- | | |- |
− | | <math>\texttt{(~)}\!</math> | + | | height="100px" | [[Image:Cactus Spike Big Fat.jpg|20px]] |
− | | <math>\text{False}\!</math> | + | | <math>\texttt{(~)}</math> |
| + | | <math>\operatorname{false}</math> |
| | <math>0\!</math> | | | <math>0\!</math> |
| |- | | |- |
− | | <math>x\!</math> | + | | height="100px" | [[Image:Cactus A Big.jpg|20px]] |
− | | <math>x\!</math> | + | | <math>a\!</math> |
− | | <math>x\!</math> | + | | <math>a\!</math> |
| + | | <math>a\!</math> |
| |- | | |- |
− | | <math>\texttt{(} x \texttt{)}\!</math> | + | | height="120px" | [[Image:Cactus (A) Big.jpg|20px]] |
− | | <math>\text{Not}~ x\!</math> | + | | <math>\texttt{(} a \texttt{)}~</math> |
− | | | + | | <math>\operatorname{not}~ a</math> |
− | <math>\begin{matrix} x' \\ \tilde{x} \\ \lnot x \end{matrix}\!</math> | + | | <math>\lnot a \quad \bar{a} \quad \tilde{a} \quad a^\prime</math> |
| |- | | |- |
− | | <math>x~y~z\!</math> | + | | height="100px" | [[Image:Cactus ABC Big.jpg|50px]] |
− | | <math>x ~\text{and}~ y ~\text{and}~ z\!</math> | + | | <math>a ~ b ~ c</math> |
− | | <math>x \land y \land z\!</math> | + | | <math>a ~\operatorname{and}~ b ~\operatorname{and}~ c</math> |
| + | | <math>a \land b \land c</math> |
| |- | | |- |
− | | <math>\texttt{((} x \texttt{)(} y \texttt{)(} z \texttt{))}\!</math> | + | | height="160px" | [[Image:Cactus ((A)(B)(C)) Big.jpg|65px]] |
− | | <math>x ~\text{or}~ y ~\text{or}~ z\!</math> | + | | <math>\texttt{((} a \texttt{)(} b \texttt{)(} c \texttt{))}</math> |
− | | <math>x \lor y \lor z\!</math> | + | | <math>a ~\operatorname{or}~ b ~\operatorname{or}~ c</math> |
| + | | <math>a \lor b \lor c</math> |
| |- | | |- |
− | | <math>\texttt{(} x ~ \texttt{(} y \texttt{))}\!</math> | + | | height="120px" | [[Image:Cactus (A(B)) Big.jpg|60px]] |
| + | | <math>\texttt{(} a \texttt{(} b \texttt{))}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | x ~\text{implies}~ y | + | a ~\operatorname{implies}~ b |
− | \\ | + | \\[6pt] |
− | \mathrm{If}~ x ~\text{then}~ y | + | \operatorname{if}~ a ~\operatorname{then}~ b |
| \end{matrix}</math> | | \end{matrix}</math> |
− | | <math>x \Rightarrow y\!</math> | + | | <math>a \Rightarrow b</math> |
| |- | | |- |
− | | <math>\texttt{(} x \texttt{,} y \texttt{)}\!</math> | + | | height="120px" | [[Image:Cactus (A,B) Big ISW.jpg|65px]] |
| + | | <math>\texttt{(} a \texttt{,} b \texttt{)}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | x ~\text{not equal to}~ y | + | a ~\operatorname{not~equal~to}~ b |
− | \\ | + | \\[6pt] |
− | x ~\text{exclusive or}~ y | + | a ~\operatorname{exclusive~or}~ b |
| \end{matrix}</math> | | \end{matrix}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | x \ne y | + | a \neq b |
− | \\ | + | \\[6pt] |
− | x + y | + | a + b |
| \end{matrix}</math> | | \end{matrix}</math> |
| |- | | |- |
− | | <math>\texttt{((} x \texttt{,} y \texttt{))}\!</math> | + | | height="160px" | [[Image:Cactus ((A,B)) Big.jpg|65px]] |
| + | | <math>\texttt{((} a \texttt{,} b \texttt{))}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | x ~\text{is equal to}~ y | + | a ~\operatorname{is~equal~to}~ b |
− | \\ | + | \\[6pt] |
− | x ~\text{if and only if}~ y | + | a ~\operatorname{if~and~only~if}~ b |
| \end{matrix}</math> | | \end{matrix}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | x = y | + | a = b |
− | \\ | + | \\[6pt] |
− | x \Leftrightarrow y | + | a \Leftrightarrow b |
| \end{matrix}</math> | | \end{matrix}</math> |
| |- | | |- |
− | | <math>\texttt{(} x \texttt{,} y \texttt{,} z \texttt{)}\!</math> | + | | height="120px" | [[Image:Cactus (A,B,C) Big.jpg|65px]] |
| + | | <math>\texttt{(} a \texttt{,} b \texttt{,} c \texttt{)}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | \text{Just one of} | + | \operatorname{just~one~of} |
| \\ | | \\ |
− | x, y, z | + | a, b, c |
| \\ | | \\ |
− | \text{is false}. | + | \operatorname{is~false}. |
| \end{matrix}</math> | | \end{matrix}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | x'y~z~ & \lor | + | & \bar{a} ~ b ~ c |
| \\ | | \\ |
− | x~y'z~ & \lor | + | \lor & a ~ \bar{b} ~ c |
| \\ | | \\ |
− | x~y~z' & | + | \lor & a ~ b ~ \bar{c} |
| \end{matrix}</math> | | \end{matrix}</math> |
| |- | | |- |
− | | <math>\texttt{((} x \texttt{),(} y \texttt{),(} z \texttt{))}\!</math> | + | | height="160px" | [[Image:Cactus ((A),(B),(C)) Big.jpg|65px]] |
| + | | <math>\texttt{((} a \texttt{),(} b \texttt{),(} c \texttt{))}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | \text{Just one of} | + | \operatorname{just~one~of} |
| \\ | | \\ |
− | x, y, z | + | a, b, c |
| \\ | | \\ |
− | \text{is true}. | + | \operatorname{is~true}. |
− | \\ | + | \\[6pt] |
− | & | + | \operatorname{partition~all} |
− | \\ | |
− | \text{Partition all} | |
| \\ | | \\ |
− | \text{into}~ x, y, z. | + | \operatorname{into}~ a, b, c. |
| \end{matrix}</math> | | \end{matrix}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | x~y'z' & \lor | + | & a ~ \bar{b} ~ \bar{c} |
| \\ | | \\ |
− | x'y~z' & \lor | + | \lor & \bar{a} ~ b ~ \bar{c} |
| \\ | | \\ |
− | x'y'z~ & | + | \lor & \bar{a} ~ \bar{b} ~ c |
| \end{matrix}</math> | | \end{matrix}</math> |
| |- | | |- |
| + | | height="160px" | [[Image:Cactus (A,(B,C)) Big.jpg|90px]] |
| + | | <math>\texttt{(} a \texttt{,(} b \texttt{,} c \texttt{))}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | \texttt{((} x \texttt{,} y \texttt{),} z \texttt{)} | + | \operatorname{oddly~many~of} |
| \\ | | \\ |
− | & | + | a, b, c |
| \\ | | \\ |
− | \texttt{(} x \texttt{,(} y \texttt{,} z \texttt{))} | + | \operatorname{are~true}. |
− | \end{matrix}\!</math> | + | \end{matrix}</math> |
| | | | | |
− | <math>\begin{matrix} \text{Oddly many of} \\ x, y, z \\ \text{are true}. \end{matrix}\!</math> | + | <p><math>a + b + c\!</math></p> |
− | | |
− | <p><math>x + y + z\!</math></p> | |
| <br> | | <br> |
| <p><math>\begin{matrix} | | <p><math>\begin{matrix} |
− | x~y~z~ & \lor | + | & a ~ b ~ c |
| \\ | | \\ |
− | x~y'z' & \lor | + | \lor & a ~ \bar{b} ~ \bar{c} |
| \\ | | \\ |
− | x'y~z' & \lor | + | \lor & \bar{a} ~ b ~ \bar{c} |
| \\ | | \\ |
− | x'y'z~ & | + | \lor & \bar{a} ~ \bar{b} ~ c |
− | \end{matrix}\!</math></p> | + | \end{matrix}</math></p> |
| |- | | |- |
− | | <math>\texttt{(} w \texttt{,(} x \texttt{),(} y \texttt{),(} z \texttt{))}\!</math> | + | | height="160px" | [[Image:Cactus (X,(A),(B),(C)) Big.jpg|90px]] |
| + | | <math>\texttt{(} x \texttt{,(} a \texttt{),(} b \texttt{),(} c \texttt{))}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | \text{Partition}~ w | + | \operatorname{partition}~ x |
| \\ | | \\ |
− | \text{into}~ x, y, z. | + | \operatorname{into}~ a, b, c. |
− | \\ | + | \\[6pt] |
− | & | + | \operatorname{genus}~ x ~\operatorname{comprises} |
| \\ | | \\ |
− | \text{Genus}~ w ~\text{comprises} | + | \operatorname{species}~ a, b, c. |
− | \\ | |
− | \text{species}~ x, y, z. | |
| \end{matrix}</math> | | \end{matrix}</math> |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | w'x'y'z' & \lor
| + | & \bar{x} ~ \bar{a} ~ \bar{b} ~ \bar{c} |
| \\ | | \\ |
− | w~x~y'z' & \lor
| + | \lor & x ~ a ~ \bar{b} ~ \bar{c} |
| \\ | | \\ |
− | w~x'y~z' & \lor
| + | \lor & x ~ \bar{a} ~ b ~ \bar{c} |
| \\ | | \\ |
− | w~x'y'z~ &
| + | \lor & x ~ \bar{a} ~ \bar{b} ~ c |
| \end{matrix}</math> | | \end{matrix}</math> |
| |} | | |} |
Line 763: |
Line 759: |
| A sketch of this work is presented in the following series of Figures, where each logical proposition is expanded over the basic cells <math>uv, u \texttt{(} v \texttt{)}, \texttt{(} u \texttt{)} v, \texttt{(} u \texttt{)(} v \texttt{)}\!</math> of the 2-dimensional universe of discourse <math>U^\bullet = [u, v].\!</math> | | A sketch of this work is presented in the following series of Figures, where each logical proposition is expanded over the basic cells <math>uv, u \texttt{(} v \texttt{)}, \texttt{(} u \texttt{)} v, \texttt{(} u \texttt{)(} v \texttt{)}\!</math> of the 2-dimensional universe of discourse <math>U^\bullet = [u, v].\!</math> |
| | | |
− | ===Computation Summary : <math>f(u, v) = \texttt{((u)(v))}</math>=== | + | ===Computation Summary for Logical Disjunction=== |
| | | |
− | The venn diagram in Figure 1.1 shows how the proposition <math>f = \texttt{((u)(v))}</math> can be expanded over the universe of discourse <math>[u, v]\!</math> to produce a logically equivalent exclusive disjunction, namely, <math>\texttt{uv~+~u(v)~+~(u)v}.</math> | + | The venn diagram in Figure 1.1 shows how the proposition <math>f = \texttt{((} u \texttt{)(} v \texttt{))}\!</math> can be expanded over the universe of discourse <math>[u, v]\!</math> to produce a logically equivalent exclusive disjunction, namely, <math>uv + u \texttt{(} v \texttt{)} + \texttt{(} u \texttt{)} v.\!</math> |
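Since the three terms <math>uv, ~ u \texttt{(} v \texttt{)}, ~ \texttt{(} u \texttt{)} v\!</math> are disjoint cells, the equivalence can be checked mechanically by evaluating both sides at every point of <math>[u, v].\!</math> Here is a minimal sketch in Python, reading juxtaposition as conjunction, <math>\texttt{(} x \texttt{)}\!</math> as negation, and <math>+\!</math> as addition mod 2:

<pre>
# Check that ((u)(v)) = uv + u(v) + (u)v over [u, v], reading juxtaposition
# as AND, (x) as NOT x, and + as addition mod 2 (exclusive disjunction).

def f(u, v):                 # ((u)(v)) : logical disjunction
    return u | v

def expansion(u, v):         # uv + u(v) + (u)v : sum of disjoint cells
    return (u & v) ^ (u & (1 - v)) ^ ((1 - u) & v)

for u in (0, 1):
    for v in (0, 1):
        assert f(u, v) == expansion(u, v)
print("((u)(v)) agrees with uv + u(v) + (u)v on every cell of [u, v]")
</pre>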
| | | |
| {| align="center" border="0" cellpadding="10" | | {| align="center" border="0" cellpadding="10" |
Line 811: |
Line 807: |
| |} | | |} |
| | | |
− | Figure 1.2 expands <math>\mathrm{E}f = \texttt{((u + du)(v + dv))}</math> over <math>[u, v]\!</math> to give: | + | Figure 1.2 expands <math>\mathrm{E}f = \texttt{((} u + \mathrm{d}u \texttt{)(} v + \mathrm{d}v \texttt{))}\!</math> over <math>[u, v]\!</math> to give: |
| | | |
− | {| align="center" cellpadding="8" width="90%" | + | {| align="center" cellpadding="8" style="text-align:center; width:100%" |
− | | <math>\texttt{uv~(du~dv) ~+~ u(v)~(du (dv)) ~+~ (u)v~((du) dv) ~+~ (u)(v)~((du)(dv))}</math> | + | | |
| + | <math>\begin{matrix} |
| + | \mathrm{E}\texttt{((} u \texttt{)(} v \texttt{))} |
| + | & = & uv \cdot \texttt{(} \mathrm{d}u ~ \mathrm{d}v \texttt{)} |
| + | & + & u \texttt{(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{(} \mathrm{d}v \texttt{))} |
| + | & + & \texttt{(} u \texttt{)} v \cdot \texttt{((} \mathrm{d}u \texttt{)} \mathrm{d}v \texttt{)} |
| + | & + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{((} \mathrm{d}u \texttt{)(} \mathrm{d}v \texttt{))} |
| + | \end{matrix}</math> |
| |} | | |} |
| | | |
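The enlarged proposition <math>\mathrm{E}f\!</math> comes from substituting <math>u + \mathrm{d}u\!</math> for <math>u\!</math> and <math>v + \mathrm{d}v\!</math> for <math>v\!</math> in <math>f,\!</math> the sums being taken mod 2. The following sketch performs that substitution directly and checks it against the cell-by-cell expansion of Figure 1.2:

<pre>
# Enlargement operator: Ef(u, v, du, dv) = f(u + du, v + dv), sums taken mod 2.
# Check it against the expansion of Figure 1.2, reading (x) as NOT x and
# juxtaposition as AND.

def f(u, v):                # ((u)(v)) : u or v
    return u | v

def NOT(x):
    return 1 - x

def Ef(u, v, du, dv):
    return f(u ^ du, v ^ dv)

def Ef_expansion(u, v, du, dv):
    t1 = (u & v)           & NOT(du & dv)          # uv (du dv)
    t2 = (u & NOT(v))      & NOT(du & NOT(dv))     # u(v) (du (dv))
    t3 = (NOT(u) & v)      & NOT(NOT(du) & dv)     # (u)v ((du) dv)
    t4 = (NOT(u) & NOT(v)) & (du | dv)             # (u)(v) ((du)(dv))
    return t1 ^ t2 ^ t3 ^ t4

for n in range(16):
    u, v, du, dv = (n >> 3) & 1, (n >> 2) & 1, (n >> 1) & 1, n & 1
    assert Ef(u, v, du, dv) == Ef_expansion(u, v, du, dv)
print("Ef matches its expansion at all 16 points of [u, v, du, dv]")
</pre>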
Line 861: |
Line 864: |
| |} | | |} |
| | | |
− | Figure 1.3 expands <math>\mathrm{D}f = f + \mathrm{E}f</math> over <math>[u, v]\!</math> to produce: | + | Figure 1.3 expands <math>\mathrm{D}f = f + \mathrm{E}f\!</math> over <math>[u, v]\!</math> to produce: |
| | | |
− | {| align="center" cellpadding="8" width="90%" | + | {| align="center" cellpadding="8" style="text-align:center; width:100%" |
− | | <math>\texttt{uv~du~dv ~+~ u(v)~du(dv) ~+~ (u)v~(du)dv ~+~ (u)(v)~((du)(dv))}</math> | + | | |
| + | <math>\begin{matrix} |
| + | \mathrm{D}\texttt{((} u \texttt{)(} v \texttt{))} |
| + | & = & uv \cdot \mathrm{d}u ~ \mathrm{d}v |
| + | & + & u \texttt{(} v \texttt{)} \cdot \mathrm{d}u \texttt{(} \mathrm{d}v \texttt{)} |
| + | & + & \texttt{(} u \texttt{)} v \cdot \texttt{(} \mathrm{d}u \texttt{)} \mathrm{d}v |
| + | & + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{((} \mathrm{d}u \texttt{)(} \mathrm{d}v \texttt{))} |
| + | \end{matrix}</math> |
| |} | | |} |
| | | |
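The difference map can be checked the same way, computing <math>\mathrm{D}f\!</math> simply as the mod 2 sum <math>f + \mathrm{E}f\!</math> and comparing it with the expansion of Figure 1.3:

<pre>
# Difference operator: Df = f + Ef (sum mod 2), checked against Figure 1.3.

def f(u, v):                # ((u)(v)) : u or v
    return u | v

def NOT(x):
    return 1 - x

def Df(u, v, du, dv):
    return f(u, v) ^ f(u ^ du, v ^ dv)

def Df_expansion(u, v, du, dv):
    t1 = (u & v)           & (du & dv)         # uv du dv
    t2 = (u & NOT(v))      & (du & NOT(dv))    # u(v) du (dv)
    t3 = (NOT(u) & v)      & (NOT(du) & dv)    # (u)v (du) dv
    t4 = (NOT(u) & NOT(v)) & (du | dv)         # (u)(v) ((du)(dv))
    return t1 ^ t2 ^ t3 ^ t4

assert all(Df(u, v, du, dv) == Df_expansion(u, v, du, dv)
           for u in (0, 1) for v in (0, 1) for du in (0, 1) for dv in (0, 1))
print("Df = f + Ef matches its expansion at all 16 points")
</pre>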
Line 913: |
Line 923: |
| I'll break this here in case anyone wants to try and do the work for <math>g\!</math> on their own. | | I'll break this here in case anyone wants to try and do the work for <math>g\!</math> on their own. |
| | | |
− | ===Computation Summary : <math>g(u, v) = \texttt{((u,~v))}</math>=== | + | ===Computation Summary for Logical Equality=== |
| | | |
− | The venn diagram in Figure 2.1 shows how the proposition <math>g = \texttt{((u,~v))}</math> can be expanded over the universe of discourse <math>[u, v]\!</math> to produce a logically equivalent exclusive disjunction, namely, <math>\texttt{uv ~+~ (u)(v)}.</math> | + | The venn diagram in Figure 2.1 shows how the proposition <math>g = \texttt{((} u \texttt{,~} v \texttt{))}\!</math> can be expanded over the universe of discourse <math>[u, v]\!</math> to produce a logically equivalent exclusive disjunction, namely, <math>uv + \texttt{(} u \texttt{)(} v \texttt{)}.\!</math> |
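Reading <math>\texttt{(} u \texttt{,~} v \texttt{)}\!</math> as ''exactly one of <math>u, v\!</math> is true'' makes <math>g\!</math> the logical equality of <math>u\!</math> and <math>v,\!</math> and the expansion can be verified just as before:

<pre>
# Check that ((u, v)) = uv + (u)(v), reading (u, v) as exclusive disjunction,
# so that ((u, v)) is logical equality; + is addition mod 2.

def g(u, v):                 # ((u, v)) : u equals v
    return 1 - (u ^ v)

def expansion(u, v):         # uv + (u)(v)
    return (u & v) ^ ((1 - u) & (1 - v))

for u in (0, 1):
    for v in (0, 1):
        assert g(u, v) == expansion(u, v)
print("((u, v)) agrees with uv + (u)(v) on every cell of [u, v]")
</pre>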
| | | |
| {| align="center" border="0" cellpadding="10" | | {| align="center" border="0" cellpadding="10" |
Line 961: |
Line 971: |
| |} | | |} |
| | | |
− | Figure 2.2 expands <math>\mathrm{E}g = \texttt{((u + du,~v + dv))}</math> over <math>[u, v]\!</math> to give: | + | Figure 2.2 expands <math>\mathrm{E}g = \texttt{((} u + \mathrm{d}u \texttt{,~} v + \mathrm{d}v \texttt{))}\!</math> over <math>[u, v]\!</math> to give: |
| | | |
− | {| align="center" cellpadding="8" width="90%" | + | {| align="center" cellpadding="8" style="text-align:center; width:100%" |
− | | <math>\texttt{uv~((du,~dv)) ~+~ u(v)~(du,~dv) ~+~ (u)v~(du,~dv) ~+~ (u)(v)~((du,~dv))}</math> | + | | |
| + | <math>\begin{matrix} |
| + | \mathrm{E}\texttt{((} u \texttt{,~} v \texttt{))} |
| + | & = & uv \cdot \texttt{((} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{))} |
| + | & + & u \texttt{(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| + | & + & \texttt{(} u \texttt{)} v \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| + | & + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{((} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{))} |
| + | \end{matrix}</math> |
| |} | | |} |
| | | |
Line 1,011: |
Line 1,028: |
| |} | | |} |
| | | |
− | Figure 2.3 expands <math>\mathrm{D}g = g + \mathrm{E}g</math> over <math>[u, v]\!</math> to yield the form: | + | Figure 2.3 expands <math>\mathrm{D}g = g + \mathrm{E}g\!</math> over <math>[u, v]\!</math> to yield the form: |
| | | |
− | {| align="center" cellpadding="8" width="90%" | + | {| align="center" cellpadding="8" style="text-align:center; width:100%" |
− | | <math>\texttt{uv~(du,~dv) ~+~ u(v)~(du,~dv) ~+~ (u)v~(du,~dv) ~+~ (u)(v)~(du,~dv)}\!</math> | + | | |
| + | <math>\begin{matrix} |
| + | \mathrm{D}\texttt{((} u \texttt{,~} v \texttt{))} |
| + | & = & uv \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| + | & + & u \texttt{(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| + | & + & \texttt{(} u \texttt{)} v \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| + | & + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| + | \end{matrix}</math> |
| |} | | |} |
| | | |
Line 1,082: |
Line 1,106: |
| | | | | |
| <math>\begin{array}{lllll} | | <math>\begin{array}{lllll} |
− | F & = & (f, g) & = & ( ~\texttt{((u)(v))}~ , ~\texttt{((u,~v))}~ ). | + | F |
| + | & = & (f, g) |
| + | & = & ( ~ \texttt{((} u \texttt{)(} v \texttt{))} ~,~ \texttt{((} u \texttt{,~} v \texttt{))} ~ ) |
| \end{array}</math> | | \end{array}</math> |
| |} | | |} |
Line 1,088: |
Line 1,114: |
| To speed things along, I will skip a mass of motivating discussion and just exhibit the simplest form of a differential <math>\mathrm{d}F\!</math> for the current example of a logical transformation <math>F,\!</math> after which the majority of the easiest questions will have been answered in visually intuitive terms. | | To speed things along, I will skip a mass of motivating discussion and just exhibit the simplest form of a differential <math>\mathrm{d}F\!</math> for the current example of a logical transformation <math>F,\!</math> after which the majority of the easiest questions will have been answered in visually intuitive terms. |
| | | |
− | For <math>F = (f, g)\!</math> we have <math>\mathrm{d}F = (\mathrm{d}f, \mathrm{d}g),</math> and so we can proceed componentwise, patching the pieces back together at the end. | + | For <math>F = (f, g)\!</math> we have <math>\mathrm{d}F = (\mathrm{d}f, \mathrm{d}g),\!</math> and so we can proceed componentwise, patching the pieces back together at the end. |
| | | |
| We have prepared the ground already by computing these terms: | | We have prepared the ground already by computing these terms: |
Line 1,095: |
Line 1,121: |
| | | | | |
| <math>\begin{array}{lll} | | <math>\begin{array}{lll} |
− | \mathrm{E}f & = & \texttt{(( u + du )( v + dv ))} | + | \mathrm{E}f & = & \texttt{((} u + \mathrm{d}u \texttt{)(} v + \mathrm{d}v \texttt{))} |
− | \\ \\ | + | \\[8pt] |
− | \mathrm{E}g & = & \texttt{(( u + du ,~ v + dv ))} | + | \mathrm{E}g & = & \texttt{((} u + \mathrm{d}u \texttt{,~} v + \mathrm{d}v \texttt{))} |
− | \\ \\ | + | \\[8pt] |
− | \mathrm{D}f & = & \texttt{((u)(v)) ~+~ (( u + du )( v + dv ))} | + | \mathrm{D}f & = & \texttt{((} u \texttt{)(} v \texttt{))} ~+~ \texttt{((} u + \mathrm{d}u \texttt{)(} v + \mathrm{d}v \texttt{))} |
− | \\ \\ | + | \\[8pt] |
− | \mathrm{D}g & = & \texttt{((u,~v)) ~+~ (( u + du ,~ v + dv ))} | + | \mathrm{D}g & = & \texttt{((} u \texttt{,~} v \texttt{))} ~+~ \texttt{((} u + \mathrm{d}u \texttt{,~} v + \mathrm{d}v \texttt{))} |
− | \end{array}\!</math> | + | \end{array}</math> |
| |} | | |} |
| | | |
− | As a matter of fact, computing the symmetric differences <math>\mathrm{D}f = f + \mathrm{E}f</math> and <math>\mathrm{D}g = g + \mathrm{E}g</math> has already taken care of the ''localizing'' part of the task by subtracting out the forms of <math>f\!</math> and <math>g\!</math> from the forms of <math>\mathrm{E}f</math> and <math>\mathrm{E}g,</math> respectively. Thus all we have left to do is to decide what linear propositions best approximate the difference maps <math>\mathrm{D}f</math> and <math>\mathrm{D}g,</math> respectively. | + | As a matter of fact, computing the symmetric differences <math>\mathrm{D}f = f + \mathrm{E}f\!</math> and <math>\mathrm{D}g = g + \mathrm{E}g\!</math> has already taken care of the ''localizing'' part of the task by subtracting out the forms of <math>f\!</math> and <math>g\!</math> from the forms of <math>\mathrm{E}f\!</math> and <math>\mathrm{E}g,\!</math> respectively. Thus all we have left to do is to decide what linear propositions best approximate the difference maps <math>{\mathrm{D}f}\!</math> and <math>{\mathrm{D}g},\!</math> respectively. |
| | | |
| This raises the question: What is a linear proposition? | | This raises the question: What is a linear proposition? |
| | | |
− | The answer that makes the most sense in this context is this: A proposition is just a boolean-valued function, so a linear proposition is a linear function into the boolean space <math>\mathbb{B}.</math> | + | The answer that makes the most sense in this context is this: A proposition is just a boolean-valued function, so a linear proposition is a linear function into the boolean space <math>\mathbb{B}.\!</math> |
| | | |
− | In particular, the linear functions that we want will be linear functions in the differential variables <math>du\!</math> and <math>dv.\!</math> | + | In particular, the linear functions that we want will be linear functions in the differential variables <math>\mathrm{d}u\!</math> and <math>\mathrm{d}v.\!</math> |
| | | |
− | As it turns out, there are just four linear propositions in the associated ''differential universe'' <math>\mathrm{d}U^\bullet = [du, dv],</math> and these are the propositions that are commonly denoted: <math>\texttt{0}, \texttt{du}, \texttt{dv}, \texttt{du + dv},</math> in other words, <math>\texttt{()}, \texttt{du}, \texttt{dv}, \texttt{(du, dv)}.</math> | + | As it turns out, there are just four linear propositions in the associated ''differential universe'' <math>\mathrm{d}U^\bullet = [\mathrm{d}u, \mathrm{d}v].\!</math> These are the propositions that are commonly denoted: <math>{0, ~\mathrm{d}u, ~\mathrm{d}v, ~\mathrm{d}u + \mathrm{d}v},\!</math> in other words: <math>\texttt{(~)}, ~\mathrm{d}u, ~\mathrm{d}v, ~\texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)}.\!</math> |
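The count of four can be confirmed by brute force: among the sixteen boolean functions on <math>[\mathrm{d}u, \mathrm{d}v]\!</math> only those of the form <math>a \cdot \mathrm{d}u + b \cdot \mathrm{d}v\!</math> respect mod 2 addition in their arguments. A short sketch:

<pre>
# Count the linear propositions on the differential universe [du, dv]:
# functions h with h(x + a, y + b) = h(x, y) + h(a, b), all sums mod 2.

from itertools import product

points = list(product((0, 1), repeat=2))       # assignments to (du, dv)

def is_linear(h):
    return all(h[(x ^ a, y ^ b)] == h[(x, y)] ^ h[(a, b)]
               for (x, y) in points for (a, b) in points)

names = {(0, 0, 0, 0): "0", (0, 0, 1, 1): "du",
         (0, 1, 0, 1): "dv", (0, 1, 1, 0): "du + dv"}

linear = []
for values in product((0, 1), repeat=4):       # all 16 truth tables
    h = dict(zip(points, values))
    if is_linear(h):
        linear.append(names[values])

print(len(linear), "linear propositions:", linear)
</pre>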
| | | |
| ==Notions of Approximation== | | ==Notions of Approximation== |
Line 1,140: |
Line 1,166: |
| Justifying a notion of approximation is a little more involved in general, and especially in these discrete logical spaces, than it would be expedient for people in a hurry to tangle with right now. I will just say that there are ''naive'' or ''obvious'' notions and there are ''sophisticated'' or ''subtle'' notions that we might choose among. The latter would engage us in trying to construct proper logical analogues of Lie derivatives, and so let's save that for when we have become subtle or sophisticated or both. Against or toward that day, as you wish, let's begin with an option in plain view. | | Justifying a notion of approximation is a little more involved in general, and especially in these discrete logical spaces, than it would be expedient for people in a hurry to tangle with right now. I will just say that there are ''naive'' or ''obvious'' notions and there are ''sophisticated'' or ''subtle'' notions that we might choose among. The latter would engage us in trying to construct proper logical analogues of Lie derivatives, and so let's save that for when we have become subtle or sophisticated or both. Against or toward that day, as you wish, let's begin with an option in plain view. |
| | | |
− | Figure 1.4 illustrates one way of ranging over the cells of the underlying universe <math>U^\bullet = [u, v]\!</math> and selecting at each cell the linear proposition in <math>\mathrm{d}U^\bullet = [du, dv]</math> that best approximates the patch of the difference map <math>\mathrm{D}f</math> that is located there, yielding the following formula for the differential <math>\mathrm{d}f.</math> | + | Figure 1.4 illustrates one way of ranging over the cells of the underlying universe <math>U^\bullet = [u, v]\!</math> and selecting at each cell the linear proposition in <math>\mathrm{d}U^\bullet = [\mathrm{d}u, \mathrm{d}v]\!</math> that best approximates the patch of the difference map <math>{\mathrm{D}f}\!</math> that is located there, yielding the following formula for the differential <math>\mathrm{d}f.\!</math> |
| | | |
− | {| align="center" cellpadding="8" width="90%" | + | {| align="center" cellpadding="8" style="text-align:center; width:100%" |
− | | <math>\mathrm{d}f ~=~ \texttt{uv} \cdot \texttt{0} ~+~ \texttt{u(v)} \cdot \texttt{du} ~+~ \texttt{(u)v} \cdot \texttt{dv} ~+~ \texttt{(u)(v)} \cdot \texttt{(du, dv)}</math> | + | | |
| + | <math>\begin{array}{*{11}{c}} |
| + | \mathrm{d}f |
| + | & = & \mathrm{d}\texttt{((} u \texttt{)(} v \texttt{))} |
| + | & = & uv \cdot 0 |
| + | & + & u \texttt{(} v \texttt{)} \cdot \mathrm{d}u |
| + | & + & \texttt{(} u \texttt{)} v \cdot \mathrm{d}v |
| + | & + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| + | \end{array}</math> |
| |} | | |} |
| | | |
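One concrete way to carry out the selection is to take, at each cell, the degree-one part of the polynomial that <math>\mathrm{D}f\!</math> induces in <math>\mathrm{d}u, \mathrm{d}v\!</math> over GF(2). That reading is an assumption adopted here for illustration, but for the present example it recovers the formula above together with the remainder <math>\mathrm{r}f = \mathrm{d}u ~ \mathrm{d}v\!</math> of Figure 1.5:

<pre>
# At each cell (u, v), expand the local map (du, dv) -> Df(u, v, du, dv) as a
# polynomial over GF(2) and keep its degree-one part as df; the rest is rf.

def f(u, v):                 # ((u)(v)) : u or v
    return u | v

def Df(u, v, du, dv):
    return f(u, v) ^ f(u ^ du, v ^ dv)

def local_coefficients(u, v):
    h = {(du, dv): Df(u, v, du, dv) for du in (0, 1) for dv in (0, 1)}
    c_du   = h[(1, 0)] ^ h[(0, 0)]
    c_dv   = h[(0, 1)] ^ h[(0, 0)]
    c_dudv = h[(1, 1)] ^ h[(1, 0)] ^ h[(0, 1)] ^ h[(0, 0)]
    return c_du, c_dv, c_dudv

linear_terms = {(0, 0): "0", (1, 0): "du", (0, 1): "dv", (1, 1): "(du, dv)"}
for u in (0, 1):
    for v in (0, 1):
        c_du, c_dv, c_dudv = local_coefficients(u, v)
        df_part = linear_terms[(c_du, c_dv)]        # linear approximation here
        rf_part = "du dv" if c_dudv else "0"        # second-order remainder
        print(f"cell ({u},{v}):  df = {df_part:9}  rf = {rf_part}")
</pre>

The coefficients are read off in the usual algebraic normal form fashion, and the printout agrees, cell for cell, with the formulas for <math>\mathrm{d}f\!</math> and <math>\mathrm{r}f\!</math> given in the text.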
Line 1,190: |
Line 1,224: |
| |} | | |} |
| | | |
− | Figure 2.4 illustrates one way of ranging over the cells of the underlying universe <math>U^\bullet = [u, v]\!</math> and selecting at each cell the linear proposition in <math>\mathrm{d}U^\bullet = [du, dv]</math> that best approximates the patch of the difference map <math>\mathrm{D}g</math> that is located there, yielding the following formula for the differential <math>\mathrm{d}g.\!</math> | + | Figure 2.4 illustrates one way of ranging over the cells of the underlying universe <math>U^\bullet = [u, v]\!</math> and selecting at each cell the linear proposition in <math>\mathrm{d}U^\bullet = [du, dv]\!</math> that best approximates the patch of the difference map <math>\mathrm{D}g\!</math> that is located there, yielding the following formula for the differential <math>\mathrm{d}g.\!</math> |
| | | |
− | {| align="center" cellpadding="8" width="90%" | + | {| align="center" cellpadding="8" style="text-align:center; width:100%" |
− | | <math>\mathrm{d}g ~=~ \texttt{uv} \cdot \texttt{(du, dv)} ~+~ \texttt{u(v)} \cdot \texttt{(du, dv)} ~+~ \texttt{(u)v} \cdot \texttt{(du, dv)} ~+~ \texttt{(u)(v)} \cdot \texttt{(du, dv)}</math> | + | | |
| + | <math>\begin{array}{*{11}{c}} |
| + | \mathrm{d}g |
| + | & = & \mathrm{d}\texttt{((} u \texttt{,} v \texttt{))} |
| + | & = & uv \cdot \texttt{(} \mathrm{d}u \texttt{,} \mathrm{d}v \texttt{)} |
| + | & + & u \texttt{(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,} \mathrm{d}v \texttt{)} |
| + | & + & \texttt{(} u \texttt{)} v \cdot \texttt{(} \mathrm{d}u \texttt{,} \mathrm{d}v \texttt{)} |
| + | & + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,} \mathrm{d}v \texttt{)} |
| + | \end{array}</math> |
| |} | | |} |
| | | |
Line 1,240: |
Line 1,282: |
| |} | | |} |
| | | |
− | Well, <math>g,\!</math> that was easy, seeing as how <math>\mathrm{D}g</math> is already linear at each locus, <math>\mathrm{d}g = \mathrm{D}g.</math> | + | Well, <math>g,\!</math> that was easy, seeing as how <math>\mathrm{D}g\!</math> is already linear at each locus, <math>\mathrm{d}g = \mathrm{D}g.\!</math> |
| | | |
| ==Analytic Series== | | ==Analytic Series== |
| | | |
− | We have been conducting the differential analysis of the logical transformation <math>F : [u, v] \mapsto [u, v]</math> defined as <math>F : (u, v) \mapsto ( ~\texttt{((u)(v))}~, ~\texttt{((u, v))}~ ),</math> and this means starting with the extended transformation <math>\mathrm{E}F : [u, v, du, dv] \to [u, v, du, dv]</math> and breaking it into an analytic series, <math>\mathrm{E}F = F + \mathrm{d}F + \mathrm{d}^2 F + \ldots,</math> and | + | We have been conducting the differential analysis of the logical transformation <math>F : [u, v] \mapsto [u, v]\!</math> defined as <math>F : (u, v) \mapsto ( ~ \texttt{((} u \texttt{)(} v \texttt{))} ~,~ \texttt{((} u \texttt{,~} v \texttt{))} ~ ),\!</math> and this means starting with the extended transformation <math>\mathrm{E}F : [u, v, \mathrm{d}u, \mathrm{d}v] \to [u, v, \mathrm{d}u, \mathrm{d}v]\!</math> and breaking it into an analytic series, <math>\mathrm{E}F = F + \mathrm{d}F + \mathrm{d}^2 F + \ldots,\!</math> and so on until there is nothing left to analyze any further. |
− | so on until there is nothing left to analyze any further. | |
| | | |
| As a general rule, one proceeds by way of the following stages: | | As a general rule, one proceeds by way of the following stages: |
Line 1,251: |
Line 1,292: |
| {| align="center" cellpadding="8" width="90%" | | {| align="center" cellpadding="8" width="90%" |
| | | | | |
− | <math>\begin{array}{lccccc} | + | <math>\begin{array}{*{6}{l}} |
− | 1. & \mathrm{E}F & = & \mathrm{d}^0 F & + & \mathrm{r}^0 F | + | 1. & \mathrm{E}F |
| + | & = & \mathrm{d}^0 F |
| + | & + & \mathrm{r}^0 F |
| \\ | | \\ |
− | 2. & \mathrm{r}^0 F & = & \mathrm{d}^1 F & + & \mathrm{r}^1 F | + | 2. & \mathrm{r}^0 F |
| + | & = & \mathrm{d}^1 F |
| + | & + & \mathrm{r}^1 F |
| \\ | | \\ |
− | 3. & \mathrm{r}^1 F & = & \mathrm{d}^2 F & + & \mathrm{r}^2 F | + | 3. & \mathrm{r}^1 F |
| + | & = & \mathrm{d}^2 F |
| + | & + & \mathrm{r}^2 F |
| \\ | | \\ |
| 4. & \ldots | | 4. & \ldots |
Line 1,262: |
Line 1,309: |
| |} | | |} |
| | | |
− | In our analysis of the transformation <math>F,\!</math> we carried out Step 1 in the more familiar form <math>\mathrm{E}F = F + \mathrm{D}F,</math> and we have just reached Step 2 in the form <math>\mathrm{D}F = \mathrm{d}F + \mathrm{r}F,</math> where <math>\mathrm{r}F</math> is the residual term that remains for us to examine next. | + | In our analysis of the transformation <math>F,\!</math> we carried out Step 1 in the more familiar form <math>\mathrm{E}F = F + \mathrm{D}F\!</math> and we have just reached Step 2 in the form <math>\mathrm{D}F = \mathrm{d}F + \mathrm{r}F,\!</math> where <math>\mathrm{r}F\!</math> is the residual term that remains for us to examine next. |
| | | |
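For a sketch of how the staged decomposition plays out in this example, the local polynomial of <math>\mathrm{E}F\!</math> in <math>\mathrm{d}u, \mathrm{d}v\!</math> can be sorted by degree at each cell; with only two differential variables the series necessarily stops at the second order, so <math>\mathrm{r}^2 F = 0.\!</math>

<pre>
# Sort each local polynomial of EF in the differential variables (du, dv)
# by degree, for both components of F = (f, g).  With two differential
# variables the decomposition ends at the degree-2 term.

def f(u, v): return u | v             # ((u)(v))
def g(u, v): return 1 - (u ^ v)       # ((u, v))

def degree_parts(h, u, v):
    E = {(du, dv): h(u ^ du, v ^ dv) for du in (0, 1) for dv in (0, 1)}
    c0     = E[(0, 0)]                                  # degree 0 : h itself
    c_du   = E[(1, 0)] ^ c0
    c_dv   = E[(0, 1)] ^ c0
    c_dudv = E[(1, 1)] ^ E[(1, 0)] ^ E[(0, 1)] ^ c0     # degree 2 coefficient
    return c0, (c_du, c_dv), c_dudv

for name, h in (("f", f), ("g", g)):
    for u in (0, 1):
        for v in (0, 1):
            c0, (cdu, cdv), c2 = degree_parts(h, u, v)
            print(f"E{name} at ({u},{v}): "
                  f"d^0 = {c0}, d^1 = {cdu}*du + {cdv}*dv, d^2 = {c2}*du*dv")
</pre>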
− | '''NB.''' I am trying to give a quick overview here, and this forces me to omit many picky details. The picky reader may wish to consult the more detailed presentation of this material at the following locations: | + | '''Note.''' I am trying to give a quick overview here, and this forces me to omit many picky details. The picky reader may wish to consult the more detailed presentation of this material at the following locations: |
| | | |
− | ; [[Differential Logic and Dynamic Systems 2.0|Differential Logic and Dynamic Systems]]
| + | :* [[Differential Logic and Dynamic Systems 2.0|Differential Logic and Dynamic Systems]] |
− | : [[Differential Logic and Dynamic Systems 2.0#The Secant Operator : E|The Secant Operator]] | + | ::* [[Differential Logic and Dynamic Systems 2.0#The Secant Operator : E|The Secant Operator]] |
− | : [[Differential Logic and Dynamic Systems 2.0#Taking Aim at Higher Dimensional Targets|Higher Dimensional Targets]] | + | ::* [[Differential Logic and Dynamic Systems 2.0#Taking Aim at Higher Dimensional Targets|Higher Dimensional Targets]] |
| | | |
| Let's push on with the analysis of the transformation: | | Let's push on with the analysis of the transformation: |
Line 1,275: |
Line 1,322: |
| | | | | |
| <math>\begin{matrix} | | <math>\begin{matrix} |
− | F & : & (u, v) & \mapsto & (f(u, v),~g(u, v)) & = & (~\texttt{((u)(v))}~,~\texttt{((u,~v))}~).\end{matrix}</math> | + | F & : & (u, v) & \mapsto & (f(u, v), g(u, v)) |
| + | & = & ( ~ \texttt{((} u \texttt{)(} v \texttt{))} ~,~ \texttt{((} u \texttt{,~} v \texttt{))} ~) |
| + | \end{matrix}</math> |
| |} | | |} |
| | | |
| For ease of comparison and computation, I will collect the Figures that we need for the remainder of the work together on one page. | | For ease of comparison and computation, I will collect the Figures that we need for the remainder of the work together on one page. |
| | | |
− | ===Computation Summary : <math>f(u, v) = \texttt{((u)(v))}</math>=== | + | ===Computation Summary for Logical Disjunction=== |
| | | |
− | Figure 1.1 shows the expansion of <math>f = \texttt{((u)(v))}</math> over <math>[u, v]\!</math> to produce the expression: | + | Figure 1.1 shows the expansion of <math>f = \texttt{((} u \texttt{)(} v \texttt{))}\!</math> over <math>[u, v]\!</math> to produce the expression: |
| | | |
| {| align="center" cellpadding="8" width="90%" | | {| align="center" cellpadding="8" width="90%" |
− | | <math>\texttt{uv} ~+~ \texttt{u(v)} ~+~ \texttt{(u)v}</math> | + | | |
| + | <math>\begin{matrix} |
| + | uv & + & u \texttt{(} v \texttt{)} & + & \texttt{(} u \texttt{)} v |
| + | \end{matrix}</math> |
| |} | | |} |
| | | |
− | Figure 1.2 shows the expansion of <math>\mathrm{E}f = \texttt{((u + du)(v + dv))}</math> over <math>[u, v]\!</math> to produce the expression: | + | Figure 1.2 shows the expansion of <math>\mathrm{E}f = \texttt{((} u + \mathrm{d}u \texttt{)(} v + \mathrm{d}v \texttt{))}\!</math> over <math>[u, v]\!</math> to produce the expression: |
| | | |
| {| align="center" cellpadding="8" width="90%" | | {| align="center" cellpadding="8" width="90%" |
− | | <math>\texttt{uv} \cdot \texttt{(du~dv)} + \texttt{u(v)} \cdot \texttt{(du (dv))} + \texttt{(u)v} \cdot \texttt{((du) dv)} + \texttt{(u)(v)} \cdot \texttt{((du)(dv))}</math> | + | | |
| + | <math>\begin{matrix} |
| + | uv \cdot \texttt{(} \mathrm{d}u ~ \mathrm{d}v \texttt{)} & + & |
| + | u \texttt{(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{(} \mathrm{d}v \texttt{))} & + & |
| + | \texttt{(} u \texttt{)} v \cdot \texttt{((} \mathrm{d}u \texttt{)} \mathrm{d}v \texttt{)} & + & |
| + | \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{((} \mathrm{d}u \texttt{)(} \mathrm{d}v \texttt{))} |
| + | \end{matrix}</math> |
| |} | | |} |
| | | |
− | <math>\mathrm{E}f</math> tells you what you would have to do, from where you are in the universe <math>[u, v],\!</math> if you want to end up in a place where <math>f\!</math> is true. In this case, where the prevailing proposition <math>f\!</math> is <math>\texttt{((u)(v))},</math> the indication <math>\texttt{uv} \cdot \texttt{(du~dv)}</math> of <math>\mathrm{E}f</math> tells you this: If <math>u\!</math> and <math>v\!</math> are both true where you are, then just don't change both <math>u\!</math> and <math>v\!</math> at once, and you will end up in a place where <math>\texttt{((u)(v))}</math> is true. | + | In general, <math>\mathrm{E}f\!</math> tells you what you would have to do, from wherever you are in the universe <math>[u, v],\!</math> if you want to end up in a place where <math>f\!</math> is true. In this case, where the prevailing proposition <math>f\!</math> is <math>\texttt{((} u \texttt{)(} v \texttt{))},\!</math> the indication <math>uv \cdot \texttt{(} \mathrm{d}u ~ \mathrm{d}v \texttt{)}\!</math> of <math>\mathrm{E}f\!</math> tells you this: If <math>u\!</math> and <math>v\!</math> are both true where you are, then just don't change both <math>u\!</math> and <math>v\!</math> at once, and you will end up in a place where <math>\texttt{((} u \texttt{)(} v \texttt{))}\!</math> is true. |
| | | |
| Figure 1.3 shows the expansion of <math>\mathrm{D}f</math> over <math>[u, v]\!</math> to produce the expression: | | Figure 1.3 shows the expansion of <math>\mathrm{D}f</math> over <math>[u, v]\!</math> to produce the expression: |
| | | |
| {| align="center" cellpadding="8" width="90%" | | {| align="center" cellpadding="8" width="90%" |
− | | <math>\texttt{uv} \cdot \texttt{du~dv} ~+~ \texttt{u(v)} \cdot \texttt{du(dv)} ~+~ \texttt{(u)v} \cdot \texttt{(du)dv} ~+~ \texttt{(u)(v)} \cdot \texttt{((du)(dv))}</math> | + | | |
| + | <math>\begin{matrix} |
| + | uv \cdot \mathrm{d}u ~ \mathrm{d}v & + & |
| + | u \texttt{(} v \texttt{)} \cdot \mathrm{d}u \texttt{(} \mathrm{d}v \texttt{)} & + & |
| + | \texttt{(} u \texttt{)} v \cdot \texttt{(} \mathrm{d}u \texttt{)} \mathrm{d}v & + & |
| + | \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{((} \mathrm{d}u \texttt{)(} \mathrm{d}v \texttt{))} |
| + | \end{matrix}</math> |
| |} | | |} |
| | | |
− | <math>\mathrm{D}f</math> tells you what you would have to do, from where you are in the universe <math>[u, v],\!</math> if you want to bring about a change in the value of <math>f,\!</math> that is, if you want to get to a place where the value of <math>f\!</math> is different from what it is where you are. In the present case, where the reigning proposition <math>f\!</math> is <math>\texttt{((u)(v))},</math> the term <math>\texttt{uv} \cdot \texttt{du~dv}</math> of <math>\mathrm{D}f</math> tells you this: If <math>u\!</math> and <math>v\!</math> are both true where you are, then you would have to change both <math>u\!</math> and <math>v\!</math> in order to reach a place where the value of <math>f\!</math> is different from what it is where you are. | + | In general, <math>{\mathrm{D}f}\!</math> tells you what you would have to do, from wherever you are in the universe <math>[u, v],\!</math> if you want to bring about a change in the value of <math>f,\!</math> that is, if you want to get to a place where the value of <math>f\!</math> is different from what it is where you are. In the present case, where the reigning proposition <math>f\!</math> is <math>\texttt{((} u \texttt{)(} v \texttt{))},\!</math> the term <math>uv \cdot \mathrm{d}u ~ \mathrm{d}v\!</math> of <math>{\mathrm{D}f}\!</math> tells you this: If <math>u\!</math> and <math>v\!</math> are both true where you are, then you would have to change both <math>u\!</math> and <math>v\!</math> in order to reach a place where the value of <math>f\!</math> is different from what it is where you are. |
| | | |
− | Figure 1.4 approximates <math>\mathrm{D}f</math> by the linear form <math>\mathrm{d}f</math> that expands over <math>[u, v]\!</math> as follows: | + | Figure 1.4 approximates <math>{\mathrm{D}f}\!</math> by the linear form <math>\mathrm{d}f\!</math> that expands over <math>[u, v]\!</math> as follows: |
| | | |
| {| align="center" cellpadding="8" width="90%" | | {| align="center" cellpadding="8" width="90%" |
Line 1,310: |
Line 1,374: |
| <math>\begin{matrix} | | <math>\begin{matrix} |
| \mathrm{d}f | | \mathrm{d}f |
− | & = & \texttt{uv} \cdot \texttt{0} & + & \texttt{u(v)} \cdot \texttt{du} & + & \texttt{(u)v} \cdot \texttt{dv} & + & \texttt{(u)(v)} \cdot \texttt{(du, dv)} | + | & = & uv \cdot 0 |
− | \\ \\ | + | & + & u \texttt{(} v \texttt{)} \cdot \mathrm{d}u |
− | & = & & & \texttt{u(v)} \cdot \texttt{du} & + & \texttt{(u)v} \cdot \texttt{dv} & + & \texttt{(u)(v)} \cdot \texttt{(du, dv)} | + | & + & \texttt{(} u \texttt{)} v \cdot \mathrm{d}v |
| + | & + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| + | \\[8pt] |
| + | & = &&& u \texttt{(} v \texttt{)} \cdot \mathrm{d}u |
| + | & + & \texttt{(} u \texttt{)} v \cdot \mathrm{d}v |
| + | & + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| \end{matrix}</math> | | \end{matrix}</math> |
| |} | | |} |
| | | |
− | Figure 1.5 shows what remains of the difference map <math>\mathrm{D}f</math> when the first order linear contribution <math>\mathrm{d}f</math> is removed, namely: | + | Figure 1.5 shows what remains of the difference map <math>{\mathrm{D}f}\!</math> when the first order linear contribution <math>\mathrm{d}f\!</math> is removed, namely: |
| | | |
| {| align="center" cellpadding="8" width="90%" | | {| align="center" cellpadding="8" width="90%" |
| | | | | |
− | <math>\begin{matrix} | + | <math>\begin{array}{*{9}{l}} |
| \mathrm{r}f | | \mathrm{r}f |
− | & = & \texttt{uv} \cdot \texttt{du~dv} & + & \texttt{u(v)} \cdot \texttt{du~dv} & + & \texttt{(u)v} \cdot \texttt{du~dv} & + & \texttt{(u)(v)} \cdot \texttt{du~dv} | + | & = & uv \cdot \mathrm{d}u ~ \mathrm{d}v |
− | \\ \\ | + | & + & u \texttt{(} v \texttt{)} \cdot \mathrm{d}u ~ \mathrm{d}v |
− | & = & \texttt{du~dv} | + | & + & \texttt{(} u \texttt{)} v \cdot \mathrm{d}u ~ \mathrm{d}v |
− | \end{matrix}</math> | + | & + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot \mathrm{d}u ~ \mathrm{d}v |
| + | \\[8pt] |
| + | & = & \mathrm{d}u ~ \mathrm{d}v |
| + | \end{array}</math> |
| |} | | |} |
| | | |
Line 1,371: |
Line 1,443: |
| </pre> | | </pre> |
| |} | | |} |
− |
| |
− | <br>
| |
| | | |
| {| align="center" border="0" cellpadding="10" | | {| align="center" border="0" cellpadding="10" |
Line 1,417: |
Line 1,487: |
| </pre> | | </pre> |
| |} | | |} |
− |
| |
− | <br>
| |
| | | |
| {| align="center" border="0" cellpadding="10" | | {| align="center" border="0" cellpadding="10" |
Line 1,463: |
Line 1,531: |
| </pre> | | </pre> |
| |} | | |} |
− |
| |
− | <br>
| |
| | | |
| {| align="center" border="0" cellpadding="10" | | {| align="center" border="0" cellpadding="10" |
Line 1,509: |
Line 1,575: |
| </pre> | | </pre> |
| |} | | |} |
− |
| |
− | <br>
| |
| | | |
| {| align="center" border="0" cellpadding="10" | | {| align="center" border="0" cellpadding="10" |
Line 1,556: |
Line 1,620: |
| |} | | |} |
| | | |
− | ===Computation Summary : <math>g(u, v) = \texttt{((u, v))}</math>=== | + | ===Computation Summary for Logical Equality=== |
| | | |
− | Figure 2.1 shows the expansion of <math>g = \texttt{((u, v))}</math> over <math>[u, v]\!</math> to produce the expression: | + | Figure 2.1 shows the expansion of <math>g = \texttt{((} u \texttt{,~} v \texttt{))}\!</math> over <math>[u, v]\!</math> to produce the expression: |
| | | |
| {| align="center" cellpadding="8" width="90%" | | {| align="center" cellpadding="8" width="90%" |
− | | <math>\texttt{uv} ~+~ \texttt{(u)(v)}</math> | + | | |
| + | <math>\begin{matrix} |
| + | uv & + & \texttt{(} u \texttt{)(} v \texttt{)} |
| + | \end{matrix}</math> |
| |} | | |} |
| | | |
− | Figure 2.2 shows the expansion of <math>\mathrm{E}g = \texttt{((u + du, v + dv))}</math> over <math>[u, v]\!</math> to produce the expression: | + | Figure 2.2 shows the expansion of <math>\mathrm{E}g = \texttt{((} u + \mathrm{d}u \texttt{,~} v + \mathrm{d}v \texttt{))}\!</math> over <math>[u, v]\!</math> to produce the expression: |
| | | |
− | {| align="center" cellpadding="8" width="90%" | + | {| align="center" cellpadding="8" width="90%" |
− | | <math>\texttt{uv} \cdot \texttt{((du, dv))} + \texttt{u(v)} \cdot \texttt{(du, dv)} + \texttt{(u)v} \cdot \texttt{(du, dv)} + \texttt{(u)(v)} \cdot \texttt{((du, dv))}</math> | + | | |
| + | <math>\begin{matrix} |
| + | uv \cdot \texttt{((} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{))} & + & |
| + | u \texttt{(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} & + & |
| + | \texttt{(} u \texttt{)} v \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} & + & |
| + | \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{((} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{))} |
| + | \end{matrix}</math> |
| |} | | |} |
| | | |
− | <math>\mathrm{E}g</math> tells you what you would have to do, from where you are in the universe <math>[u, v],\!</math> if you want to end up in a place where <math>g\!</math> is true. In this case, where the prevailing proposition <math>g\!</math> is <math>\texttt{((u, v))},</math> the component <math>\texttt{uv} \cdot \texttt{((du, dv))}</math> of <math>\mathrm{E}g</math> tells you this: If <math>u\!</math> and <math>v\!</math> are both true where you are, then change either both or neither of <math>u\!</math> and <math>v\!</math> at the same time, and you will attain a place where <math>\texttt{((du, dv))}</math> is true. | + | In general, <math>\mathrm{E}g\!</math> tells you what you would have to do, from wherever you are in the universe <math>[u, v],\!</math> if you want to end up in a place where <math>g\!</math> is true. In this case, where the prevailing proposition <math>g\!</math> is <math>\texttt{((} u \texttt{,~} v \texttt{))},\!</math> the component <math>uv \cdot \texttt{((} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{))}\!</math> of <math>\mathrm{E}g\!</math> tells you this: If <math>u\!</math> and <math>v\!</math> are both true where you are, then change either both or neither of <math>u\!</math> and <math>v\!</math> at the same time, and you will attain a place where <math>\texttt{((} u \texttt{,~} v \texttt{))}\!</math> is true. |
| | | |
− | Figure 2.3 shows the expansion of <math>\mathrm{D}g</math> over <math>[u, v]\!</math> to produce the expression: | + | Figure 2.3 shows the expansion of <math>\mathrm{D}g\!</math> over <math>[u, v]\!</math> to produce the expression: |
| | | |
− | {| align="center" cellpadding="8" width="90%" | + | {| align="center" cellpadding="8" width="90%" |
− | | <math>\texttt{uv} \cdot \texttt{(du, dv)} ~+~ \texttt{u(v)} \cdot \texttt{(du, dv)} ~+~ \texttt{(u)v} \cdot \texttt{(du, dv)} ~+~ \texttt{(u)(v)} \cdot \texttt{(du, dv)}</math> | + | | |
| + | <math>\begin{matrix} |
| + | uv \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} & + & |
| + | u \texttt{(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} & + & |
| + | \texttt{(} u \texttt{)} v \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} & + & |
| + | \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| + | \end{matrix}</math> |
| |} | | |} |
| | | |
− | <math>\mathrm{D}g</math> tells you what you would have to do, from where you are in the universe <math>[u, v],\!</math> if you want to bring about a change in the value of <math>g,\!</math> that is, if you want to get to a place where the value of <math>g\!</math> is different from what it is where you are. In the present case, where the ruling proposition <math>g\!</math> is <math>\texttt{((u, v))},</math> the term <math>\texttt{uv} \cdot \texttt{(du, dv)}</math> of <math>\mathrm{D}g</math> tells you this: If <math>u\!</math> and <math>v\!</math> are both true where you are, then you would have to change one or the other but not both <math>u\!</math> and <math>v\!</math> in order to reach a place where the value of <math>g\!</math> is different from what it is where you are. | + | In general, <math>\mathrm{D}g\!</math> tells you what you would have to do, from wherever you are in the universe <math>[u, v],\!</math> if you want to bring about a change in the value of <math>g,\!</math> that is, if you want to get to a place where the value of <math>g\!</math> is different from what it is where you are. In the present case, where the ruling proposition <math>g\!</math> is <math>\texttt{((} u \texttt{,~} v \texttt{))},\!</math> the term <math>uv \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)}\!</math> of <math>\mathrm{D}g\!</math> tells you this: If <math>u\!</math> and <math>v\!</math> are both true where you are, then you would have to change one or the other but not both <math>u\!</math> and <math>v\!</math> in order to reach a place where the value of <math>g\!</math> is different from what it is where you are. |
| | | |
− | Figure 2.4 approximates <math>\mathrm{D}g</math> by the linear form <math>\mathrm{d}g</math> that expands over <math>[u, v]\!</math> as follows: | + | Figure 2.4 approximates <math>\mathrm{D}g\!</math> by the linear form <math>{\mathrm{d}g}\!</math> that expands over <math>[u, v]\!</math> as follows: |
| | | |
| {| align="center" cellpadding="8" width="90%" | | {| align="center" cellpadding="8" width="90%" |
| | | | | |
− | <math>\begin{array}{lll} | + | <math>\begin{array}{*{9}{l}} |
| \mathrm{d}g | | \mathrm{d}g |
− | & = & \texttt{uv}\!\cdot\!\texttt{(du, dv)} + \texttt{u(v)}\!\cdot\!\texttt{(du, dv)} + \texttt{(u)v}\!\cdot\!\texttt{(du, dv)} + \texttt{(u)(v)}\!\cdot\!\texttt{(du, dv)} | + | & = & uv \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
− | \\ \\ | + | & + & u \texttt{(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
− | & = & \texttt{(du, dv)} | + | & + & \texttt{(} u \texttt{)} v \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| + | & + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| + | \\[8pt] |
| + | & = & \texttt{(} \mathrm{d}u \texttt{,~} \mathrm{d}v \texttt{)} |
| \end{array}</math> | | \end{array}</math> |
| |} | | |} |
| | | |
− | Figure 2.5 shows what remains of the difference map <math>\mathrm{D}g</math> when the first order linear contribution <math>\mathrm{d}g</math> is removed, namely: | + | Figure 2.5 shows what remains of the difference map <math>\mathrm{D}g\!</math> when the first order linear contribution <math>{\mathrm{d}g}\!</math> is removed, namely: |
| | | |
| {| align="center" cellpadding="8" width="90%" | | {| align="center" cellpadding="8" width="90%" |
| | | | | |
− | <math>\begin{matrix} | + | <math>\begin{array}{*{9}{l}} |
| \mathrm{r}g | | \mathrm{r}g |
− | & = & \texttt{uv} \cdot \texttt{0} & + & \texttt{u(v)} \cdot \texttt{0} & + & \texttt{(u)v} \cdot \texttt{0} & + & \texttt{(u)(v)} \cdot \texttt{0} | + | & = & uv \cdot 0 |
− | \\ \\ | + | & + & u \texttt{(} v \texttt{)} \cdot 0 |
− | & = & \texttt{0} | + | & + & \texttt{(} u \texttt{)} v \cdot 0 |
− | \end{matrix}</math> | + | & + & \texttt{(} u \texttt{)(} v \texttt{)} \cdot 0 |
| + | \\[8pt] |
| + | & = & 0 |
| + | \end{array}</math> |
| |} | | |} |
| | | |
Line 1,647: |
Line 1,732: |
| </pre> | | </pre> |
| |} | | |} |
− |
| |
− | <br>
| |
| | | |
| {| align="center" border="0" cellpadding="10" | | {| align="center" border="0" cellpadding="10" |
Line 1,693: |
Line 1,776: |
| </pre> | | </pre> |
| |} | | |} |
− |
| |
− | <br>
| |
| | | |
| {| align="center" border="0" cellpadding="10" | | {| align="center" border="0" cellpadding="10" |
Line 1,739: |
Line 1,820: |
| </pre> | | </pre> |
| |} | | |} |
− |
| |
− | <br>
| |
| | | |
| {| align="center" border="0" cellpadding="10" | | {| align="center" border="0" cellpadding="10" |
Line 1,785: |
Line 1,864: |
| </pre> | | </pre> |
| |} | | |} |
− |
| |
− | <br>
| |
| | | |
| {| align="center" border="0" cellpadding="10" | | {| align="center" border="0" cellpadding="10" |
Line 1,843: |
Line 1,920: |
| ==Visualization== | | ==Visualization== |
| | | |
− | In my work on [[Differential Logic and Dynamic Systems 2.0|Differential Logic and Dynamic Systems]], I found it useful to develop several different ways of visualizing logical transformations, indeed, I devised four distinct styles of picture for the job. Thus far in our work on the mapping <math>F : [u, v] \to [u, v],\!</math> we've been making use of what I call the ''areal view'' of the extended universe of discourse, <math>[u, v, du, dv],\!</math> but as the number of dimensions climbs beyond four, it's time to bid this genre adieu, and look for a style that can scale a little better. At any rate, before we proceed any further, let's first assemble the information that we have gathered about <math>F\!</math> from several different angles, and see if it can be fitted into a coherent picture of the transformation <math>F : (u, v) \mapsto ( ~\texttt{((u)(v))}~, ~\texttt{((u, v))}~ ).</math> | + | In my work on [[Differential Logic and Dynamic Systems 2.0|Differential Logic and Dynamic Systems]], I found it useful to develop several different ways of visualizing logical transformations, indeed, I devised four distinct styles of picture for the job. Thus far in our work on the mapping <math>F : [u, v] \to [u, v],\!</math> we've been making use of what I call the ''areal view'' of the extended universe of discourse, <math>[u, v, \mathrm{d}u, \mathrm{d}v],\!</math> but as the number of dimensions climbs beyond four, it's time to bid this genre adieu and look for a style that can scale a little better. At any rate, before we proceed any further, let's first assemble the information that we have gathered about <math>F\!</math> from several different angles, and see if it can be fitted into a coherent picture of the transformation <math>F : (u, v) \mapsto ( ~ \texttt{((} u \texttt{)(} v \texttt{))} ~,~ \texttt{((} u \texttt{,~} v \texttt{))} ~ ).\!</math> |
| | | |
| In our first crack at the transformation <math>F,\!</math> we simply plotted the state transitions and applied the utterly stock technique of calculating the finite differences. | | In our first crack at the transformation <math>F,\!</math> we simply plotted the state transitions and applied the utterly stock technique of calculating the finite differences. |
Line 1,852: |
Line 1,929: |
| | | | | |
| <math>\begin{array}{c|cc|cc|} | | <math>\begin{array}{c|cc|cc|} |
− | t & u & v & du & dv \\[8pt] | + | t & u & v & \mathrm{d}u & \mathrm{d}v \\[8pt] |
− | 0 & 1 & 1 & 0 & 0 \\ | + | 0 & 1 & 1 & 0 & 0 \\ |
− | 1 & '' & '' & '' & '' \\ | + | 1 & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel |
| \end{array}</math> | | \end{array}</math> |
| |} | | |} |
| | | |
− | A quick inspection of the first Table suggests a rule to cover the case when <math>\texttt{u~=~v~=~1},</math> namely, <math>\texttt{du~=~dv~=~0}.</math> To put it another way, the Table characterizes Orbit 1 by means of the data: <math>(u, v, du, dv) = (1, 1, 0, 0).\!</math> Another way to convey the same information is by means of the extended proposition: <math>\texttt{u~v~(du)(dv)}.</math> | + | A quick inspection of the first Table suggests a rule to cover the case when <math>u = v = 1,\!</math> namely, <math>\mathrm{d}u = \mathrm{d}v = 0.\!</math> To put it another way, the Table characterizes Orbit 1 by means of the data: <math>(u, v, \mathrm{d}u, \mathrm{d}v) = (1, 1, 0, 0).\!</math> Another way to convey the same information is by means of the extended proposition: <math>u v \texttt{(} \mathrm{d}u \texttt{)(} \mathrm{d}v \texttt{)}.\!</math> |
| | | |
| {| align="center" cellpadding="8" style="text-align:center" | | {| align="center" cellpadding="8" style="text-align:center" |
Line 1,865: |
Line 1,942: |
| | | | | |
| <math>\begin{array}{c|cc|cc|cc|} | | <math>\begin{array}{c|cc|cc|cc|} |
− | t & u & v & du & dv & d^2 u & d^2 v \\[8pt] | + | t & u & v & \mathrm{d}u & \mathrm{d}v & \mathrm{d}^2 u & \mathrm{d}^2 v \\[8pt] |
− | 0 & 0 & 0 & 0 & 1 & 1 & 0 \\ | + | 0 & 0 & 0 & 0 & 1 & 1 & 0 \\ |
− | 1 & 0 & 1 & 1 & 1 & 1 & 1 \\ | + | 1 & 0 & 1 & 1 & 1 & 1 & 1 \\ |
− | 2 & 1 & 0 & 0 & 0 & 0 & 0 \\ | + | 2 & 1 & 0 & 0 & 0 & 0 & 0 \\ |
− | 3 & '' & '' & '' & '' & '' & '' \\ | + | 3 & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel & {}^\shortparallel |
| \end{array}</math> | | \end{array}</math> |
| |} | | |} |
| | | |
− | A more fine combing of the second Table brings to mind a rule that partly covers the remaining cases, that is, <math>\texttt{du~=~v}, ~\texttt{dv~=~(u)}.</math> This much information about Orbit 2 is also encapsulated by the extended proposition, <math>\texttt{(uv)((du, v))(dv, u)},</math> which says that <math>u\!</math> and <math>v\!</math> are not both true at the same time, while <math>du\!</math> is equal in value to <math>v\!</math> and <math>dv\!</math> is opposite in value to <math>u.\!</math> | + | A more fine combing of the second Table brings to mind a rule that partly covers the remaining cases, that is, <math>\mathrm{d}u = v, ~ \mathrm{d}v = \texttt{(} u \texttt{)}.\!</math> This much information about Orbit 2 is also encapsulated by the extended proposition <math>\texttt{(} uv \texttt{)((} \mathrm{d}u \texttt{,} v \texttt{))(} \mathrm{d}v, u \texttt{)},\!</math> which says that <math>u\!</math> and <math>v\!</math> are not both true at the same time, while <math>\mathrm{d}u\!</math> is equal in value to <math>v\!</math> and <math>\mathrm{d}v\!</math> is opposite in value to <math>u.\!</math> |
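Both orbit Tables, and the two rules just stated, can be reproduced by iterating <math>F\!</math> directly and taking finite differences along the way. A small sketch in Python:

<pre>
# Iterate F : (u, v) -> ( ((u)(v)), ((u, v)) ) = (u or v, u equals v)
# and tabulate the finite differences along each orbit.

def F(u, v):
    return u | v, 1 - (u ^ v)

def orbit(u, v, steps=4):
    states = [(u, v)]
    for _ in range(steps):
        states.append(F(*states[-1]))
    return states

for start in ((1, 1), (0, 0)):
    states = orbit(*start)
    d1 = [(a[0] ^ b[0], a[1] ^ b[1]) for a, b in zip(states[1:], states)]  # (du, dv)
    d2 = [(a[0] ^ b[0], a[1] ^ b[1]) for a, b in zip(d1[1:], d1)]          # (d2u, d2v)
    print(f"orbit from {start}:")
    for t in range(3):
        u, v = states[t]
        du, dv = d1[t]
        ddu, ddv = d2[t]
        print(f"  t={t}  u={u} v={v}  du={du} dv={dv}  d2u={ddu} d2v={ddv}")
        if (u, v) != (1, 1):
            assert (du, dv) == (v, 1 - u)   # the rule du = v, dv = (u) off the fixed point
</pre>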
| | | |
| ==Turing Machine Example== | | ==Turing Machine Example== |
| + | |
| + | <font size="3">☞</font> See [[Theme One Program]] for documentation of the cactus graph syntax and the propositional modeling program used below. |
| | | |
| By way of providing a simple illustration of Cook's Theorem, namely, that “Propositional Satisfiability is NP-Complete”, I will describe one way to translate finite approximations of turing machines into propositional expressions, using the cactus language syntax for propositional calculus that I will describe in more detail as we proceed. | | By way of providing a simple illustration of Cook's Theorem, namely, that “Propositional Satisfiability is NP-Complete”, I will describe one way to translate finite approximations of turing machines into propositional expressions, using the cactus language syntax for propositional calculus that I will describe in more detail as we proceed. |
Line 1,891: |
Line 1,970: |
| A turing machine for computing the parity of a bit string is described by means of the following Figure and Table. | | A turing machine for computing the parity of a bit string is described by means of the following Figure and Table. |
| | | |
− | <br>
| + | {| align="center" border="0" cellspacing="10" style="text-align:center; width:100%" |
− | | + | | [[Image:Parity_Machine.jpg|400px]] |
− | {| align="center" border="0" cellpadding="10" | + | |- |
− | |
| + | | height="20px" valign="top" | <math>\text{Figure 3.} ~~ \text{Parity Machine}\!</math> |
− | <pre>
| |
− | ''[ASCII state diagram: states 0 and 1 with arcs labeled read/write/move (0/0/+1, 1/1/+1, #/#/-1), leading to the rest states # and *]'' | |
− | Figure 21-a. Parity Machine | |
− | </pre> | |
| |} | | |} |
| | | |
Line 1,919: |
Line 1,981: |
| | | | | |
| <pre> | | <pre> |
− | Table 21-b. Parity Machine | + | Table 4. Parity Machine |
| o-------o--------o-------------o---------o------------o | | o-------o--------o-------------o---------o------------o |
| | State | Symbol | Next Symbol | Ratchet | Next State | | | | State | Symbol | Next Symbol | Ratchet | Next State | |
Line 2,817: |
Line 2,879: |
| | | |
| Since the output of <math>\mathrm{Stunt}(2)</math> is the symbol that rests under the tape head <math>\mathrm{H}</math> when and if the machine <math>\mathrm{M}</math> reaches one of its resting states, we get the result that <math>\mathrm{Parity}(1) = 1.</math> | | Since the output of <math>\mathrm{Stunt}(2)</math> is the symbol that rests under the tape head <math>\mathrm{H}</math> when and if the machine <math>\mathrm{M}</math> reaches one of its resting states, we get the result that <math>\mathrm{Parity}(1) = 1.</math> |
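For readers who wish to replay the computation in ordinary programming terms, here is a small simulator (a sketch added for illustration, not part of the original presentation). Its transition table is reconstructed from the arc labels of the Parity Machine figure, and the tape layout — the input bits flanked by <math>\texttt{\#}\!</math> markers, with the head starting on the first input bit — is an assumption consistent with the discussion of <math>\mathrm{Stunt}(2)\!</math> above. Running it on the single-bit input <math>\texttt{1}\!</math> reproduces the result <math>\mathrm{Parity}(1) = 1.\!</math>

<pre>
# A small simulator for the parity machine of Figure 3 and Table 4.
# The rows below are reconstructed from the figure's arc labels
# (Symbol / Next Symbol / Ratchet); the tape layout is an assumption,
# as noted in the text above.

# (state, symbol) -> (next symbol, ratchet, next state)
TABLE = {
    ('0', '0'): ('0', +1, '0'),
    ('0', '1'): ('1', +1, '1'),
    ('0', '#'): ('#', -1, '#'),   # '#' is a resting state
    ('1', '0'): ('0', +1, '1'),
    ('1', '1'): ('1', +1, '0'),
    ('1', '#'): ('#', -1, '*'),   # '*' is a resting state
}
RESTING = {'#', '*'}

def run(bits, max_steps=100):
    tape = ['#'] + list(bits) + ['#']
    head, state = 1, '0'              # head starts on the first input bit
    for _ in range(max_steps):
        if state in RESTING:
            return tape[head]         # output: the symbol under the head at rest
        symbol = tape[head]
        tape[head], move, state = TABLE[(state, symbol)]
        head += move
    raise RuntimeError("step bound exceeded")

print(run('1'))   # prints 1, in agreement with Parity(1) = 1
</pre>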
− |
| |
− | ==Work Area==
| |
− |
| |
− | <pre>
| |
− | DATA 20. http://forum.wolframscience.com/showthread.php?postid=791#post791
| |
− |
| |
− | Let's see how this information about the transformation F,
| |
− | arrived at by eyeballing the raw data, comports with what
| |
− | we derived through a more systematic symbolic computation.
| |
− |
| |
− | The results of the various operator actions that we have just
| |
− | computed are summarized in Tables 66-i and 66-ii from my paper,
| |
− | and I have attached these as a text file below.
| |
− |
| |
− | Table 66-i. Computation Summary for f<u, v> = ((u)(v))
| |
− | o--------------------------------------------------------------------------------o
| |
− | | |
| |
− | | !e!f = uv. 1 + u(v). 1 + (u)v. 1 + (u)(v). 0 |
| |
− | | |
| |
− | | Ef = uv. (du dv) + u(v). (du (dv)) + (u)v.((du) dv) + (u)(v).((du)(dv)) |
| |
− | | |
| |
− | | Df = uv. du dv + u(v). du (dv) + (u)v. (du) dv + (u)(v).((du)(dv)) |
| |
− | | |
| |
− | | df = uv. 0 + u(v). du + (u)v. dv + (u)(v). (du, dv) |
| |
− | | |
| |
− | | rf = uv. du dv + u(v). du dv + (u)v. du dv + (u)(v). du dv |
| |
− | | |
| |
− | o--------------------------------------------------------------------------------o
| |
− |
| |
− | Table 66-ii. Computation Summary for g<u, v> = ((u, v))
| |
− | o--------------------------------------------------------------------------------o
| |
− | | |
| |
− | | !e!g = uv. 1 + u(v). 0 + (u)v. 0 + (u)(v). 1 |
| |
− | | |
| |
− | | Eg = uv.((du, dv)) + u(v). (du, dv) + (u)v. (du, dv) + (u)(v).((du, dv)) |
| |
− | | |
| |
− | | Dg = uv. (du, dv) + u(v). (du, dv) + (u)v. (du, dv) + (u)(v). (du, dv) |
| |
− | | |
| |
− | | dg = uv. (du, dv) + u(v). (du, dv) + (u)v. (du, dv) + (u)(v). (du, dv) |
| |
− | | |
| |
− | | rg = uv. 0 + u(v). 0 + (u)v. 0 + (u)(v). 0 |
| |
− | | |
| |
− | o--------------------------------------------------------------------------------o
| |
− |
| |
− |
| |
− | o---------------------------------------o
| |
− | | |
| |
− | | o |
| |
− | | / \ |
| |
− | | / \ |
| |
− | | / \ |
| |
− | | o o |
| |
− | | / \ / \ |
| |
− | | / \ / \ |
| |
− | | / \ / \ |
| |
− | | o o o |
| |
− | | / \ / \ / \ |
| |
− | | / \ / \ / \ |
| |
− | | / \ / \ / \ |
| |
− | | o o o o |
| |
− | | / \ / \ / \ / \ |
| |
− | | / \ / \ / \ / \ |
| |
− | | / \ / \ / \ / \ |
| |
− | | o o o o o |
| |
− | | |\ / \ / \ / \ /| |
| |
− | | | \ / \ / \ / \ / | |
| |
− | | | \ / \ / \ / \ / | |
| |
− | | | o o o o | |
| |
− | | | |\ / \ / \ /| | |
| |
− | | | | \ / \ / \ / | | |
| |
− | | | u | \ / \ / \ / | v | |
| |
− | | o---+---o o o---+---o |
| |
− | | | \ / \ / | |
| |
− | | | \ / \ / | |
| |
− | | | du \ / \ / dv | |
| |
− | | o-------o o-------o |
| |
− | | \ / |
| |
− | | \ / |
| |
− | | \ / |
| |
− | | o |
| |
− | | |
| |
− | o---------------------------------------o
| |
− | </pre>
| |
− |
| |
− | ==Discussion==
| |
− |
| |
− | <pre>
| |
− | PD = Philip Dutton
| |
− |
| |
− | PD: I've been watching your posts.
| |
− |
| |
− | PD: I am not an expert on logic infrastructures but I find the posts
| |
− | interesting (despite not understanding much of it). I am like
| |
− | the diagrams. I have recently been trying to understand CA's
| |
− | using a particular perspective: sinks and sources. I think
| |
− | that all CA's are simply combinations of sinks and sources.
| |
− | How they interact (or intrude into each other's domains)
| |
− | would most likely be a result of the rules (and initial
| |
− | configuration of on or off cells).
| |
− |
| |
− | PD: Anyway, to be short, I "see" diamond shapes quite often in
| |
− | your diagrams. Triangles (either up or down) or diamonds
| |
− | (combination of an up and down triangle) make me think
| |
− | soley of sinks and sources. I think of the diamond to
| |
− | be a source which, during the course of progression,
| |
− | is expanding (because it is producing) and then starts
| |
− | to act as a sink (because it consumes) -- and hence the
| |
− | diamond. I can't stop thinking about sinks and sources in
| |
− | CA's and so I thought I would ask you if there is some way
| |
− | to tie the two worlds together (CA's of sinks and sources
| |
− | together with your differential constructs).
| |
− |
| |
− | PD: Any thoughts?
| |
− |
| |
− | Yes, I'm hoping that there's a lot of stuff analogous to
| |
− | R-world dynamics to be discovered in this B-world variety,
| |
− | indeed, that's kind of why I set out on this investigation --
| |
− | oh, gee, has it been that long? -- I guess about 1989 or so,
| |
− | when I started to see this "differential logic" angle on what
| |
− | I had previously studied in systems theory as the "qualitative
| |
− | approach to differential equations". I think we used to use the
| |
− | words "attractor" and "basin" more often than "sink", but a source
| |
− | is still a source as time goes by, and I do remember using the word
| |
− | "sink" a lot when I was a freshperson in physics, before I got logic.
| |
− |
| |
− | I have spent the last 15 years doing a funny mix of practice in stats
| |
− | and theory in math, but I did read early works by Von Neumann, Burks,
| |
− | Ulam, and later stuff by Holland on CA's. Still, it may be a while
| |
− | before I have re-heated my concrete intuitions about them in the
| |
− | NKS way of thinking.
| |
− |
| |
− | There are some fractal-looking pictures that emerge when
| |
− | I turn to "higher order propositional expressions" (HOPE's).
| |
− | I have discussed this topic elswhere on the web and can look
| |
− | it up now if your are interested, but I am trying to make my
| |
− | e-positions somewhat clearer for the NKS forum than I have
| |
− | tried to do before.
| |
− |
| |
− | But do not hestitate to dialogue all this stuff on the boards,
| |
− | as that's what always seems to work the best. What I've found
| |
− | works best for me, as I can hardly remember what I was writing
| |
− | last month without Google, is to archive a copy at one of the
| |
− | other Google-visible discussion lists that I'm on at present.
| |
− | </pre>
| |
| | | |
| ==Document History== | | ==Document History== |
Line 2,992: |
Line 2,910: |
| * http://forum.wolframscience.com/archive/topic/228-1.html | | * http://forum.wolframscience.com/archive/topic/228-1.html |
| * http://forum.wolframscience.com/showthread.php?threadid=228 | | * http://forum.wolframscience.com/showthread.php?threadid=228 |
− | * http://forum.wolframscience.com/printthread.php?threadid=228&perpage=33 | + | * http://forum.wolframscience.com/printthread.php?threadid=228&perpage=50 |
| # http://forum.wolframscience.com/showthread.php?postid=664#post664 | | # http://forum.wolframscience.com/showthread.php?postid=664#post664 |
| # http://forum.wolframscience.com/showthread.php?postid=666#post666 | | # http://forum.wolframscience.com/showthread.php?postid=666#post666 |