To see how a regular representation is constructed from the abstract operation table, select a group element from the top margin of the Table, and &ldquo;consider its effects&rdquo; on each of the group elements as they are listed along the left margin.  We may record these effects as Peirce usually did, as a ''logical aggregate'' of elementary dyadic relatives, that is, as a logical disjunction or boolean sum whose terms represent the ordered pairs of <math>\mathrm{input} : \mathrm{output}\!</math> transactions that are produced by each group element in turn.  This forms one of the two possible ''regular representations'' of the group, in this case the one that is called the ''post-regular representation'' or the ''right regular representation''.  It has long been conventional to organize the terms of this logical aggregate in the form of a matrix:

Reading &ldquo;<math>+\!</math>&rdquo; as a logical disjunction:

{| align="center" cellpadding="6" width="90%"
| align="center" |
<math>\begin{matrix}
\mathrm{G}
& = & \mathrm{e}
& + & \mathrm{f}
& + & \mathrm{g}
& + & \mathrm{h}
\end{matrix}\!</math>
|}

And so, by expanding effects, we get:

{| align="center" cellpadding="6" width="90%"
| align="center" |
<math>\begin{matrix}
\mathrm{G}
& = & \mathrm{e}:\mathrm{e}
& + & \mathrm{f}:\mathrm{f}
& + & \mathrm{g}:\mathrm{g}
& + & \mathrm{h}:\mathrm{h}
\\[4pt]
& + & \mathrm{e}:\mathrm{f}
& + & \mathrm{f}:\mathrm{e}
& + & \mathrm{g}:\mathrm{h}
& + & \mathrm{h}:\mathrm{g}
\\[4pt]
& + & \mathrm{e}:\mathrm{g}
& + & \mathrm{f}:\mathrm{h}
& + & \mathrm{g}:\mathrm{e}
& + & \mathrm{h}:\mathrm{f}
\\[4pt]
& + & \mathrm{e}:\mathrm{h}
& + & \mathrm{f}:\mathrm{g}
& + & \mathrm{g}:\mathrm{f}
& + & \mathrm{h}:\mathrm{e}
\end{matrix}\!</math>
|}
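
Here is a quick computational sketch of the same construction (mine, not Peirce's, written in Python).  It assumes, as the expansion above indicates, that the group in question is the Klein four-group, and it builds the right regular representation by listing, for each group element <math>a,\!</math> the <math>\mathrm{input} : \mathrm{output}\!</math> pairs of the transformation <math>x \mapsto x \cdot a.\!</math>  The names <code>mult</code> and <code>right_regular</code> are my own.

<pre>
# Right regular representation sketch, assuming the Klein four-group
# V_4 = {e, f, g, h} with f*f = g*g = h*h = e, f*g = h, g*h = f, h*f = g.
mult = {
    ('e','e'):'e', ('e','f'):'f', ('e','g'):'g', ('e','h'):'h',
    ('f','e'):'f', ('f','f'):'e', ('f','g'):'h', ('f','h'):'g',
    ('g','e'):'g', ('g','f'):'h', ('g','g'):'e', ('g','h'):'f',
    ('h','e'):'h', ('h','f'):'g', ('h','g'):'f', ('h','h'):'e',
}

elements = ['e', 'f', 'g', 'h']

def right_regular(a):
    """Effect of a on the group: the aggregate of pairs x:(x*a)."""
    return [(x, mult[(x, a)]) for x in elements]

for a in elements:
    terms = ' + '.join(f'{x}:{y}' for x, y in right_regular(a))
    print(f'{a}  ~>  {terms}')
</pre>

Each printed line reproduces one row of the logical aggregate displayed above.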

More on the pragmatic maxim as a representation principle later.

The above-mentioned fact about the regular representations of a group is universally known as Cayley's Theorem, typically stated in the following form:

{| align="center" cellpadding="6" width="90%"
| Every group is isomorphic to a subgroup of <math>\mathrm{Aut}(X),\!</math> the group of automorphisms of a suitably chosen set <math>X\!</math>.
|}
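
As a small illustration of the theorem (again a sketch of my own, with the four-group realized as pairs of bits under addition mod 2), the following checks that the right regular representation sends each group element to a bijection of the underlying set, that composition of effects mirrors the group operation, and that distinct elements give distinct effects:

<pre>
# Illustrative check of Cayley's Theorem in the small: the map
# a |-> (x |-> x*a) is an injective homomorphism into the group
# of bijections of the underlying set.
names = {'e': (0, 0), 'f': (1, 0), 'g': (0, 1), 'h': (1, 1)}

def mul(a, b):
    """Componentwise addition mod 2, i.e. the Klein four-group."""
    return ((a[0] + b[0]) % 2, (a[1] + b[1]) % 2)

group = list(names.values())

# rho[a] is the right-regular effect x |-> x*a, as a dictionary.
rho = {a: {x: mul(x, a) for x in group} for a in group}

# 1. Each effect permutes the underlying set (it is a bijection).
assert all(sorted(rho[a].values()) == sorted(group) for a in group)

# 2. Effects compose the way the group elements multiply:
#    applying rho[a] and then rho[b] equals rho[a*b].
assert all(
    {x: rho[b][rho[a][x]] for x in group} == rho[mul(a, b)]
    for a in group for b in group
)

# 3. Distinct elements give distinct effects, so the map is injective.
assert len({tuple(sorted(rho[a].items())) for a in group}) == len(group)
</pre>

All three assertions pass, which is just Cayley's Theorem written out for this one example.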

There is a considerable generalization of these regular representations to a broad class of relational algebraic systems in Peirce's earliest papers.  The crux of the whole idea is this:

{| align="center" cellpadding="6" width="90%"
| Contemplate the effects of the symbol whose meaning you wish to investigate as they play out on all the stages of conduct where you can imagine that symbol playing a role.
|}

This idea of contextual definition by way of conduct-transforming operators is basically the same as Jeremy Bentham's notion of ''paraphrasis'', a &ldquo;method of accounting for fictions by explaining various purported terms away&rdquo; (Quine, in Van Heijenoort, ''From Frege to Gödel'', p.&nbsp;216).  Today we'd call these constructions ''term models''.  This, again, is the big idea behind Schönfinkel's combinators <math>\mathrm{S}, \mathrm{K}, \mathrm{I},\!</math> and hence of the lambda calculus, and I reckon you know where that leads.
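
As an aside of my own, the combinators are easy to exhibit directly as higher-order functions, and the classical reduction of <math>\mathrm{I}\!</math> to <math>\mathrm{S}\mathrm{K}\mathrm{K}\!</math> can be checked on a few sample arguments:

<pre>
# Schönfinkel's combinators as curried Python functions.
S = lambda x: lambda y: lambda z: x(z)(y(z))   # S x y z = x z (y z)
K = lambda x: lambda y: x                      # K x y   = x
I = lambda x: x                                # I x     = x

# S K K behaves as the identity combinator I on any argument,
# which is the standard way of defining I away in terms of S and K.
for sample in (42, 'peirce', (1, 2, 3)):
    assert S(K)(K)(sample) == I(sample)
</pre>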

The next few excursions in this series will provide a scenic tour of various ideas in group theory that will turn out to be of constant guidance in several of the settings that are associated with our topic.

Let me return to Peirce's early papers on the algebra of relatives to pick up the conventions that he used there, and then rewrite my account of regular representations in a way that conforms to those.

Peirce describes the action of an &ldquo;elementary dual relative&rdquo; in this way:

{| align="center" cellpadding="6" width="90%"
| Elementary simple relatives are connected together in systems of four.  For if <math>\mathrm{A}\!:\!\mathrm{B}\!</math> be taken to denote the elementary relative which multiplied into <math>\mathrm{B}\!</math> gives <math>\mathrm{A},\!</math> then this relation existing as elementary, we have the four elementary relatives
|-
| align="center" | <math>\mathrm{A}\!:\!\mathrm{A} \qquad \mathrm{A}\!:\!\mathrm{B} \qquad \mathrm{B}\!:\!\mathrm{A} \qquad \mathrm{B}\!:\!\mathrm{B}.\!</math>
|-
| C.S. Peirce, ''Collected Papers'', CP&nbsp;3.123.
|}

Peirce is well aware that it is not at all necessary to arrange the elementary relatives of a relation into arrays, matrices, or tables, but when he does so he tends to prefer organizing 2-adic relations in the following manner:

{| align="center" cellpadding="6" width="90%"
| align="center" |
<math>\begin{bmatrix}
a\!:\!a & a\!:\!b & a\!:\!c
\\
b\!:\!a & b\!:\!b & b\!:\!c
\\
c\!:\!a & c\!:\!b & c\!:\!c
\end{bmatrix}\!</math>
|}
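
The arrangement is easy to generate mechanically.  In the following throwaway sketch of mine, the first correlate indexes the row and the second correlate indexes the column, just as in the array above:

<pre>
# The 3x3 array of elementary relatives over X = {a, b, c}:
# rows are indexed by the first correlate, columns by the second.
X = ['a', 'b', 'c']
array = [[f'{i}:{j}' for j in X] for i in X]
for row in array:
    print('  '.join(row))
# a:a  a:b  a:c
# b:a  b:b  b:c
# c:a  c:b  c:c
</pre>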

For example, given the set <math>X = \{ a, b, c \},\!</math> suppose that we have the 2-adic relative term <math>\mathit{m} = {}^{\backprime\backprime}\, \text{marker for}\, \underline{~ ~ ~}\, {}^{\prime\prime}~\!</math> and the associated 2-adic relation <math>M \subseteq X \times X,\!</math> the general pattern of whose common structure is represented by the following matrix:

{| align="center" cellpadding="6" width="90%"
| align="center" |
<math>
M \quad = \quad
\begin{bmatrix}
M_{aa}(a\!:\!a) & M_{ab}(a\!:\!b) & M_{ac}(a\!:\!c)
\\
M_{ba}(b\!:\!a) & M_{bb}(b\!:\!b) & M_{bc}(b\!:\!c)
\\
M_{ca}(c\!:\!a) & M_{cb}(c\!:\!b) & M_{cc}(c\!:\!c)
\end{bmatrix}
\!</math>
|}

For at least a little while longer, I will keep explicit the distinction between a ''relative term'' like <math>\mathit{m}\!</math> and a ''relation'' like <math>M \subseteq X \times X,\!</math> but it is best to view both these entities as involving different applications of the same information, and so we could just as easily write the following form:

{| align="center" cellpadding="6" width="90%"
| align="center" |
<math>
m \quad = \quad
\begin{bmatrix}
m_{aa}(a\!:\!a) & m_{ab}(a\!:\!b) & m_{ac}(a\!:\!c)
\\
m_{ba}(b\!:\!a) & m_{bb}(b\!:\!b) & m_{bc}(b\!:\!c)
\\
m_{ca}(c\!:\!a) & m_{cb}(c\!:\!b) & m_{cc}(c\!:\!c)
\end{bmatrix}
\!</math>
|}

By way of making up a concrete example, let us say that <math>\mathit{m}\!</math> or <math>M\!</math> is given as follows:

{| align="center" cellpadding="6" width="90%"
| align="center" |
<math>\begin{array}{l}
a ~\text{is a marker for}~ a
\\
a ~\text{is a marker for}~ b
\\
b ~\text{is a marker for}~ b
\\
b ~\text{is a marker for}~ c
\\
c ~\text{is a marker for}~ c
\\
c ~\text{is a marker for}~ a
\end{array}\!</math>
|}

In sum, then, the relative term <math>\mathit{m}\!</math> and the relation <math>M\!</math> are both represented by the following matrix:

{| align="center" cellpadding="6" width="90%"
| align="center" |
<math>\begin{bmatrix}
1 \cdot (a\!:\!a) & 1 \cdot (a\!:\!b) & 0 \cdot (a\!:\!c)
\\
0 \cdot (b\!:\!a) & 1 \cdot (b\!:\!b) & 1 \cdot (b\!:\!c)
\\
1 \cdot (c\!:\!a) & 0 \cdot (c\!:\!b) & 1 \cdot (c\!:\!c)
\end{bmatrix}\!</math>
|}
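
In other words, the coefficient <math>M_{ij}\!</math> is <math>1\!</math> exactly when the pair <math>i\!:\!j\!</math> belongs to the relation.  The following sketch (my own notation, not Peirce's) reads off the matrix from the list of &ldquo;marker&rdquo; statements:

<pre>
# The relation M as a set of ordered (input, output) pairs, read off
# from the "x is a marker for y" statements above.
X = ['a', 'b', 'c']
M = {('a','a'), ('a','b'), ('b','b'), ('b','c'), ('c','c'), ('c','a')}

# Coefficient matrix: M_ij = 1 if (i, j) is in the relation, else 0.
matrix = [[1 if (i, j) in M else 0 for j in X] for i in X]

for row in matrix:
    print(row)
# [1, 1, 0]
# [0, 1, 1]
# [1, 0, 1]
</pre>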

I think this much will serve to fix notation and set up the remainder of the discussion.

In Peirce's time, and even in some circles of mathematics today, the information indicated by the elementary relatives <math>(i\!:\!j),\!</math> as the indices <math>i, j\!</math> range over the universe of discourse, would be referred to as the ''umbral elements'' of the algebraic operation represented by the matrix, though I seem to recall that Peirce preferred to call these terms the &ldquo;ingredients&rdquo;.  When this ordered basis is understood well enough, one will tend to drop any mention of it from the matrix itself, leaving us nothing but these bare bones:

{| align="center" cellpadding="6" width="90%"
| align="center" |
<math>
M \quad = \quad
\begin{bmatrix}
1 & 1 & 0
\\
0 & 1 & 1
\\
1 & 0 & 1
\end{bmatrix}
\!</math>
|}

The various representations of <math>M\!</math> are nothing more than alternative ways of specifying its basic ingredients, namely, the following aggregate of elementary relatives:

{| align="center" cellpadding="6" width="90%"
| align="center" |
<math>\begin{array}{*{13}{c}}
M
& = & a\!:\!a
& + & b\!:\!b
& + & c\!:\!c
& + & a\!:\!b
& + & b\!:\!c
& + & c\!:\!a
\end{array}\!</math>
|}

Recognizing that <math>a\!:\!a + b\!:\!b + c\!:\!c\!</math> is the identity transformation otherwise known as <math>\mathit{1},\!</math> the 2-adic relative term <math>\mathit{m} = {}^{\backprime\backprime}\, \text{marker for}\, \underline{~ ~ ~}\, {}^{\prime\prime}~\!</math> can be parsed as an element <math>\mathit{1} + a\!:\!b + b\!:\!c + c\!:\!a\!</math> of the so-called ''group ring'', all of which makes this element just a special sort of linear transformation.
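
To make that concrete in matrix terms, here is a brief numpy sketch of my own in which <math>\mathit{m}\!</math> appears as the sum of the identity matrix, standing for <math>a\!:\!a + b\!:\!b + c\!:\!c,\!</math> and the 0-1 matrix of the cyclic part <math>a\!:\!b + b\!:\!c + c\!:\!a\!</math>:

<pre>
import numpy as np

# Basis order a, b, c.  The identity part a:a + b:b + c:c and the
# cyclic part a:b + b:c + c:a, as 0/1 matrices over that basis.
identity = np.eye(3, dtype=int)
cycle = np.zeros((3, 3), dtype=int)
for i, j in [(0, 1), (1, 2), (2, 0)]:   # a:b, b:c, c:a
    cycle[i, j] = 1

# The relative term m = 1 + a:b + b:c + c:a as a single matrix.
m = identity + cycle
print(m)
# [[1 1 0]
#  [0 1 1]
#  [1 0 1]]
</pre>

The sum is exactly the bare-bones matrix recorded earlier, now read as a linear transformation on the space spanned by <math>a, b, c.\!</math>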
    
==Logical Cacti==
 