The two transducers

Transformations

One fundamental notion in Ashby’s cybernetics theory, as described in his book “An Introduction to Cybernetics”, is that of a transformation. The term is used to describe any particular behaviour of a system. A transformation consists of a number of transitions (changes) on an ordered list of operands with associated results (the transforms). The elements of the transformation can be anything (even other transformations), and in general there is no explicit requirement for the operands and transforms to be elements of the same domain. In addition, it is important to stress that cybernetics in general is not interested in the underlying physical or other causes of the transformation.

If all the elements from the lower line (the transforms) are also listed as operands on the upper line, the transformation is closed. Closure is of particular importance when a transformation is applied recursively to machine states, because with a non-closed transformation the machine will eventually stop (jam) as soon as a resulting transform is not also an operand.

Fig 1. Examples of transformations in Ashby’s notation

If each of the operands on the top is converted to only one transform on the bottom, the transformation is called single-valued. If, in addition, no two operands are converted to the same transform, the transformation is said to be one-one. In the examples in Fig 1, transformation W is closed (all transforms are also operands) and single-valued (each operand has only one transition), but it is not one-one because two operands (p, s) result in the same transform (q). Transformation R is not closed and can be understood as an “input-output” type of transformation, while for any given operand of U its appropriate transform represents the transformation that will be applied in some (other) coupled system.
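As a concrete illustration (my own encoding, not anything from the book), a single-valued transformation can be stored as a plain mapping from operands to transforms; closure and the one-one property then become one-line checks. The table for W below is reconstructed from the transitions quoted in the text and from the operand/transform vectors given in Fig 2.

```python
# W from Fig 1, encoded as operand -> transform (single-valued by construction)
W = {"p": "q", "q": "r", "r": "s", "s": "q"}

def is_closed(t: dict) -> bool:
    """Closed: every transform also appears among the operands."""
    return set(t.values()) <= set(t.keys())

def is_one_one(t: dict) -> bool:
    """One-one: single-valued and no two operands share a transform."""
    return len(set(t.values())) == len(t)

print(is_closed(W))    # True: W can be applied recursively without jamming
print(is_one_one(W))   # False: both 'p' and 's' are converted to 'q'
```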

Closed, single-valued transformations are interesting because they allow for the recursive application of the transformation on the results of the previous one. In the above examples, W×W on p results in the transform r, or in shorter notation W²(p) = r. Obviously this property also permits the coupling of two or more different transformations. If from the transformations above we define the coupling R×W×W×U, then with input 3 the result is W3, or RW²U(3) = W3. Note that all transformations are applied from left to right (e.g. R(3)=s, W(s)=q, W(q)=r, U(r)=W3) and commutation is not allowed.
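Continuing in the same encoding, repeated and chained application is just left-to-right composition. Only the single transitions of R and U quoted above are filled in, so the tables below are partial stand-ins rather than a copy of Fig 1:

```python
W = {"p": "q", "q": "r", "r": "s", "s": "q"}
R = {3: "s"}         # "input-output" style transformation, not closed
U = {"r": "W3"}      # transforms of U name transformations used by another coupled system

def apply_chain(x, *transformations):
    """Apply the transformations from left to right, e.g. apply_chain(3, R, W, W, U) = RxWxWxU on 3."""
    for t in transformations:
        x = t[x]
    return x

print(apply_chain("p", W, W))      # 'r'  : W²(p) = r
print(apply_chain(3, R, W, W, U))  # 'W3' : RW²U(3) = W3
```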

Below, in Fig 2, is my attempt to depict an “Ashby machine” going through automatic recursive transformations in a block-diagram notation which I believe will be useful in the discussion that follows.

Fig 2. Block diagram of the closed transformation W. The input vector for the transformation W is <p, q, r, s> while the output is <q, r, s>. Block z⁻¹ is a “unit delay” ensuring the output symbol from the previous transformation is applied as the input for the next.
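As a toy illustration of Fig 2 (same dictionary encoding as before), the unit delay simply feeds each output symbol back in as the next operand, so the behaviour of the machine from a given initial operand is just the orbit of that operand under W:

```python
W = {"p": "q", "q": "r", "r": "s", "s": "q"}

def run_machine(t: dict, start, steps: int) -> list:
    """Recursively apply a closed transformation, feeding each transform back as the next operand."""
    x = start
    trajectory = [x]
    for _ in range(steps):
        x = t[x]              # the "unit delay": last output becomes next input
        trajectory.append(x)
    return trajectory

print(run_machine(W, "p", 6))   # ['p', 'q', 'r', 's', 'q', 'r', 's']
```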

Machine State Changes

Ashby describes complex multi-valued transformations in a matrix notation. In the following example the machine has three different “organizational” states (ways of behaving), A, B and C, that are associated with three closed transformations on a common set of operands (a, b, c and d). If the machine is in state A, the transformation WA will transform a→c, but if the machine is in state C, the same machine will apply transformation WC, which will now transform a→d.

Fig 3. A matrix and block representation of a complex multi-valued transformation
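One straightforward way to encode such a multi-valued transformation (again, my own convention) is as a dictionary of closed single-valued transformations indexed by the machine’s organizational state. Only the two transitions quoted above (a→c in state A, a→d in state C) are filled in; the rest would come from the matrix in Fig 3:

```python
# A state-indexed family of transformations over the common operands a, b, c, d.
# Only the transitions quoted in the text are shown here.
W_by_state = {
    "A": {"a": "c"},   # the transformation applied in state A: a -> c, ...
    "C": {"a": "d"},   # the transformation applied in state C: a -> d, ...
}

def transform(state: str, operand: str) -> str:
    """Apply the transformation selected by the machine's current organizational state."""
    return W_by_state[state][operand]

print(transform("A", "a"))  # 'c'
print(transform("C", "a"))  # 'd'
```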

A natural question at this point would be: “What is the mechanism used to change the machine’s behaviour?” To answer this question we have to take a closer look at two descriptions of a transducer, one given by Ashby in Chapter 4 of his book and another by Shannon in Part 1.8 of his seminal paper.

Ashby’s Transducer

Ashby’s definition of a transducer (or machine with input, depending on the context) requires a set of closed single-valued transformations that form the canonical representation of the machine’s “laws” of operation, and a variable, the parameter. According to Ashby, the behaviour of the (controlled) machine can only be selected from the outside, by a controlling machine changing the value of the parameter.

In the following example, taken from p. 49 of Ashby’s book, two “transducers” (P and R) are joined together in a very simple (unidirectional) manner. When coupled together, P controls (dominates) the behaviour of R without receiving any feedback from R. The canonical and block representation of R is given as follows:

Fig 4. Canonical and block representation of a transducer R with parameter

P is defined by a single closed transformation on three states. In order to couple the two transducers, one has to decide which behaviour of R <1, 2, 3> shall be selected by which state of P <i, j, k>. Ashby introduces here a new (coupling) transformation Z which may be “arbitrary and completely under the control of whoever arranges the coupling”. The only requirement is “that the two machines P and R work on a common time-scale, so that their changes keep in step”.

Fig 5. Canonical and block representation of supporting transformations

The complete integrated block diagram of Ashby’s transducer can then be depicted as follows:

Fig 6. Ashby’s coupled transducer

From the table of transformations below it is evident that, for the initial conditions (b, j), the coupled system briefly “selects” behaviour R3 in the first step. Immediately after that it changes the behaviour to R2 and enters an oscillatory equilibrium between the states c↔d and i↔k.

Fig 7. First set of transformations for Ashby’s coupled transducer

It is important to note in this example that the behaviour selected by the transformation Z is not applied immediately (in the same step). If we want to keep the whole machine in sync, the behaviour selected in step tₙ must be applied only after at least a unit delay (in the next step tₙ₊₁). This will become clearer when we look at Shannon’s description of a transducer in the next section.
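A minimal sketch of this stepping discipline (my own reading of Figs 4–6): the parameter value computed by Z in step tₙ is held in a unit delay and only selects R’s behaviour in step tₙ₊₁. The tables for P, Z and the R-family below are hypothetical placeholders, since the real ones live in the figures:

```python
# Hypothetical placeholder tables -- the actual ones are given in Figs 4 and 5.
P = {"i": "j", "j": "k", "k": "i"}                       # dominating transducer P
Z = {"i": 1, "j": 3, "k": 2}                             # coupling: state of P -> parameter of R
R = {                                                    # behaviours R1, R2, R3 of the transducer R
    1: {"a": "b", "b": "c", "c": "d", "d": "a"},
    2: {"a": "a", "b": "b", "c": "d", "d": "c"},
    3: {"a": "d", "b": "c", "c": "b", "d": "a"},
}

def run(p0, r0, steps):
    """Step the coupled machine; the parameter selected in step n acts only from step n+1."""
    p, r = p0, r0
    param = Z[p]                      # initial contents of the unit delay (a choice one has to make)
    for n in range(steps):
        selected = Z[p]               # behaviour selected in step t_n ...
        p, r = P[p], R[param][r]      # ... while the delayed value still governs this step
        param = selected              # applied from step t_{n+1} onwards
        print(n, p, r, param)

run("j", "b", 5)
```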

Shannon’s Transducer

Shannon published his legendary A Mathematical Theory of Communication in The Bell System Technical Journal back in 1948, eight years before the release of Ashby’s Introduction to Cybernetics. Shannon’s description of the transducer (with memory) is much simpler than Ashby’s, composed of just two functions of two variables:

yₙ = f(xₙ, αₙ)
αₙ₊₁ = g(xₙ, αₙ)

where:

  • xₙ is the nth input symbol,
  • αₙ is the state of the transducer when the nth input symbol is introduced,
  • yₙ is the output symbol (or sequence of output symbols) produced when xₙ is introduced if the state is αₙ.

Note that both functions f and g can be understood and described as complex multi-valued transformations, similar to Ashby’s description above.
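A direct transcription of these two equations into code, with f and g written as finite tables over (input symbol, state) pairs. The tables themselves are hypothetical, since Shannon’s definition only fixes the form of the two functions:

```python
# f and g as finite tables: (input symbol, state) -> output symbol / next state.
# These particular tables are made up; only the two defining equations are Shannon's.
f = {("a", 0): "x", ("a", 1): "y", ("b", 0): "y", ("b", 1): "x"}
g = {("a", 0): 1,   ("a", 1): 0,   ("b", 0): 0,   ("b", 1): 1}

def step(x_n, alpha_n):
    """y_n = f(x_n, alpha_n); alpha_{n+1} = g(x_n, alpha_n)."""
    return f[(x_n, alpha_n)], g[(x_n, alpha_n)]

alpha = 0
for x in "abba":
    y, alpha = step(x, alpha)
    print(x, y, alpha)
```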

The block diagram for Shannon’s transducer can then be depicted like this:

Fig 8. Block diagram of Shannon’s transducer

If we want to complicate things a little further, we can add another transformation (F) to the simple unit delay to get a proper “memory function”.

Fig 9. Block diagram of a complete Shannon’s transducer

We can then define the three (completely arbitrary) functional transformations in canonical notation as follows:

Fig 10. An arbitrary set of transformations for the transducer depicted in Fig 9.

Note that this selection of variables is just an example to prove a point. As Ashby teaches us, the selection can be practically anything as long as it follows the few rules described above.
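To make the timelines concrete, here is a sketch of a driver loop for the extended transducer of Fig 9. It assumes (my reading of the diagram) that the memory function F sits between g’s output and the unit delay, i.e. αₙ₊₁ = F(g(xₙ, αₙ)); the tables for f, g and F are placeholders rather than the ones from Fig 10:

```python
# Hypothetical tables standing in for the arbitrary ones of Fig 10.
f = {("a", 0): "p", ("a", 1): "q", ("a", 2): "r",
     ("b", 0): "q", ("b", 1): "r", ("b", 2): "p"}
g = {("a", 0): 1, ("a", 1): 2, ("a", 2): 0,
     ("b", 0): 2, ("b", 1): 0, ("b", 2): 1}
F = {0: 1, 1: 2, 2: 0}   # "memory function" applied on the way into the unit delay (assumed wiring)

def run(inputs, alpha0=0):
    """Produce an (input, state, output) timeline for a given input sequence."""
    alpha = alpha0
    timeline = []
    for x in inputs:
        y = f[(x, alpha)]             # y_n = f(x_n, alpha_n)
        alpha = F[g[(x, alpha)]]      # alpha_{n+1} = F(g(x_n, alpha_n))
        timeline.append((x, alpha, y))
    return timeline

# Even with a constant input the output keeps changing, because the state keeps changing.
for row in run("aaaa"):
    print(row)
```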

We can now explore two timelines describing two different behaviours of this transducer:

Fig 11. Two timelines for different input sequences

Note that even though the input in the first timeline is constant (a), the output of the system changes because the state changes. We can interpret this as if the system is “learning” and refining its “knowledge” by revisiting the same set of input data with an “upgraded” knowledge state.

In the second timeline the input sequence repeats itself, but we can see that, because the system is “state defined” and thus dynamical in nature, the state and output do not follow this regularity of the input. Note also that there is nothing “chaotic” in this system: the transformations are completely deterministic. However, even with such a limited number of variables the system can exhibit some very complex behaviour.

I’m not sure why Ashby, as far as I know, never adopted or even discussed Shannon’s description of the transducer in his enormous contribution to cybernetic theory, and instead opted for the complex and cumbersome description above, requiring a parameter and an external control system.

In my opinion, this elegant framework based on Shannon’s definition of a transducer with memory is far-reaching and has multiple fundamental repercussions for systems thinking and cybernetics:

  1. If the system is open to the flow of matter and energy but closed to the flow of information (as Ashby teaches us), then the control of the system must be internal; thus the distinction between controlling and controlled systems from classical cybernetics is wrong and should be discarded for complex dynamical systems.
  2. As a consequence of 1., strictly speaking there is no transfer of information between systems either. What is transferred between systems (transducers) are messages, signals, etc. in the form of matter or energy (wave) structures. Information used for encoding a message by the source transducer (transmitter) is then re-created by the receiving transducer (receiver) at the destination, and this is obviously not the same information.
  3. As a consequence of 2., information and knowledge are also internal to the system. Information, once committed to memory (function F), is used to build the knowledge (state α) of the system, which (knowledge) is in turn used to extract new information (function g) from external data as well as to formulate (control) the output (visible behaviour) of the system (function f).

And one last (but not least) remark: this framework is very simple and can easily be scaled and formalized in algorithmic form (as a model) for describing basically any type of dynamical system.