There is no Meaning - Only Structure

-A Personal Manifesto

Trevor Batten


1.0 Some Basic Underlying Concepts:

Seemingly identical words often appear to have different meanings in different contexts. It is therefore important to define how basic concepts are interpreted (both by the average reader and, specifically, by the author) before discussing these concepts.

Through this process of definition we may also discover that the definitions are not simply a starting point for theory-forming -but actually delineate a form of "theory" in their own right.

1.1  Hypothesis and Verification

A hypothesis (plural hypotheses) is a proposed explanation for a phenomenon. For a hypothesis to be a scientific hypothesis, the scientific method requires that one can test it. Scientists generally base scientific hypotheses on previous observations that cannot satisfactorily be explained with the available scientific theories......

......In its ancient usage, hypothesis referred to a summary of the plot of a classical drama. The English word hypothesis comes from the ancient Greek word ὑπόθεσις hypothesis whose literal or etymological sense is "putting or placing under" and hence in extended use has many other meanings including "supposition."

..... Experimenters may test and reject several hypotheses before solving the problem. (1)  <https://en.wikipedia.org/wiki/Hypothesis> (Accessed May 10, 2021)

Verification is an extra or final bit of proof that establishes something is true. (2) <https://www.vocabulary.com/dictionary/verification> (Accessed May 10, 2021)

However, the concept of "truth" is itself rather problematic.

Truth is the property of being in accord with fact or reality. In everyday language, truth is typically ascribed to things that aim to represent reality or otherwise correspond to it, such as beliefs, propositions, and declarative sentences.

Truth is usually held to be the opposite of falsehood. The concept of truth is discussed and debated in various contexts, including philosophy, art, theology, and science. Most human activities depend upon the concept, where its nature as a concept is assumed rather than being a subject of discussion; these include most of the sciences, law, journalism, and everyday life. Some philosophers view the concept of truth as basic, and unable to be explained in any terms that are more easily understood than the concept of truth itself. Most commonly, truth is viewed as the correspondence of language or thought to a mind-independent world. This is called the correspondence theory of truth. (3)  <https://en.wikipedia.org/wiki/Truth> (Accessed May 10, 2021)

Correspondence theory is a traditional model which goes back at least to some of the ancient Greek philosophers such as Plato and Aristotle.  This class of theories holds that the truth or the falsity of a representation is determined solely by how it relates to a reality; that is, by whether it accurately describes that reality.
(4) <https://en.wikipedia.org/wiki/Correspondence_theory_of_truth> (Accessed May 10, 2021)

Generally, where one can identify any class of object, the existence or essential characteristics of which is said not to depend on perceptions, beliefs, language, or any other human artifact, one can speak of "realism about" that object.
 
(5) <https://en.wikipedia.org/wiki/Reality> (Accessed May 10, 2021)

Surely, by the time one has removed all dependencies on "perceptions, beliefs, language, or any other human artifact" the relevance to humans of the Reality of the object concerned must be questionable.

To say nothing of the tautological nature of the logic which appears to see reality as truth and truth as reality.

Surely, in practice, "Reality" is not something "outside" human experience -but simply a correlation (which allows a mapping to be made) between different aspects of the human experience itself.

Simply put: If we believe that it will rain at 3 pm -and at the moment our clock tells us that it is 3 pm we get wet when we go outside -then we can conclude that our belief was true (has been verified) and the "reality" is that it rained at 3 pm. Additionally, we can also assume that any hypothesis that predicted the rain at that time has been validated (at least within this specific context).

Although "Verification" seems to refer to concepts such as "truth" and "reality" -it can operate without them -if we consider "verification" merely as a confirmation of a relationship between certain phenomena.

Under these conditions, a Hypothesis is a kind of "simulation" (explanation/model) of an experienced phenomenon -and Verification implies a congruence between our model and our (experimental) experiences of daily life.

However, the discovery of congruence should not imply a "truth" or "reality". It is merely a correlation between different sets of experience -and the precise nature of that correlation is likely to be impossible to verify for all cases, for all people, in all places and for all times. It is the later discovery of an incongruence (between our model and our experience) that drives us to refine our hypothesis and make further inquiry into the phenomenon in question.
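This notion of verification-as-congruence can be made concrete with a minimal Python sketch (all names and the toy rain model are hypothetical, invented purely for illustration): a hypothesis is just a model that predicts phenomena, and "verification" is agreement between prediction and observation -with no appeal to "truth" needed.

```python
def hypothesis(hour):
    """Toy model: predicts rain between 14:00 and 16:00."""
    return 14 <= hour <= 16

# Experiences of daily life: hour -> did we get wet?
observations = {13: False, 15: True, 17: False}

def verify(model, observations):
    """Verification = congruence between model and experience."""
    return all(model(hour) == rained for hour, rained in observations.items())

print(verify(hypothesis, observations))  # True: model and experience are congruent
observations[18] = True                  # a new, incongruent experience...
print(verify(hypothesis, observations))  # False: time to refine the hypothesis
```

Note that `verify` never asks whether the model is "really true" -it only confirms (or refutes) a correlation between two sets of experience, exactly as described above.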

1.2  Systems and Languages

A system is a group of interacting or interrelated elements that act according to a set of rules to form a unified whole. A system, surrounded and influenced by its environment, is described by its boundaries, structure and purpose and expressed in its functioning. Systems are the subjects of study of systems theory..... ...... The term "system" comes from the Latin word systēma, in turn from Greek σύστημα systēma: "whole concept made of several parts or members, system", literally "composition"...... ......A subsystem is a set of elements, which is a system itself, and a component of a larger system. (6)  <https://en.wikipedia.org/wiki/System> (Accessed May 10, 2021)

A "system" is merely a collection of parts which act together as a coherent whole.

Formal System: Noun: logic: an uninterpreted symbolic system whose syntax is precisely defined, and on which a relation of deducibility is defined in purely syntactic terms; a logistic system. Also called: formal theory, formal calculus. Compare formal language (7) <https://www.dictionary.com/browse/formal-system> (Accessed May 10, 2021)

formal language: Noun: A language designed for use in situations in which natural language is unsuitable, as for example in mathematics, logic, or computer programming. The symbols and formulas of such languages stand in precisely specified syntactic and semantic relations to one another. In logic: a logistic system for which an interpretation is provided: distinguished from formal calculus in that the semantics enable it to be regarded as about some subject matter. (8) <https://www.dictionary.com/browse/formal-language> (Accessed May 10, 2021)

So, basically a "system" is uninterpreted -while a "language" has an interpretation which allows it to refer to something outside itself. A "system" stands for itself -while a "language" enables us to create a representation (model) of something in order to study it better.

In general, a model is an informative representation of an object, person or system. The term originally denoted the plans of a building in late 16th-century English, and derived via French and Italian ultimately from Latin modulus, a measure.
(9) <https://en.wikipedia.org/wiki/Model> (Accessed May 10, 2021)

On one level, a "language" can be seen as providing a "medium" for a model. At the moment we can define "medium" as a (physical) intermediary between us and some idea or concept -rather similar to the way a "language" is also a form of intermediary.

However, for the sake of simplicity of argument, we can (at least temporarily) accept that the main difference between a system and a language is (by definition) that a system refers to itself -while a language references something else.

Then, logically, something must be either a system or a language -depending on what it refers to. Unless we have an infinite chain of languages, the chain must always bottom out in a system -which refers to itself (perhaps including, in some way, the chain of languages).

Again, logically, an infinite chain of languages literally refers to nothing and so must be considered "meaningless".

A "system" is a collection of components plus a set of relationships between the components. So any chain of languages must bottom out to a set of components and relationships.... (a structure in static terms, or a system in dynamic terms)

A detailed discussion on media is outside the scope of this text, although it has been discussed in more detail in the essays "What is Media Art" (10) <http://www.tebatt.net/SAT/CONTEXTS/ART/MediaArt.html> (Accessed May 10, 2021) and "On the Stairs: Between Old and New Media" (11) <http://www.tebatt.net/SAT/COGITATIONS/DEFINITIONS/OnTheStairs.html> (Accessed May 10, 2021).

1.3  Formal and Informal

Often it seems that, emotionally, the concepts "formal" and "informal" are like fire and water. Most people have a marked preference for one or the other -but hardly ever do people seem to exhibit an appreciation of both.

A formal system is used for inferring theorems from axioms according to a set of rules. These rules, which are used for carrying out the inference of theorems from axioms, are the logical calculus of the formal system. A formal system is essentially an "axiomatic system".

In 1921, David Hilbert proposed to use such a system as the foundation for the knowledge in mathematics. A formal system may represent a well-defined system of abstract thought....... ......Each formal system uses primitive symbols (which collectively form an alphabet) to finitely construct a formal language from a set of axioms through inferential rules of formation.

The system thus consists of valid formulas built up through finite combinations of the primitive symbols—combinations that are formed from the axioms in accordance with the stated rules.

More formally, this can be expressed as the following:

  1. A finite set of symbols, known as the alphabet, which concatenate formulas, so that a formula is just a finite string of symbols taken from the alphabet.
  2. A grammar consisting of rules to form formulas from simpler formulas. A formula is said to be well-formed if it can be formed using the rules of the formal grammar. It is often required that there be a decision procedure for deciding whether a formula is well-formed.
  3. A set of axioms, or axiom schemata, consisting of well-formed formulas.
  4. A set of inference rules. A well-formed formula that can be inferred from the axioms is known as a theorem of the formal system. (12)
    <https://en.wikipedia.org/wiki/Formal_system> (Accessed May 11, 2021)

Notice that the formal system (although uninterpreted) also encompasses a system for generating compound formulas from more primitive ones -together with inference rules, if not interpretation rules. With regard to this generative aspect, the similarity with a "language" seems obvious.
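The four ingredients listed above (alphabet, grammar, axioms, inference rules) can be illustrated with a small Python sketch of Douglas Hofstadter's well-known MIU system: an alphabet {M, I, U}, a single axiom "MI", and four purely syntactic rules from which theorems are generated mechanically -with no interpretation required at any point.

```python
AXIOMS = {"MI"}

def infer(s):
    """Apply every inference rule to one string, yielding new theorems."""
    if s.endswith("I"):          # Rule 1: xI -> xIU
        yield s + "U"
    if s.startswith("M"):        # Rule 2: Mx -> Mxx
        yield "M" + s[1:] * 2
    for i in range(len(s) - 2):  # Rule 3: replace any III with U
        if s[i:i+3] == "III":
            yield s[:i] + "U" + s[i+3:]
    for i in range(len(s) - 1):  # Rule 4: drop any UU
        if s[i:i+2] == "UU":
            yield s[:i] + s[i+2:]

def theorems(depth):
    """All strings derivable from the axioms in at most `depth` steps."""
    known = set(AXIOMS)
    for _ in range(depth):
        known |= {t for s in known for t in infer(s)}
    return known

print(sorted(theorems(2)))  # ['MI', 'MII', 'MIIII', 'MIIU', 'MIU', 'MIUIU']
```

Every theorem here follows purely from the shapes of strings -the system "stands for itself", exactly as the distinction between system and language drawn earlier would suggest.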

Informal systems do not follow any formal or pre established rules for collecting, processing, storing, or disseminating data.  (13) <https://the-definition.com/term/informal-system> (Accessed May 11, 2021)

It is also important to note that, just because the rules for collecting, processing, storing, or disseminating data are not formally stated -one should not assume that these rules do not exist.

Informal Structure

Natural theorists question the importance of formal structures over informal ones. "Informal structures are those based on the characteristics or resources of the specific participants" (Scott p. 54), and can be distinguished from formal basis by observing the changes resulting from a change in personnel at a particular position.

"Individual participants ... enter the organization with individually shaped ideas, expectations, and agendas, and they bring with them differing values, interests, and abilities". (Scott p. 54)

Yet interestingly, this informal structure is also stable. "Participants within formal organizations generate informal norms and behavior patterns: status and power systems, communication networks, sociometric structures, and working arrangements" (Scott p. 54).

These informal systems are necessary, because no one can devise a formal system that can function under all possible contingencies or remain adaptive with change.

"In sum, natural system theorists insist that highly centralized and formalized structures are doomed to be ineffective and irrational in that they waste the organization's most precious resource: the intelligence and initiative of its participants". (Scott p. 55)

Early natural theorists tended to overlook the impact of the environment on organization structure and behavior. (14) <https://faculty.babson.edu/krollag/org_site/encyclop/inform_struct.html> (Accessed May 11, 2021)

Despite the apparent antagonism between supporters of formal and informal systems (of organization)  there may be some important similarities and differences which may be worth consideration:

a. Rule Based Nature

Without implicit or explicit rules there can be no "organization" (or system). It is the nature of these rules (either explicit or implicit) that determines the characteristics of the system -including its formal or informal nature, as well as any other distinguishing features it may have.

b. Explicit Structure

An essential quality of a formal system is that the rules of the system (or language) are explicitly stated. However, one can argue over the exact meaning of "explicit": Generally, this means that the rules are written down in some form which enables anybody using the system to follow exactly the same set of rules. Precise reproduction across a wide range of different implementing systems is an important side-effect (or perhaps the main aim) of any formal system. It is this ability to reproduce exactly the results of the system that gives it its reputation for being "Objective" (and by implication real or true). However, it is important to note that in this context "Objective" simply means the system is perceived in identical ways by a wide audience -it does not actually imply a truth value.

However, any physically existing system can be observed and studied in order to understand its (behavioral) characteristics. So, one could argue that any physical system -by virtue of its physical existence- explicitly manifests its natural characteristics -even if these characteristics are not explicitly stated in words or symbols.

Under these conditions, I believe one should consider all physically manifest systems as being at least an intermediary form of "formal system" (i.e. a semi-formal system).

This is not to claim that all physical manifestations are "formal systems" -but to claim that there is not a simple binary division between "formal" and "informal" -because the difference between them is more gradual and hierarchical, depending on the degree to which the (internal) rules of operation are explicitly available to external observers -or implicitly available if not explicit.

This does imply a rather large difference between physically manifest systems and purely mental constructs which remain invisible in the mind and have no physical expression. This difference between physical and mental manifestation is possibly larger than the difference between "formal" and "informal" systems.

The importance of rules being explicit cannot be overstated. An explicit rule can be discussed and possibly changed -but covert rules can only be implemented without discussion (because their true nature is unknown). Covert rules therefore appear arbitrary and unfair (and indeed often encourage unjust or incorrect application).

Presumably, physical systems (or objects) are manifestations of the rules that created them -even though these rules are not explicitly stated. It may therefore be difficult (depending on the specific circumstances) to deduce the rules from the manifestation -but perhaps not impossible. Mental constructs, on the other hand, are human inventions -so one assumes that at least one human (the inventor) is aware of the conditions of construction or operation. If the inventor shares the rules with others then it is probable that anybody applying these rules correctly will get the same result as anybody else. So, humanly constructed systems, when provided with explicit rules, may appear (if everybody accepts the same rule-based logic) to be more objective than systems where the rules are not (or cannot be) known.

c. Hierarchy of Components and Subsystems

The more components and possible connections between those components a system exhibits, the more difficult it becomes to structure it effectively and efficiently.

This is true even in nature: Our bodies, for example, are not an undifferentiated mass of cells -but are organized in a hierarchical system of organs and other bits and pieces. This internal hierarchical structure also allows "specialization" as some groups of cells are then able to develop specific and specialized forms and functions that collectively contribute to the survival of the entire system: Even if these subsystems are unsustainable on their own, if taken out of the context of the body.

In the view of the author, these three factors should be considered equally important by those who prefer dealing with informal systems as by those who prefer dealing with formal systems.

How could it be possible, for example, to correct any undesirable phenomena in daily life, if we had no understanding of the way daily life is organized? We especially need to know how the sub-section where our apparent "problem" is located relates to other sub-systems in the system -so that we can understand how apparently positive changes to one part of the system might have adverse effects on other parts.

Nevertheless, the "Two Cultures Problem" seems to be remarkably difficult to resolve.

"The Two Cultures" is the first part of an influential 1959 Rede Lecture by British scientist and novelist C. P. Snow which was published in book form as The Two Cultures and the Scientific Revolution the same year. Its thesis was that science and the humanities which represented "the intellectual life of the whole of western society" had become split into "two cultures" and that this division was a major handicap to both in solving the world's problems. (15) <https://en.wikipedia.org/wiki/The_Two_Cultures> (Accessed May 11, 2021)

One may surmise that the two cultures problem is related to a fondness for, or an aversion to, the clarity of formal systems. Perhaps one could heal the cultural rift by appreciating the similarities between formal and informal systems -rather than focusing on their differences. However, a detailed discussion of this is beyond the scope of this paper.

1.4  Axioms, Theorems, Definitions and Paradigms

"An axiom is a proposition in mathematics and epistemology that is taken to be self-evident or is chosen as a starting point of a theory." (16) <https://en.wikipedia.org/wiki/Axiom_(disambiguation)> (Accessed May 11, 2021)

Any axiom is a statement that serves as a starting point from which other statements are logically derived. Whether it is meaningful (and, if so, what it means) for an axiom to be "true" is a subject of debate in the philosophy of mathematics. .....

......Axioms play a key role not only in mathematics but also in other sciences, notably in theoretical physics. In particular, the monumental work of Isaac Newton is essentially based on Euclid's axioms, augmented by a postulate on the non-relation of space-time and the physics taking place in it at any moment.

In 1905, Newton's axioms were replaced by those of Albert Einstein's special relativity, and later on by those of general relativity......

...... Regardless, the role of axioms in mathematics and in the above-mentioned sciences is different. In mathematics one neither "proves" nor "disproves" an axiom for a set of theorems; the point is simply that in the conceptual realm identified by the axioms, the theorems logically follow. In contrast, in physics, a comparison with experiments always makes sense, since a falsified physical theory needs modification. (17) <https://en.wikipedia.org/wiki/Axiom> (Accessed May 11, 2021)

In mathematics and logic, a theorem is a non-self-evident statement that has been proven to be true, either on the basis of generally accepted statements such as axioms or on the basis of previously established statements such as other theorems. A theorem is hence a logical consequence of the axioms....

..... Because theorems lie at the core of mathematics, they are also central to its aesthetics. Theorems are often described as being "trivial", or "difficult", or "deep", or even "beautiful". These subjective judgments vary not only from person to person, but also with time and culture: for example, as a proof is obtained, simplified or better understood, a theorem that was once difficult may become trivial. On the other hand, a deep theorem may be stated simply, but its proof may involve surprising and subtle connections between disparate areas of mathematics. Fermat's Last Theorem is a particularly well-known example of such a theorem. (18) <https://en.wikipedia.org/wiki/Theorem> (Accessed May 11, 2021)

Definition of definition

A definition is a statement of the meaning of a term. (19) <https://en.wikipedia.org/wiki/Definition_(disambiguation)> (Accessed May 11, 2021)

1b : a statement expressing the essential nature of something. (20) <https://www.merriam-webster.com/dictionary/definition> (Accessed May 11, 2021)

A paradigm is a standard, perspective, or set of ideas. A paradigm is a way of looking at something.

The word paradigm comes up a lot in the academic, scientific, and business worlds. A new paradigm in business could mean a new way of reaching customers and making money. In education, relying on lectures is a paradigm: if you suddenly shifted to all group work, that would be a new paradigm. When you change paradigms, you're changing how you think about something.
(21) <https://www.vocabulary.com/dictionary/paradigm> (Accessed May 11, 2021)

Ideas take Shape in Space:

In mathematics and logic, Axioms and Theorems seem to be inextricably linked to each other. The Theorems seem to be simply the logically inevitable effect of the nexus of axioms used to define the system.

For example, let us suppose that y = x is an axiomatic assumption of a simple system. If we plotted the graph of this for the range x = 1 to 100 -then we would plot a diagonal line. If, on the other hand, we took x^2 + y^2 = 100^2 as our defining axiom -then we would plot a circle under the same conditions as the previous plot.

This suggests to the author that axioms and theorems together form a kind of "space" -the characteristics (shape) of which are determined entirely by the axiomatic assumptions involved.
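A minimal Python sketch may make this concrete (the helper names are hypothetical, chosen only for illustration): the same point-generating procedure, fed different axioms, traces out differently shaped figures -the "theorems" (points) follow inevitably from the axiom chosen.

```python
import math

def line_points(n=100):
    """Axiom y = x: every generated point satisfies it, giving a diagonal."""
    return [(x, x) for x in range(1, n + 1)]

def circle_points(r=100, steps=100):
    """Axiom x**2 + y**2 == r**2: the solutions form a circle of radius r."""
    return [(r * math.cos(2 * math.pi * k / steps),
             r * math.sin(2 * math.pi * k / steps)) for k in range(steps)]

# The shape of each "space" is entirely determined by its axiom:
assert all(y == x for x, y in line_points())
assert all(abs(x * x + y * y - 100 ** 2) < 1e-6 for x, y in circle_points())
```

Changing the axiom changes the entire family of points at once -nothing else in the procedure needs to change.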

This is obviously literally so in the case of mathematics and mathematical formulas -but perhaps it is also (metaphorically) true in daily life: If we assume our neighbour is an idiot -or is evil -then, presumably, their reaction to us is likely to be shaped by our assumptions regarding them.

For the sake of brevity I have omitted any detailed exploration of the possible relationships between such concepts as "paradigm", "axiom", "assumption", "definition", "model", "medium", "language", "transformation", "process", "object", "state", "space" and "system". However, this is certainly a topic that requires further exploration.

Indeed, I believe that at some point it will be essential to be able to understand these various conceptual modelling terminologies and technologies in terms of each other -and to be able to translate from one to another.

The concept of "space" has already been addressed by the author in "What is Space" (22) <http://www.tebatt.net/SAT/Space-Lecture/Space.01.html> (Accessed May 11, 2021) and "Some Personal Remarks on the Creative Potential of Space (Towards an Einsteinian Turing machine)" (23) <http://www.tebatt.net/SAT/CONTEXTS/ART/CircusSpace.html> (Accessed May 11, 2021).


However, it is also intended to be discussed further in future: At present it is important to note that (in this context) the "axiomatic space" has a set of characteristics (including "shape") that are determined (entirely?) as a consequence of our basic assumptions. If we change our basic assumptions (axioms) then the space (and its characteristic distinguishing theorems) also changes as a consequence of the change at the level of the axioms.

Shaped Spaces:

 

In order to understand such concepts as "Mapping" and "Multi-Dimensional Conceptual Space" fully, we may also need to see "Space" not as an empty volume -but as a multidimensional set of ordered locations (organized in terms of the dimensions, or parameters of the space). These "locations" can be seen as addressable (labeled) points through which something might travel -or as places where something might be stored. In this context, one might view “space” as an “ordered set” -where the organization of the space determines the order of the elements within the set (or dimension if viewed as a space).

The way these locations are related to each other, and the locations one needs to pass through in order to connect one location to another, determine the geometry and topology of the space; i.e. the shape of the space involved.
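A small Python sketch of this idea (the grid, the `wrap` parameter and all names are hypothetical illustrations): the same set of addressable locations, given a different rule for which locations connect, yields a space with a different shape -measured here as the number of locations one must pass through.

```python
from collections import deque

def neighbours(loc, wrap=None):
    """4-connected grid; with `wrap`, the edges join up into a torus."""
    x, y = loc
    steps = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    if wrap:
        steps = [(a % wrap, b % wrap) for a, b in steps]
    return steps

def distance(start, goal, wrap=None):
    """Breadth-first search: how many locations must be passed through."""
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        loc, d = frontier.popleft()
        if loc == goal:
            return d
        for nxt in neighbours(loc, wrap):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, d + 1))

# Same labelled locations, different connection rule, different "shape":
print(distance((0, 0), (0, 9)))           # 9 on the flat grid
print(distance((0, 0), (0, 9), wrap=10))  # 1 on the torus: the edge wraps round
```

Notice that nothing about the locations themselves changed -only the rule relating them -yet the geometry of the space is entirely different.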

The concept of “space” is useful in a creative context because (like other forms of formal systems) it allows arbitrary assumptions to be represented in a way that is accessible to other people in addition to the person authoring it.

Navigating Computation

Originally, "computers" were people (mostly women) who performed the (almost) mechanical labour of computing such socially essential things as the mathematical tables used in ships' navigation -or doing the grunt work in scientific astronomical calculations.

Fundamentally, "computation" is the rule-based mapping of a set of input values into a set of output values. If we consider our current location within a certain space to be an "input value" -and where we would like to be as an "output value" -then it is the computational mapping between the two that determines our movement through that space.

Obviously, the specific computational process itself determines the relationship between input and output in every case.

The Syntactic Triangle:

Although perhaps not obvious to the non-mathematician, the nature of the computation is defined by the relationship between the elements of the output set -in relation to the input set.

For example, "adding 10" would shift all the numbers in the output set 10 places along the consecutive number string. However, the distance between the individual numbers in both input and output sets would be the same as before the mapping. Subtraction would shift them all in the other direction (leaving them just as equally spaced). Multiplication would spread the numbers out -while division would bring them all back closer together.

In that sense -the computational process treats numbers rather like a sheep dog herding sheep.
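The "sheep dog" effect can be sketched in a few lines of Python (the `spacing` helper is a hypothetical name): each arithmetic mapping rearranges the gaps between the elements of the input set in its own characteristic way.

```python
inputs = [1, 2, 4, 8]

def spacing(values):
    """The gaps between consecutive elements of a set of numbers."""
    return [b - a for a, b in zip(values, values[1:])]

print(spacing(inputs))                    # [1, 2, 4]
print(spacing([x + 10 for x in inputs]))  # [1, 2, 4]: shifted, gaps unchanged
print(spacing([x * 3 for x in inputs]))   # [3, 6, 12]: gaps spread out
print(spacing([x / 2 for x in inputs]))   # [0.5, 1.0, 2.0]: herded closer together
```

It is precisely these characteristic patterns of rearrangement that make each process recognizable from its effects.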

Because computation (and in fact most processes) establishes a (usually) recognizable pattern of transformation between input and output sets -it is (usually) possible to deduce the process by comparing the input set with the output set.

Similarly, the output can (usually) be predicted if one knows the original input and the proposed process -or the input can be deduced if one knows the process and the output.

This "information" (the difference that makes a difference) is what usually helps us to understand the world around us -because not only can we predict outcomes, we can also deduce the process behind any change or imagine the original state before the change  occurred -as long as we know any two of the input, output, process trilogy.
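The input/process/output trilogy can be sketched in Python (the candidate rules and all names here are hypothetical): knowing the input and the output, the process can often be recovered by testing candidate rules against the data.

```python
# A small catalogue of candidate processes to test against the data.
candidates = {
    "add 10":      lambda x: x + 10,
    "subtract 10": lambda x: x - 10,
    "multiply 3":  lambda x: x * 3,
}

def deduce_process(inputs, outputs):
    """Return the names of the candidate rules consistent with the data."""
    return [name for name, f in candidates.items()
            if [f(x) for x in inputs] == outputs]

print(deduce_process([1, 2, 3], [11, 12, 13]))  # ['add 10']
print(deduce_process([1, 2, 3], [3, 6, 9]))     # ['multiply 3']
```

The same scheme runs the other way: given a process and an output, one could search for the input that produces it -any two members of the trilogy (usually) determine the third.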

For completeness -we must also note that some processes are fully describable -but not predictable in the terms described above. This is particularly so when ambiguity is present -i.e. instead of every unique input pattern having a unique output pattern -several inputs may produce the same output (or different processes produce the same output, etc.).
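Squaring is a simple illustration of this ambiguity (sketched here in Python with hypothetical names): the process is fully describable, yet because two inputs share one output, the input cannot be deduced from the output alone.

```python
def square(x):
    return x * x

# Forward direction: fully describable, every input has one output.
outputs = {x: square(x) for x in range(-3, 4)}

# Inverting the mapping loses information: several inputs per output.
inverse = {}
for x, y in outputs.items():
    inverse.setdefault(y, []).append(x)

print(inverse[4])  # [-2, 2]: two candidate inputs, the trilogy breaks down
```

Here the output 4 is congruent with two distinct histories -which is exactly why such processes resist the kind of deduction described above.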

Syntax and Intelligence:

Clearly, when a machine can take on a task which previously required a human -there has been a transfer of "skill" from the human to the machine. Performance of that task is no longer dependent on the skill of an individual human -but on the design and construction of the machine.

Formal (explicit) skills and processes embedded in a machine are easier to reproduce than the "informal" (covert) skills of a human.

Obviously, the mechanization of human skills has great economic advantages -because it allows (almost) anybody to perform actions that previously would have been limited to a talented few.

On the other hand, the downside is that the human ability to perform that task is likely to become atrophied -if the human performer has become redundant.

Atrophying intellectual skills can be a serious social problem if automation continues to encompass cognitive tasks previously reserved for humans. The more reliant we become on machines, the more likely we are to lose the skills that enable us to test and control those machines.

The GUI:

The introduction of the Graphical User Interface revolutionized computing (and computer marketing) because it made (a more limited) control of the machine accessible to those with absolutely no understanding of the machine itself -especially in cases where users were unwilling to understand the machine they were becoming so dependent on.

When the GUI was introduced, some computer experts were worried by the future prospect of having an army of "dumb" people operating "clever" machines....

The proponents of the GUI claimed that this would never happen -but instead we would end up with intelligent humans operating clever machines  (instead of clever people operating dumb machines as at the time).....

I'm afraid it is the contention of the author that practical experience over time has proven the GUI supporters to be completely wrong.

The Universal Simulation Machine:

Because the Turing Machine (which is seen as a basic model for the computational process) presented a very generalized and abstract model of computation, it was considered to be a universal simulation machine that could simulate any process that could be described in computational terms.

This opens up the possibility of extending the concept of computability into the realms of Artificial Intelligence -or to any other area in which a simulation might be useful -as a  practical training tool, an investigative technique or even as an economical substitute for human mental labour.
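A Turing machine of the kind described above can itself be sketched in a few lines of Python (the rule-table format and all names are hypothetical simplifications): a table of purely syntactic rules (state, symbol) -> (write, move, next state) drives a head over an unbounded tape. This particular rule table increments a binary number.

```python
def run(rules, tape, state="start"):
    """Run a Turing machine until it reaches the 'halt' state."""
    cells = {i: c for i, c in enumerate(tape)}
    head = len(tape) - 1                  # start at the rightmost bit
    while state != "halt":
        symbol = cells.get(head, "_")     # unwritten cells read as blank "_"
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, "_") for i in range(lo, hi + 1)).strip("_")

# Binary increment: carry leftwards, flipping 1->0 until a 0 (or blank) is found.
increment = {
    ("start", "1"): ("0", "L", "start"),
    ("start", "0"): ("1", "R", "halt"),
    ("start", "_"): ("1", "R", "halt"),
}

print(run(increment, "1011"))  # 1100
```

Swapping in a different rule table makes the same `run` loop perform an entirely different process -which is the sense in which the machine is "universal".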

Translating from Hardware to Software:

One of the great mysteries of human cognition is how one bridges the gap between the biological neurons of the brain and a mental Theory of Relativity (or any other great mental achievement).

Obviously, the physical "hardware" of the human brain is different in many ways from the hardware of an electronic computer. However, surely the basic problem is the same: How does one translate a set of low level and ignorant "bits" into an apparently intelligently acting "mental system"?

Surely, understanding this process in an electronic machine will help us to clarify how the process operates in a biological (neurological) machine -at least the principles involved are similar, even if the actual components and mechanisms involved are, obviously, different.

If the computer is truly a universal simulation machine -then it should be able to simulate the human brain (in various ways).

Obviously, the prime test of any theory is that the subject of the theory actually behaves in ways that can be predicted and controlled by the theory. However, for various reasons, some experiments cannot ethically be performed -and so the theory cannot be proved except by applying accepted logical principles to an accepted set of assumptions. But how do we test this logic and these assumptions?

Presumably, practical "simulation" in such cases, is a much better way of demonstrating the effectiveness of a theoretical model than a "thought model" which has no physical manifestation and can only be tested within the limitations of the human mind.

Computers are physical machines that not only compute static answers -they are also ideal laboratories for testing ideas and theories through simulation.

Indeed, perhaps the human brain also understands the world around it largely through its own attempts to simulate its environment. How else should we see our conceptual models of the world?


Dynamic Spaces

Various instructions and data are stored within a Turing Machine in labelled locations in the machine’s memory system: Computation is then performed by reading the instruction in a specified location, retrieving the required data from its specified storage space and, after performing the previously specified instruction, storing the result in some other labelled location.

Under these conditions, a (memory) space becomes an active computational system, which can modify the content of its own memory. As mentioned earlier: “Computation” is a process that maps a set of input data into a set of output data (possibly, but not necessarily, in the same location within the machine's memory system).
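A toy stored-program machine can make this concrete. The opcodes below ("ADD", "JMP", "HALT") are invented for this sketch; the point is that program and data share one labelled memory, and the instruction pointer navigating that space writes results back into the very space it is reading from.

```python
# Toy stored-program machine: instructions and data share one addressable memory.
# Invented opcodes: ("ADD", a, b, dst), ("JMP", addr), ("HALT",)

def run(memory):
    ip = 0  # instruction pointer navigating the memory space
    while True:
        op = memory[ip]
        if op[0] == "HALT":
            return memory
        if op[0] == "ADD":            # memory[dst] = memory[a] + memory[b]
            _, a, b, dst = op
            memory[dst] = memory[a] + memory[b]
            ip += 1
        elif op[0] == "JMP":          # redirect the instruction pointer
            ip = op[1]

# Addresses 0-2 hold the program; 3-5 hold data (3 and 4 inputs, 5 output).
mem = [
    ("ADD", 3, 4, 5),   # 0: write the sum of cells 3 and 4 into cell 5
    ("HALT",),          # 1: stop
    ("JMP", 0),         # 2: (unused here)
    2, 40, 0,           # 3, 4: input data; 5: output cell
]
result = run(mem)
print(result[5])  # -> 42
```

After running, the memory space has been modified by its own contents: the "output" is simply the same space, transformed.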


Spacetime:

Pedagogical representations of Einstein's spacetime often use the image of a ball bearing rolling over a rubber sheet to illustrate how spacetime changes when objects move through it.

To the author, there is an obvious similarity between a ball bearing distorting a rubber sheet as it rolls over it and the modification of the memory space as the instruction pointer navigates through it.

"Spacetime" may be popularly  seen as an exotic and bizarre concept of theoretical physics -but it might also prove to be a basic principle concerning the way our daily lives function and evolve -if we can perhaps just see ourselves and our environment as "Computational Systems" operating in spacetime.....

1.5  Provisional Hypotheses regarding Interpretation:

i. Verification through correlation (mapping) is philosophically preferable to a belief in absolute truth -because it avoids the apparently impossible task of defining "reality" in any absolute sense.

ii. By the definitions given earlier, a system is a "thing in itself" -while a "language" is a system that either acts as a "model" for something else or enables us to generate such models. Until we can prove otherwise, we shall assume that whatever is being modeled by the "language" is, in fact, itself a system. So there is little conceptual difference between that which is modeled and that which models it. Theoretically, they are even interchangeable, but in practice, unless they are identical, each "system" will have characteristics that are not present in the other -and where one set of characteristics is a super-set of the other (i.e. one set contains the other, or overlaps it) it may be more convenient (in practical terms) to consider one as the model and the other as that which is being modeled. In some cases, considerations of danger or the impossibility of manipulation (for whatever reason) may make one system more amenable to being the "model" than the other. Nevertheless, for theoretical purposes it may be advantageous to see both the model and that which is modeled as "systems" (things in themselves), with an arbitrary assignment as to which is which: Otherwise one gets involved in complex and unsolvable metaphysical problems regarding the concept of "meaning".

iii. Although the concepts of "formal" and "informal" often seem to be treated as irreconcilable opposites -in fact they are more like a graduated spectrum, based on the degree to which the organizational and functional rules are explicit. In this sense, a physical "artwork" might be given a place on this spectrum -because its physical manifestation makes explicit some (if not all) of the rules which underlie its execution and interpretation.

iv. In formal and informal systems (or languages), axioms and definitions play similar paradigmatic roles in shaping the space in which any transformations or interpretations must take place.

v. From personal experience we may additionally note that there is a difference between inventing a language and using that language. When a language (or system) is invented, the inventor may not be able to oversee all the implications and uses to which their discovery may lead. It is indeed through the successful practical use of a language to simulate (or express) certain systems -that the power and suitability of that language for any given task becomes clear.



2.0   -Finding A Personal Context for "Media Art":

2.1  A Romantic Escape from the Industrial Monster

Since his death in 1900, the influence of John Ruskin has kept a powerful grip on the English art world, and beyond -so that many cultural aficionados have had great difficulty in understanding technology in relation to the world of Art.

John Ruskin (8 February 1819 – 20 January 1900) was the leading English art critic of the Victorian era, as well as an art patron, draughtsman, watercolourist, philosopher, prominent social thinker and philanthropist.....

..... He was hugely influential in the latter half of the 19th century and up to the First World War. After a period of relative decline, his reputation has steadily improved since the 1960s with the publication of numerous academic studies of his work. Today, his ideas and concerns are widely recognised as having anticipated interest in environmentalism, sustainability and craft.

Ruskin first came to widespread attention with the first volume of Modern Painters (1843), an extended essay in defence of the work of J. M. W. Turner in which he argued that the principal role of the artist is "truth to nature". From the 1850s, he championed the Pre-Raphaelites, who were influenced by his ideas. His work increasingly focused on social and political issues......

.....Ruskin argued that it was an expression of the artisan's joy in free, creative work. The worker must be allowed to think and to express his own personality and ideas, ideally using his own hands, rather than machinery.

We want one man to be always thinking, and another to be always working, and we call one a gentleman, and the other an operative; whereas the workman ought often to be thinking, and the thinker often to be working, and both should be gentlemen, in the best sense. As it is, we make both ungentle, the one envying, the other despising, his brother; and the mass of society is made up of morbid thinkers and miserable workers. Now it is only by labour that thought can be made healthy, and only by thought that labour can be made happy, and the two cannot be separated with impunity.

— John Ruskin, The Stones of Venice vol. II: Cook and Wedderburn 10.201.

This was both an aesthetic attack on, and a social critique of, the division of labour in particular, and industrial capitalism in general. This chapter had a profound impact, and was reprinted both by the Christian socialist founders of the Working Men's College and later by the Arts and Crafts pioneer and socialist William Morris.
(24) <https://en.wikipedia.org/wiki/John_Ruskin> (Accessed May 11, 2021)

Ruskin's work seems quite understandable in terms of a reaction against the horrors of the Victorian Industrial revolution, with its fire belching iron works, dangerous factory conditions, its pollution of natural resources and its destruction of the artisan class.

The Arts and Crafts movement emerged from the attempt to reform design and decoration in mid-19th century Britain. It was a reaction against a perceived decline in standards that the reformers associated with machinery and factory production.
(25) <https://en.wikipedia.org/wiki/Arts_and_Crafts_movement> (Accessed May 11, 2021)

This romantic, escapist, approach  remained fairly central to the (British) cultural mainstream until challenged by the pre-WWII German "Bauhaus" movement -which strove to introduce an aesthetic more suitable to the machine age: "Form follows function" became the new motto and a new machine friendly aesthetic evolved.

The Staatliches Bauhaus (German: [ˈʃtaːtlɪçəs ˈbaʊˌhaʊs]), commonly known as the Bauhaus (German: "building house"), was a German art school operational from 1919 to 1933 that combined crafts and the fine arts. The school became famous for its approach to design, which attempted to unify the principles of mass production with individual artistic vision and strove to combine aesthetics with everyday function.
(26) <https://en.wikipedia.org/wiki/Bauhaus> (Accessed May 11, 2021)

2.2  An Alternative Tradition:

There were of course several other conceptual threads, mostly in Mainland Europe, which supported the development of the German Bauhaus: In this context a "thread" is understood to be a connected chain of related ideas or phenomena.

Constructivism was an artistic and architectural philosophy that originated in Russia beginning in 1915 by Vladimir Tatlin and Alexander Rodchenko. Abstract and austere, constructivist art aimed to reflect modern industrial society and urban space. The movement rejected decorative stylization in favor of the industrial assemblage of materials. Constructivists were in favour of art for propaganda and social purposes, and were associated with Soviet socialism, the Bolsheviks and the Russian avant-garde.

Constructivist architecture and art had a great effect on modern art movements of the 20th century, influencing major trends such as the Bauhaus and De Stijl movements. Its influence was widespread, with major effects upon architecture, sculpture, graphic design, industrial design, theatre, film, dance, fashion and, to some extent, music......  ......The term Construction Art was first used as a derisive term by Kazimir Malevich to describe the work of Alexander Rodchenko in 1917. Constructivism first appears as a positive term in Naum Gabo's Realistic Manifesto of 1920.
(27) <https://en.wikipedia.org/wiki/Constructivism_(art)> (Accessed May 11, 2021)

Perhaps one could view the Ruskin tradition as being more "informal" (in the terms discussed in section 1) -while the Constructivist tradition was interested in understanding a more "formal" approach to the production of art (and design).

De Stijl, Dutch for "The Style", also known as Neoplasticism, was a Dutch art movement founded in 1917 in Leiden. De Stijl consisted of artists and architects. In a more narrow sense, the term De Stijl is used to refer to a body of work from 1917 to 1931 founded in the Netherlands. Proponents of De Stijl advocated pure abstraction and universality by a reduction to the essentials of form and colour; they simplified visual compositions to vertical and horizontal, using only black, white and primary colors. (28) <https://en.wikipedia.org/wiki/De_Stijl> (Accessed May 11, 2021)

Indeed, a more formal approach may be essential to any understanding of a machine based world -simply because the functioning of the "machine" needs to be formalized (made explicit) and understood before any machine construction can begin.

By the 1960s it seems that the tradition of Ruskin, although being revived, was also being seriously challenged from different sides:

Pop art is an art movement that emerged in the United Kingdom and the United States during the mid- to late-1950s. The movement presented a challenge to traditions of fine art by including imagery from popular and mass culture, such as advertising, comic books and mundane mass-produced objects. One of its aims is to use images of popular (as opposed to elitist) culture in art, emphasizing the banal or kitschy elements of any culture, most often through the use of irony. It is also associated with the artists' use of mechanical means of reproduction or rendering techniques. In pop art, material is sometimes visually removed from its known context, isolated, or combined with unrelated material.
(29) <https://en.wikipedia.org/wiki/Pop_art> (Accessed May 11, 2021)

One could perhaps see "Pop Art" as being a shift of emphasis from "Nature" to "Society" (plus an acceptance of both popular culture and machine reproduction). However, from the author’s perspective, there was perhaps little fundamental change from the idea of art as a reflection of, or on, an (idealized) natural or social environment.

The images of the Constructivists, however, were literally constructed by the artist... They do not exist in nature or society (except perhaps as abstract (mathematical) concepts). The "Constructivist/Bauhaus" tradition was therefore fundamentally much more artificial in essence than other art forms of the time.

Both nature and society are organic (autonomously evolving) systems -while a machine needs to be consciously constructed according to some basic plan. 


A “construct” is usually considered to be man-made -and therefore more “artificial” than autonomous natural or social developments.


A construct also tends to have a “function” rather than a “meaning” -and “function” might be an easier concept to understand than the somewhat more problematic concept of “meaning”.

2.3  A Flood of Artistic Innovation:

By the 1960s we had Kinetic Art and E.A.T. (Experiments in Art and Technology) as alternatives to Ruskin's approach. The journal "Leonardo" also supported a more "rational" and "technological" approach to visual art. A flourishing opposition to the tradition of Ruskin seemed to be developing.

Kinetic art is art from any medium that contains movement perceivable by the viewer or depends on motion for its effect. Canvas paintings that extend the viewer's perspective of the artwork and incorporate multidimensional movement are the earliest examples of kinetic art. More pertinently speaking, kinetic art is a term that today most often refers to three-dimensional sculptures and figures such as mobiles that move naturally or are machine operated .... ....The moving parts are generally powered by wind, a motor or the observer. Kinetic art encompasses a wide variety of overlapping techniques and styles. ..... .... By the early 1900s, certain artists grew closer and closer to ascribing their art to dynamic motion. Naum Gabo, one of the two artists attributed to naming this style, wrote frequently about his work as examples of "kinetic rhythm". He felt that his moving sculpture Kinetic Construction (also dubbed Standing Wave, 1919–20) was the first of its kind in the 20th century. From the 1920s until the 1960s, the style of kinetic art was reshaped by a number of other artists who experimented with mobiles and new forms of sculpture.
(30) <https://en.wikipedia.org/wiki/Kinetic_art> (Accessed May 12, 2021)

" Kepes, who came to MIT in 1946, edited and published the influential seven-volume Vision and Value series in 1965-66. In 1967 he founded MIT’s Center for Advanced Visual Studies (CAVS), a laboratory for interdisciplinary art practice and artistic research and the first one of its kind."
(31) <http://act.mit.edu/people/director/gyorgy-kepes/> (Accessed May 12, 2021)


"The founding of E.A.T. in 1966 with Robert Whitman and Fred Waldhauer illustrates another important collaboration for Rauschenberg, between art and technology. Soundings—Rauschenberg’s first work with E.A.T. engineers—is the culmination of his rejection of the “predigested” image..... .....While Rauschenberg worked with technology for almost forty years of his career, his technology-based works have a relatively small exhibition history and have received less critical attention than his other work."
(32) <https://www.artsy.net/article/ashley-duhrkoop-less-like-a-target-more-like-a> (Accessed May 12, 2021)

Leonardo is a peer-reviewed academic journal published by the MIT Press covering the application of contemporary science and technology to the arts and music.....  ..... Leonardo journal was established in 1968 by artist and scientist Frank Malina in Paris, France. Leonardo has published writings by artists who work with science- and technology-based art media for 50 years.
(33) <https://en.wikipedia.org/wiki/Leonardo_(journal)> (Accessed May 12, 2021)

2.4  A Unifying Thought?

Machine: 1 a piece of equipment with moving parts that uses power such as electricity to do a particular job
(34) <https://www.ldoceonline.com/dictionary/machine> (Accessed May 12, 2021)

A Machine is a unified collection of parts -which means that a machine is a "system". Both nature and society are also (semi-) unified collections of parts.

Could "Systems Theory" provide a unifying set of concepts?

Systems theory is the interdisciplinary study of systems, which are cohesive groups of interrelated, interdependent parts that can be natural or human-made. Every system is bounded by space and time, influenced by its environment, defined by its structure and purpose, and expressed through its functioning. A system may be more than the sum of its parts if it expresses synergy or emergent behavior.
(35) <https://en.wikipedia.org/wiki/Systems_theory> (Accessed May 12, 2021)

2.5  1984 -A Psychological Turning Point (In the Wrong Direction):

By 1984 the neo-liberal economic policies promoted by U.S. President Ronald Reagan and British Prime Minister Margaret Thatcher were causing some people to wonder if the dystopian novels of Orwell and Huxley might not turn out to be quite prophetic.

Nineteen Eighty-Four: A Novel, often referred to as 1984, is a dystopian social science fiction novel by English novelist George Orwell. It was published on 8 June 1949 by Secker & Warburg as Orwell's ninth and final book completed in his lifetime. Thematically, Nineteen Eighty-Four centres on the consequences of totalitarianism, mass surveillance, and repressive regimentation of persons and behaviours within society. Orwell, himself a democratic socialist, modelled the authoritarian government in the novel after Stalinist Russia. More broadly, the novel examines the role of truth and facts within politics and the ways in which they are manipulated.

The story takes place in an imagined future, the year 1984, when much of the world has fallen victim to perpetual war, omnipresent government surveillance, historical negationism, and propaganda.

In the decades since the publication of Nineteen Eighty-Four, there have been numerous comparisons to Huxley's Brave New World, which had been published 17 years earlier, in 1932. They are both predictions of societies dominated by a central government and are both based on extensions of the trends of their times. However, members of the ruling class of Nineteen Eighty-Four use brutal force, torture and mind control to keep individuals in line, while rulers in Brave New World keep the citizens in line by addictive drugs and pleasurable distractions. Regarding censorship, in Nineteen Eighty-Four the government tightly controls information to keep the population in line, but in Huxley's world, so much information is published that readers do not know which information is relevant, and what can be disregarded.

Elements of both novels can be seen in modern-day societies, with Huxley's vision being more dominant in the West and Orwell's vision more prevalent with dictators, including those in communist countries, as is pointed out in essays that compare the two novels, including Huxley's own Brave New World Revisited.
(36) <https://en.wikipedia.org/wiki/Nineteen_Eighty-Four> (Accessed May 12, 2021)

Meanwhile, the Two Cultures problem apparently also remained generally unresolved:

"The Two Cultures" is the first part of an influential 1959 Rede Lecture by British scientist and novelist C. P. Snow which was published in book form as The Two Cultures and the Scientific Revolution the same year. Its thesis was that science and the humanities which represented "the intellectual life of the whole of western society" had become split into "two cultures" and that this division was a major handicap to both in solving the world's problems.
(37) <https://en.wikipedia.org/wiki/The_Two_Cultures> (Accessed May 12, 2021)

However, just as some artists were becoming involved in a more formal approach to art production -the science of "Cybernetics" was also making some inroads into the social sciences and art.

Cybernetics is a transdisciplinary approach for exploring regulatory and purposive systems—their structures, constraints, and possibilities. The core concept of the discipline is circular causality or feedback—that is, where the outcomes of actions are taken as inputs for further action. Cybernetics is concerned with such processes however they are embodied, including in environmental, technological, biological, cognitive, and social systems, and in the context of practical activities such as designing, learning, managing, and conversation.

Cybernetics has its origins in the intersection of the fields of control systems, electrical network theory, mechanical engineering, logic modeling, evolutionary biology, neuroscience, anthropology, and psychology in the 1940s, often attributed to the Macy Conferences. Since then, cybernetics has become even broader in scope to include work in domains such as design, family therapy, management and organisation, pedagogy, sociology, and the creative arts. At the same time, questions arising from circular causality have been explored in relation to the philosophy of science, ethics, and constructivist approaches. Contemporary cybernetics thus varies widely in scope and focus, with cyberneticians variously adopting and combining technical, scientific, philosophical, creative, and critical approaches.......

In science, the human mind and individuals are often observed as autonomous and interconnected systems, allowing the cybernetic approach to be leveraged in those fields of study as well...... .....  By examining group behavior through the lens of cybernetics, sociologists can seek the reasons for such spontaneous events as smart mobs and riots, as well as how communities develop rules such as etiquette by consensus without formal discussion. Affect Control Theory explains role behavior, emotions, and labeling theory in terms of homeostatic maintenance of sentiments associated with cultural categories.
(38) <https://en.wikipedia.org/wiki/Cybernetics> (Accessed May 12, 2021)


However, despite these various cultural shifts, even in the 80's it was still fairly inconceivable that non-technical people would, or could, become involved with computers. Until the arrival of the Macintosh.

The Macintosh (mainly Mac since 1998) is a family of personal computers designed, manufactured, and sold by Apple Inc. since January 1984.

The original Macintosh is the first successful mass-market personal computer to have featured a graphical user interface, built-in screen, and mouse. Apple sold the Macintosh alongside its popular Apple II, Apple III, and Apple Lisa families of computers until the other models were discontinued in the 1990s.

The Macintosh was introduced by a US$1.5 million Ridley Scott television commercial, "1984". It aired during the third quarter of Super Bowl XVIII on January 22, 1984, and is now considered a "watershed event" and a "masterpiece". McKenna called the ad "more successful than the Mac itself." "1984" used an unnamed heroine to represent the coming of the Macintosh (indicated by a Picasso-style picture of the computer on her white tank top) as a means of saving humanity from the "conformity" of IBM's attempts to dominate the computer industry. The ad alludes to George Orwell's novel Nineteen Eighty-Four, which described a dystopian future ruled by a televised "Big Brother".
(39) <https://en.wikipedia.org/wiki/Macintosh> (Accessed May 12, 2021)

The main selling point of the Macintosh was the GUI (Graphical User Interface), which meant that, instead of typing commands into the computer, one could operate it by clicking on symbols representing the task to be performed.

Suddenly, the computer was no longer obviously a rule-based functional machine requiring technical understanding to use -but an enjoyable and perhaps even amusing toy-like commercial product that could be controlled entirely by the simple gestures of pointing and clicking. The formal nature of computers had been discarded -and everybody could now play freely with them, relieved of the need to understand what they were doing.

This ease of operation was also cleverly combined with easy-to-use programs that mimicked the early experiments of avant-garde artists and designers. The subversive Dadaist collage had now become a basic tool for "Cut and Paste" techniques in a range of creative fields.

Paint for Mac is designed to let people explore their creativity or complete simple image editing tasks. The easy-to-use software serves as a reminder of the good old days when people were getting familiar with using computers. The versatile tool comes with all the features that Paintbrush for Windows was known for and lets users create beautiful artwork in no time.
(40) <https://paintbrush.en.softonic.com/mac> (Accessed May 12, 2021)

Living in Holland in 1984, it seemed to the author that people from the non-technical "social sector" held their breath for a short while, wondering if 1984 would literally materialize: Then, when the Macintosh appeared, they gave a big sigh of relief and immediately started implementing the shadowy world of ultra-subjective postmodernism: Doublethink at its best!

It is the author's belief that the doublethink resulting from the rejection of almost all forms of formal thinking by the cultural literati has led directly to much of the irrational and often violent social extremism witnessed today. 

The reinterpretation of a formal computational  machine as an informal communication machine has encouraged the spread of ideas without any form of (formal) constraint upon them. Under these conditions, social harmony (or even mutual understanding) would seem impossible -simply because there is no common system of logic or belief that could form a bridge between opposing parties.

3.0 Towards A More Personal Integrated Approach

3.1 A Short Personal History:

My own art educational experience took place mostly between 1962 and 1967, roughly at the intersection of the above-mentioned Ruskinian and Constructivist approaches to art production: My first year at art school was decidedly Ruskin-based -and part of a, by then outmoded, national exam system -while my second year (at a different art college) was more Bauhaus orientated. However, my third year, in which I started a newly implemented degree-level course (with exams set by the college and not the government), was decidedly back to Ruskin and nature. Ruskin's inheritance was a constant source of conflict between my mentors and myself.

After a series of chance meetings, I became interested in the idea of using a computer as a medium for making art. With the help of a Mathematics Ph.D. student at Exeter University, in 1967 drawings made with a plotter attached to a computer programmed in Algol became part of my final examination work in sculpture and printmaking at Exeter College of Art in the UK.

In 1972, I exhibited two screen prints based on computer-generated drawings in the Bradford Print Biennial of that year.

This was the same year I left England for Holland -in order to gain access to a computer, which at that time was only possible via educational or research institutions. For around ten years I was actively involved in personal research at the Institute for Sonology, then in Utrecht.


Unfortunately, I was not a musician but a tone-deaf visual artist (with no musical background) studying in an environment entirely dedicated to classical electronic studio music: So I could not fully understand the language of music -but I could appreciate and learn from its integrated approach to cognitive, emotional and sensory modes of thought and feeling.

Then, when I finally returned to the visual art world I was astonished by its apparent failure, and even lack of desire, to develop a coherent and meaningful language comparable to that of music.

By language in this context I mean the language that is used by the creative individual to model (and thus create) the system that is the (auditory or visual) artwork itself. This should not be confused with the language that is used by the listener or viewer (and theorist) to make some sense of the material presented to them. These are two different languages with two different functions.

 

So starting as an outsider in the world of music -I had also become an outsider in the world of visual art. An interest in a pan-disciplinary approach to the various media of artistic expression was inevitable.

My experiences at the Institute for Sonology had laid the foundation for my further work as an artist/programmer, a teacher and an art theorist.

3.2 A Short Personal Hypothesis regarding the Need to Understand Systems:

3.2.0 Languages describe Systems:

In the tradition of Ruskin the "artwork" is a representational language which refers to something outside the  artwork itself.  Although the artwork itself is generally appreciated too, it is the system of interpretations outside the artwork that is often considered to be the main issue. The artwork itself is then merely an intermediary language (or medium)  for the main subject referred to by the artwork.

However, within the Constructivist tradition, the "artwork" is usually a system that refers to itself. These are not contradictory truths -they are just different axiomatic systems.

It seems to me that if one asks what is the true nature of "that which is outside the artwork but is its main subject" -then one must conclude that it is either another language referring to something else outside itself (in an infinite regress) -or it is ultimately a system that only refers to itself.

The artist may create an image that represents (or suggests) in some way a society struggling with some social problem -but society is in fact a system and the "problem" is actually symptomatic of the inner structure of that social system. It is the belief of the author that, in this case, looking at the actual structure of the system is more likely to solve the problem than fixating on the "expression" of the problem through the artwork.

One may need to be aware of a problem before one can solve it -but in order to solve the underlying problem one needs to be able to define the system which generates the problem one wishes to solve: This is necessary in order to discover exactly which internal or external interactions are actually creating the "problem".

Although these remarks apply to problem solving in general, the primary “problem” faced by the artist is the practical problem of how to create the work of art:

Will the work be based on existing assumptions (either those of the artist or those in more general use) or will the work be based on new assumptions?

In this context the task of the artist is basically to select a set of assumptions and then look for ways to implement these through the work. Existing assumptions can of course be implemented in novel ways. Via the various processes involved in the creation of a work of art the “language of art” is itself extended.

Work based on new assumptions automatically creates a paradox -because such work presents the problem that, although it does not obey the existing assumptions of art production, it is still to be considered “art”. So how does one decide whether this new work can be accepted or not?

Under these conditions the relevance and importance of the various assumptions (axioms) can be effectively tested.

It is therefore necessary for the artist to decide if they are aiming to confirm existing assumptions -or trying to transcend existing ones. An idea may be useful in one context but not in another. Because contexts are important, it is essential for the artist to understand exactly in which context they are currently operating (often within a complex nexus of contexts).

It is also important to note that the work of art tests the functionality of the assumptions (within the context of the art work itself) -it does not test their “truth” value -because “truth” has no meaning outside a specific nexus of assumptions. If Joe is Mary’s uncle then it is true that Mary is Joe’s niece. But this is only true within that family: It is not true that outside the family, everybody called Mary must be a niece of everybody called Joe. In another family, somewhere, Joe might, for example, be the brother of Mary.
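This relativity of “truth” can be sketched computationally. The following is a hypothetical illustration (the relation name and the family data are invented, not drawn from the text above): a statement like “Mary is Joe’s niece” can only be evaluated against the facts of one particular “family” system.

```python
# Truth as a property of a specific relational system:
# the same statement may hold in one system and fail in another.

def is_niece(family, person, of):
    """True if `person` is a niece of `of` within this family's recorded facts."""
    return (of, person) in family["uncle_of"]

# Two unrelated families, each a closed system of facts.
family_a = {"uncle_of": {("Joe", "Mary")}}   # here Joe is Mary's uncle
family_b = {"uncle_of": set()}               # here Joe and Mary are, say, siblings

print(is_niece(family_a, "Mary", "Joe"))  # True  -within family A
print(is_niece(family_b, "Mary", "Joe"))  # False -the "same" statement fails here
```

The statement itself is unchanged; only the system of assumptions against which it is evaluated differs.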

3.2.1 Axioms and fashion:

The theories of Ruskin may appear outmoded nowadays, but they were effectively active in the author's formative years.

Their own formative years are important to most people, and in a changing society with a range of age differences there will also be a range of formative influences -which in turn can lead to a diversity of beliefs and conceptual approaches: Different ages, different experiences. Variations in geographical location and in personal or cultural situation are also liable to lead to variations in an individual's personal experience.

If managed properly this can lead to a creative exchange of ideas and values -but if this diversity is not managed properly it can lead to conflict and social polarization.

In a society which has a powerful belief in truth and progress, ideas and theories considered outmoded (and therefore false) are likely to be discarded so the latest fashion can take their place.

If however, theories are seen as axioms -where the truth value is irrelevant outside of the system of relationships that develop naturally from these axioms -then the theory retains its value in terms of the system it called into being. One may still reject a system of thought -for a variety of reasons -but then these reasons should at least have some rational grounding (related to their function) and not be based on the vagaries of intellectual or social fashion.

In science, Quantum Physics hasn’t entirely replaced Relativity Theory -which hasn’t entirely replaced Newtonian Physics. Each of these theories has a practical use within its appropriate domain of application. The wider our choice of axiomatic systems, the greater the chance of finding the most suitable conceptual tool for solving any practical problem we might face.
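The idea that each theory is a tool with its own domain of application can be made concrete with a small numerical sketch (an illustrative example, not part of the original argument): Newtonian and relativistic kinetic energy agree closely at everyday speeds and diverge sharply as speed approaches that of light, so which formula is “useful” depends entirely on the regime.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def ke_newton(m, v):
    """Newtonian kinetic energy: adequate when v is much smaller than c."""
    return 0.5 * m * v**2

def ke_relativistic(m, v):
    """Relativistic kinetic energy: (gamma - 1) * m * c^2."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma - 1.0) * m * C**2

m = 1.0  # kg
for v in (30.0, 0.9 * C):  # roughly a car's speed vs. a near-light speed
    print(f"v={v:.3g} m/s  Newton={ke_newton(m, v):.6g} J  "
          f"Relativistic={ke_relativistic(m, v):.6g} J")
```

At 30 m/s the two results are practically indistinguishable; at 0.9c the Newtonian figure is wrong by a large factor. Neither formula is “true” in the abstract -each is useful within its own domain.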

Society should not be arguing over the truth or falsehood of ideas (which are ultimately virtually impossible to prove) -but should instead debate the usefulness of specific ideas in relation to the solution of specific problems: Under which conditions might an idea be useful, and under which circumstances is it less useful? Sometimes an idea which is of no use in one context might lead to an important breakthrough in another.

3.2.2 An Ecology of Differences:

Ultimately, there is no absolute "meaning" -only an entangled set of interconnected sub-systems that interact in ways that are experienced as either positive or negative to the participants.

Deciding on the advantages and disadvantages of any system will however remain "subjective" (a matter of individual or group choice) and problematic as long as people have different needs and desires -and live under different circumstances.

Thermodynamic theory and ecological practice suggest that differences within the system are what keeps the system going. An entirely homogeneous society would have no differential to stimulate energy flow -and it would die.

However, the differences within society can also cause it to self-destruct, if these differences lead to violent conflict and are not harmoniously and creatively integrated within our social systems.


3.2.3 Diversity as a Scarce Resource:

In a society vibrant with the diversity of conceptual dialogue, diversity becomes a valuable creative resource and is not just an empty buzzword.

In an unpolarized world, the range of subtle nuances between the various available social choices is obviously much wider than in a polarized system, where one basically has a maximum of two choices (for -or against). Sometimes, social pressures even remove the possibility of opposing something.

The more choices there are to explore -the greater the chance that one or more of these choices will lead to productive discoveries.

3.2.4 Preserving Conceptual Diversity:

There are in fact many popular trends which can lead to a reduction of diversity.

The more internally connected a system is, the more it will tend towards homogeneity.

Mass communication systems and globalized distribution systems tend to reduce local differences. Similar experiences tend to support similar belief systems. The more physical diversity is reduced -the more conceptual diversity is likely to be reduced too.
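The claim that greater internal connectivity drives a system towards homogeneity can be illustrated with a toy opinion-averaging simulation (a purely illustrative sketch; the model, agent count, and parameters are all invented for this example): agents on a sparsely connected ring retain their differences far longer than agents in a fully connected network.

```python
import random

def step(opinions, neighbours, weight=0.5):
    """Each agent moves partway toward the average opinion of its neighbours."""
    new = []
    for i, x in enumerate(opinions):
        avg = sum(opinions[j] for j in neighbours[i]) / len(neighbours[i])
        new.append((1 - weight) * x + weight * avg)
    return new

def spread(opinions):
    """Diversity measure: distance between the extreme opinions."""
    return max(opinions) - min(opinions)

n = 12
random.seed(1)
initial = [random.random() for _ in range(n)]

# A sparsely connected ring vs. a fully connected network.
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
full = {i: [j for j in range(n) if j != i] for i in range(n)}

ring_ops, full_ops = initial[:], initial[:]
for _ in range(10):
    ring_ops = step(ring_ops, ring)
    full_ops = step(full_ops, full)

print(f"spread after 10 steps: ring={spread(ring_ops):.4f}  "
      f"fully connected={spread(full_ops):.4f}")
```

In the fully connected case the spread of opinion collapses almost immediately, while the ring preserves a measurable diversity for much longer -a crude analogue of mass communication eroding local differences.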

Although the role of mass media and social engineering is way beyond the scope of this paper -it is important to be aware of the social consequences of any reduction in the levels of conceptual diversity within a community and between communities.

The commercial exploitation of our cultural heritage is leading to an increase in the value of the certified professional cultural expert. This manifests itself in the way the power of art theory seems to be growing within an over-intellectualized art practice based on public funding. It is encouraged by an upgraded (university-level) art education -which must surely either undermine the intellectual integrity of the academic system -or lead artists to see intellectual systems as more important than sensory systems (which are traditionally the backbone of the arts). How does one fit the personal and emotive aspects of art into an academic context?

How wise is it to globalize the cultural experiences and cultural wisdom of various communities around the world? Is it wise to remove cultural practices from their natural birthplaces? Can one set of cultural beliefs fit all people in all locations for all times?

Academic theory -by setting standards for the education of artists, the funding of their work through grants, and the dissemination of that work via publicly funded distribution systems -encourages certain dogmas to flourish and others to be still-born. Under these conditions, art practice is more likely to be led by art theory, instead of the theory growing naturally out of the practice. The more unified the global educational system becomes, the greater the danger of intellectual conformity.


When art theory becomes prescriptive -rather than descriptive and explanatory -then there is no place any more for artists. If innovative artistic practice has no place in the system -then who or what will generate the required level of social diversity for a healthy social dialogue?

Can “Best Practices” really be suitable for all situations -or are they able to differentiate and operate effectively under different conditions in a variety of cultural contexts?




3.3.0 Playing the Game:

Ritual processes, as practiced in sport, art and science (and perhaps life itself), are all rule-based games.


Because games are systems that essentially refer to themselves, the meaning of the game is embedded in the process of playing the game.


One may use a game metaphorically as a language, referring specifically to something outside itself -but the inherent value and meaning of the game lies within itself.


To play the game well, one has to understand the rules of the game.

3.4.0 Ends and Means:

If a person fully understands the "Means" to achieve something, then they can often control the resulting "Ends": Knowledge is Power.


However, if one only understands the desired "Ends", then one has no understanding of the "Means" to achieve them: Ignorance is disenfranchising.

3.5.0 The Two Tribes:

To the author, it seems that this problem of interpretation (or not) is rather like a huge mountain with its summit way above the clouds:

Below the cloud live a tribe called the “Syntactics” -while above the cloud live a tribe called the “Semantics”. Nobody ever dares to pass through the cloud, so the two tribes never meet and have no knowledge or understanding of each other's existence. The Syntactics, being below the cloud, understand how their world is rooted in the rest of the planet, and they understand how the rain from the cloud nourishes the soil and helps their crops grow. The Syntactics do not bother themselves with what might exist above the cloud. Life is full of hard work tilling the soil and harvesting the crops -but they also enjoy the fruits of their labour and regularly celebrate their good fortune. They live relatively harmoniously, because they understand the way their various differences contribute to the well-being of all.

The Semantics however, believe that their world floats on a cloud, and there is nothing underneath that cloud. They eat the mushrooms that grow there but do not ask themselves how or why they grow. They do not believe in physical labour but spend their time arguing over all sorts of things that exist only in the minds of the arguing parties. Disharmony rules because they have no way to integrate the various differences of opinion and belief that each individual professes.

In order to harmoniously integrate opposing forces we have to stay beneath the cloud on the mountain so that we can understand the Syntactics and not be led into an ultimately meaningless conflict by the Semantics.


Trevor Batten
Baclayon, Bohol
The Philippines
May 10 - November 30 2021

This paper was originally written for the Journal of Comparative Literature and Aesthetics <http://jcla.in/>
-but was rejected for being too Hermetic and Eccentric.


