

Faculty: Martin Zwick Publication Abstracts

(most recent papers on top; click on research area for paper list)

Discrete Multivariate Modeling | ALife | Systems Theory and Philosophy

Discrete Multivariate Modeling

Reconstructability Analysis of Genetic Loci Associated with Alzheimer Disease

Patricia Kramer, Shawn Westaway, Martin Zwick, and Stephen Shervais

Reconstructability Analysis (RA) is an information- and graph-theory-based method which has been successfully used in previous genomic studies. Here we apply it to genetic (14 SNPs) and non-genetic (Education, Age, Gender) data on Alzheimer disease in a well-characterized Case/Control sample of 424 individuals. We confirm the importance of APOE as a predictor of the disease, and identify one non-genetic factor, Education, and two SNPs, one in BIN1 and the other in SORCS1, as likely disease predictors. SORCS1 appears to be a common risk factor for people with or without APOE. We also identify a possible interaction effect between Education and BIN1. Methodologically, we introduce and use to advantage some more powerful features of RA not used in prior genomic studies.
View full article

Reconstructability of Epistatic Functions

Martin Zwick, Joe Fusion, and Beth Wilmot

Background: Reconstructability Analysis (RA) has been used to detect epistasis in genomic data [1]; in that work, even the simplest RA models (variable-based models without loops) gave performance superior to two other methods. A follow-on theoretical study [2] showed that RA also offers higher-resolution models, namely variable-based models with loops and state-based models, likely to be even more effective in modeling epistasis, and also described several mathematical approaches to classifying types of epistasis.

Methods: The present paper extends this second study by discussing a non-standard use of RA: the analysis of epistasis in quantitative as opposed to nominal variables; such quantitative variables are, for example, encountered in genetic characterizations of gene expression, e.g., eQTL data. Three methods are investigated for applying variable- and state-based RA to quantitative dependent variables: (i) k-systems analysis, which treats continuous function values as pseudo-frequencies, (ii) b-systems analysis, which derives continuous values from binned DVs using expected value calculations, and (iii) u-systems analysis, which treats continuous function values as pseudo-utilities subject to a lottery. These methods are demonstrated and compared on synthetic data.

Results: The three methods of k-, b-, and u-systems analyses, both variable-based and state-based, are then applied to a published SNP dataset. A preliminary search is done with b-systems analysis, followed by more refined k- and u-systems searches. The analyses suggest candidates for epistatic interactions that affect the level of gene expression. As in the synthetic data studies, state-based RA is more powerful than variable-based RA.

Conclusions: While the previous RA studies looked at epistasis in nominal (or discretized) data, this paper shows that RA can also analyze epistasis in quantitative expression data without discretizing this data. Since RA can also model epistasis in frequency distributions and detect linkage disequilibrium, its successful application here also to continuous functions shows that it offers a flexible methodology for the analysis of genomic interaction effects.
View full article

Reconstructability Analysis of Epistasis

Martin Zwick

The literature on epistasis describes various methods to detect epistatic interactions and to classify different types of epistasis. Reconstructability analysis (RA) has recently been used to detect epistasis in genomic data. This paper shows that RA offers a classification of types of epistasis at three levels of resolution (variable-based models without loops, variable-based models with loops, state-based models). These types can be defined by the simplest RA structures that model the data without information loss; a more detailed classification can be defined by the information content of multiple candidate structures. The RA classification can be augmented with structures from related graphical modeling approaches. RA can analyze epistatic interactions involving an arbitrary number of genes or SNPs and constitutes a flexible and effective methodology for genomic analysis.
View full article (unformatted)
The official (formatted) pdf of this article is available from the journal, or from PubMed, or, for researchers in non-profit institutions, from the author by request.

Reconstructability Analysis as a Tool for Identifying Gene-Gene Interactions in Studies of Human Diseases

Stephen Shervais, Patricia Kramer, Shawn Westaway, Nancy Cox and Martin Zwick

There are a number of common human diseases for which the genetic component may include an epistatic interaction of multiple genes. Detecting these interactions with standard statistical tools is difficult because there may be an interaction effect, but minimal or no main effect. Reconstructability analysis (RA) uses Shannon's information theory to detect relationships between variables in categorical datasets. We applied RA to simulated data for five different models of gene-gene interaction, and found that even with heritability levels as low as 0.008, and with the inclusion of 50 non-associated genes in the dataset, we can identify the interacting gene pairs with an accuracy of 80%. We applied RA to a real dataset of type 2 non-insulin-dependent diabetes (NIDDM) cases and controls, and closely approximated the results of more conventional single SNP disease association studies. In addition, we replicated prior evidence for epistatic interactions between SNPs on chromosomes 2 and 15.
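The key quantity here is the reduction in Shannon uncertainty about disease status provided by a set of predictors. The sketch below only illustrates that calculation on invented data with an XOR-style interaction (the actual analyses used the RA software and simulated disease models, not this toy); it shows how a SNP pair can be informative even when neither SNP is informative alone.

```python
import numpy as np
from collections import Counter

def entropy(symbols):
    """Shannon entropy (bits) of a sequence of discrete symbols."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def uncertainty_reduction(y, X):
    """H(Y) - H(Y | X): information about y captured by the predictor columns in X."""
    X = np.atleast_2d(np.asarray(X))
    if X.shape[0] != len(y):               # accept a single predictor passed as a 1-D array
        X = X.T
    keys = [tuple(row) for row in X]
    h_y_given_x = 0.0
    for key in set(keys):
        idx = [i for i, k in enumerate(keys) if k == key]
        h_y_given_x += (len(idx) / len(y)) * entropy([y[i] for i in idx])
    return entropy(y) - h_y_given_x

# Toy XOR-style penetrance: disease depends on the SNP pair, not on either SNP alone.
rng = np.random.default_rng(0)
snp_a = rng.integers(0, 2, 2000)
snp_b = rng.integers(0, 2, 2000)
disease = (snp_a ^ snp_b) ^ (rng.random(2000) < 0.05)      # small amount of noise

print(uncertainty_reduction(disease, snp_a))                # ~0 bits: no main effect
print(uncertainty_reduction(disease, snp_b))                # ~0 bits: no main effect
print(uncertainty_reduction(disease, np.c_[snp_a, snp_b]))  # large: purely epistatic effect
```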
View full article
View earlier published version of this paper

Reconstructability Analysis Detects Genetic Variation Associated with Gene Expression

Beth Wilmot, Martin Zwick, Shannon McWeeney

Gene expression is a quantitative trait which varies among individuals and can be affected by polymorphisms in DNA sequence. Integrative analysis of gene expression and genetic variation patterns provides a functional context for discovery of complex trait susceptibility genes.
Reconstructability analysis (RA) is an information-theoretic methodology for multivariate categorical data that overlaps log-linear modeling and Bayesian networks. With the simplest RA analysis, namely the calculation of mutual information, we detected associations between SNP alleles and binned gene expression values. We used RA to replicate and extend SNP–gene expression associations in a public data set (Myers et al. 2007, Nature Genetics 39:1494). For the purposes of this pilot study, only the data from SNPs and expression from the published positive cis-acting SNP–gene expression associations were included. True epistatic interactions were identified by low linkage disequilibrium (LD) between the SNP pairs and a significant information gain.
RA identified all but two cis-acting SNPs (false negative rate = 2.7%) and detected additional cis-acting SNPs. In addition, RA identified cis-, trans-acting SNP pairs significantly associated with gene expression among these genes, indicating RA may be well suited for detection of epistatic interactions.
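As a rough illustration of the simplest RA analysis described above, the sketch below bins an invented quantitative expression value into three categories, computes its mutual information with a SNP genotype, and converts that to the standard likelihood-ratio (G) statistic for a significance test. The genotype model, effect size, and binning scheme are assumptions for illustration, not the Myers et al. data or pipeline.

```python
import numpy as np
from scipy.stats import chi2

def mutual_information_bits(x, y):
    """I(X;Y) in bits, plus sample size and chi-square degrees of freedom."""
    x, y = np.asarray(x), np.asarray(y)
    xs, ys = np.unique(x), np.unique(y)
    mi = 0.0
    for xv in xs:
        for yv in ys:
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                mi += pxy * np.log2(pxy / (np.mean(x == xv) * np.mean(y == yv)))
    return mi, len(x), (len(xs) - 1) * (len(ys) - 1)

rng = np.random.default_rng(1)
genotype = rng.integers(0, 3, 500)                   # 0/1/2 copies of the minor allele (invented)
expression = 0.4 * genotype + rng.normal(size=500)   # hypothetical cis effect plus noise
binned = np.digitize(expression, np.quantile(expression, [1/3, 2/3]))   # three equal-count bins

mi, n, df = mutual_information_bits(genotype, binned)
g = 2.0 * n * np.log(2.0) * mi                       # likelihood-ratio (G) statistic from MI in bits
print(f"I = {mi:.3f} bits, G = {g:.1f}, p = {chi2.sf(g, df):.2g}")
```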

Using Reconstructability Analysis for Input Variable Reduction: A Business Example

Stephen Shervais and Martin Zwick

We demonstrate the use of Reconstructability Analysis (RA) on the UCI Australian Credit dataset to reduce the number of input variables for two different analysis tools. Using 14 variables, an artificial neural net (NN) is able to predict whether or not credit was granted, with a 79.1% success rate. RA preprocessing allows us to reduce the number of independent variables from 14 to two different sets of three, which have success rates of 77.2% and 76.9% respectively. The difference between these rates and that of the 14-variable NN is not statistically significant. The three-variable rulesets given by RA achieve success rates of 77.8% and 79.7%. Again, the difference between those values and the rate of the 14-variable NN is not statistically significant; that is, our approach provides a three-variable model that is competitive with the 14-variable equivalent.
View full article

Ordering Genetic Algorithm Genomes With Reconstructability Analysis: Discrete Models

Stephen Shervais and Martin Zwick

The building block hypothesis implies that genetic algorithm effectiveness is influenced by the relative location of epistatic genes on the chromosome. We demonstrate this with a discrete-valued problem, based on Kauffman's NK model, and show that information-theoretic reconstructability analysis can be used to decide on optimal gene ordering. 
View full article

Application of Information-Theoretic Data Mining Techniques in a National Ambulatory Practice Outcomes Research Network

Adam Wright, Thomas N. Ricciardi and Martin Zwick

The Medical Quality Improvement Consortium (MQIC) data warehouse contains de-identified data on more than 3.6 million patients including their problem lists, test results, procedures and medication lists. This study uses reconstructability analysis (RA), an information-theoretic data mining technique, on the MQIC data warehouse to empirically identify risk factors for various complications of diabetes including myocardial infarction and microalbuminuria. The risk factors identified match those reported in the literature, demonstrating the utility of the MQIC data warehouse for outcomes research and of RA as a technique for mining clinical data warehouses.
View full article

An Overview of Reconstructability Analysis

Martin Zwick

This paper is an overview of reconstructability analysis (RA), an approach to discrete multivariate modeling developed in the systems community. RA includes set-theoretic modeling of relations and information-theoretic modeling of frequency and probability distributions. It thus encompasses both statistical and non-statistical problems. It overlaps with logic design and machine learning in engineering and with log-linear modeling in the social sciences. Its generality gives it considerable potential for knowledge representation and data mining. 
View full article

Reconstructability Analysis with Fourier Transforms

Martin Zwick

Fourier methods used in 2- and 3-dimensional image reconstruction can also be used in reconstructability analysis (RA). These methods maximize a variance-type measure instead of information-theoretic uncertainty, but the two measures are roughly collinear and the Fourier approach yields results close to those of standard RA. The Fourier method, however, does not require iterative calculations for models with loops. Moreover, the error in Fourier RA models can be assessed without actually generating the full probability distributions of the models; calculations scale with the size of the data rather than the state space. State-based modeling using the Fourier approach is also readily implemented. Fourier methods may thus enhance the power of RA for data analysis and data mining.
View full article

Multi-Level Decomposition of Probabilistic Relations

Stanislaw Grygiel, Martin Zwick and Marek Perkowski

Two methods of decomposition of probabilistic relations are presented. They consist of splitting relations (blocks) into pairs of smaller blocks related to each other by new variables, generated in such a way as to minimize a cost function that depends on the size and structure of the result. The decomposition is repeated iteratively until a stopping criterion is met. The topology and contents of the resulting structure develop dynamically in the decomposition process and reflect relationships hidden in the data.
View full article

A Software Architecture for Reconstructability Analysis

Kenneth Willett and Martin Zwick

Software packages for Reconstructability Analysis (RA), as well as for related log-linear modeling, generally provide a fixed set of functions. Such packages are suitable for end-users applying RA in various domains, but do not provide a platform for research into the RA methods themselves.

A new software system, Occam3, is being developed which is intended to address three goals which often conflict with one another: to provide (1) a general and flexible infrastructure for experimentation with RA methods and algorithms; (2) an easily-configured system allowing methods to be combined in novel ways, without requiring deep software expertise; and (3) a system which can be easily utilized by domain researchers who are not computer specialists.

Meeting these goals has led to an architecture which strictly separates functions into three layers: the Core, which provides representation of datasets, relations, and models; the Management Layer, which provides extensible objects for development of new algorithms; and the Script Layer, which allows the other facilities to be combined in novel ways to address a particular domain analysis problem. 
View full article | Occam3 Manual

A Comparison of Modified Reconstructability Analysis and Ashenhurst-Curtis Decomposition of Boolean Functions

Anas Al-Rabadi, Martin Zwick and Marek Perkowski

Modified Reconstructability Analysis (MRA), a novel decomposition technique within the framework of set-theoretic (crisp possibilistic) Reconstructability Analysis, is applied to 3-variable NPN-classified Boolean functions. MRA is superior to conventional Reconstructability Analysis (CRA), i.e. it decomposes more NPN functions. MRA is compared to Ashenhurst-Curtis (AC) decomposition using two different complexity measures: log-functionality, a measure suitable for machine learning, and the count of the total number of two-input gates, a measure suitable for circuit design. MRA is superior to AC using the first of these measures, and is comparable to, but different from AC, using the second. 
View full article

Reversible Modified Reconstructability Analysis of Boolean Circuits and Its Quantum Computation

Anas Al-Rabadi and Martin Zwick

Modified Reconstructability Analysis (MRA) can be realized reversibly by utilizing Boolean reversible (3,3) logic gates that are universal in two arguments. The quantum computation of the reversible MRA circuits is also introduced. The reversible MRA transformations are given a quantum form by using the normal matrix representation of such gates. The MRA-based quantum decomposition may play an important role in the synthesis of logic structures using future technologies that consume less power and occupy less space. 
View full article

Modified Reconstructability Analysis for Many-Valued Logic Functions

Anas Al-Rabadi and Martin Zwick

A novel many-valued decomposition is presented using the framework of Reconstructability Analysis. In previous work, modified Reconstructability Analysis (MRA) was applied to Boolean functions, where it was shown that most Boolean functions not decomposable using conventional Reconstructability Analysis (CRA) are decomposable using MRA. Also, it was previously shown that whenever decomposition exists in both MRA and CRA, MRA yields simpler or equal complexity decompositions. In this paper, lossless set-theoretic MRA is extended to many-valued logic functions, and logic structures that correspond to such decomposition are developed.
View full article

State-Based Reconstructability Analysis

Martin Zwick and Michael S. Johnson

Reconstructability analysis (RA) is a method for detecting and analyzing the structure of multivariate categorical data. While Jones and his colleagues extended the original variable-based formulation of RA to encompass models defined in terms of system states, their focus was the analysis and approximation of real-valued functions. In this paper, we separate two ideas that Jones had merged together: the “g to k” transformation and state-based modeling. We relate the idea of state-based modeling to established variable-based RA concepts and methods, including structure lattices, search strategies, metrics of model quality, and the statistical evaluation of model fit for analyses based on sample data. We also discuss the interpretation of state-based modeling results for both neutral and directed systems, and address the practical question of how state-based approaches can be used in conjunction with established variable-based methods. 
View full article

Reconstructability Analysis Detection of Optimal Gene Order in Genetic Algorithms

Martin Zwick and Stephen Shervais

The building block hypothesis implies that genetic algorithm efficiency will be improved if sets of genes that improve fitness through epistatic interaction are near to one another on the chromosome. We demonstrate this effect with a simple problem, and show that information-theoretic reconstructability analysis can be used to decide on optimal gene ordering. 
View full article

Directed Extended Dependency Analysis for Data Mining

Thaddeus T. Shannon and Martin Zwick

Extended Dependency Analysis (EDA) is a heuristic search technique for finding significant relationships between nominal variables in large datasets. The directed version of EDA searches for maximally predictive sets of independent variables with respect to a target dependent variable. The original implementation of EDA was an extension of reconstructability analysis. Our new implementation adds a variety of statistical significance tests at each decision point that allow the user to tailor the algorithm to a particular objective. It also utilizes data structures appropriate for the sparse datasets customary in contemporary data mining problems. Two examples that illustrate different approaches to assessing model quality tests are given. 
View full article

Ordering Genetic Algorithm Genomes With Reconstructability Analysis

Stephen Shervais and Martin Zwick

The building block hypothesis implies that genetic algorithm effectiveness is influenced by the relative location of epistatic genes on the chromosome. We find that this influence exists, but depends on the generation in which it is measured. Early in the search process it may be more effective to have epistatic genes widely separated. Late in the search process, effectiveness is improved when they are close together. The early search effect is weak but still statistically significant; the late search effect is much stronger and plainly visible. We demonstrate both effects with a set of simple problems, and show that information-theoretic reconstructability analysis can be used to decide on optimal gene ordering. 
View full article

Using Reconstructability Analysis to Select Input Variables for Artificial Neural Networks

Stephen Shervais and Martin Zwick

We demonstrate the use of Reconstructability Analysis to reduce the number of input variables for a neural network. Using the heart disease dataset we reduce the number of independent variables from 13 to two, while providing results that are statistically indistinguishable from those of NNs using the full variable set. We also demonstrate that rule lookup tables obtained directly from the data for the RA models are almost as effective as NNs trained on model variables. 
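The "rule lookup tables obtained directly from the data" can be pictured as a mapping from each observed combination of the RA-selected variables to its majority class. The sketch below shows that idea on invented data; the three selected variables and the outcome rule are hypothetical, not the actual heart disease variables.

```python
import numpy as np
from collections import Counter, defaultdict

def fit_rule_table(X, y):
    """Map each observed combination of the selected variables to its majority class."""
    cells = defaultdict(Counter)
    for row, label in zip(map(tuple, X), y):
        cells[row][label] += 1
    default = Counter(y).most_common(1)[0][0]         # fallback for unseen combinations
    return {row: c.most_common(1)[0][0] for row, c in cells.items()}, default

def predict(table, default, X):
    return np.array([table.get(tuple(row), default) for row in X])

# Hypothetical data: three RA-selected categorical inputs, binary outcome.
rng = np.random.default_rng(2)
X = rng.integers(0, 3, size=(300, 3))
y = ((X[:, 0] + X[:, 1] > 2) ^ (rng.random(300) < 0.1)).astype(int)

table, default = fit_rule_table(X[:200], y[:200])
print("holdout accuracy:", (predict(table, default, X[200:]) == y[200:]).mean())
```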
View full article

Enhancements to Crisp Possibilistic Reconstructability Analysis

Anas N. Al-Rabadi and Martin Zwick

Modified Reconstructibility Analysis (MRA), a novel decomposition within the framework of set-theoretic (crisp possibilistic) Reconstructibility Analysis, is presented. It is shown that in some cases while 3-variable NPN-classified Boolean functions are not decomposable using Conventional Reconstructibility Analysis (CRA), they are decomposable using Modified Reconstructibility Analysis (MRA). Also, it is shown that whenever a decomposition of 3-variable NPN-classified Boolean functions exists in both MRA and CRA, MRA yields simpler or equal complexity decompositions. A comparison of the corresponding complexities of Ashenhurst-Curtis (AC) decomposition and MRA is also presented. While both AC and MRA decompose some but not all NPN-classes, MRA decomposes more classes, and consequently more Boolean functions. MRA for many-valued functions is also presented, and algorithms using two different methods (intersection and union) are given. A many-valued case is presented where CRA fails to decompose but MRA decomposes.
View full article

An Information Theoretic Methodology for Prestructuring Neural Networks

Bjorn Chambless, George G. Lendaris, Martin Zwick

Absence of a priori knowledge about a problem domain typically forces the use of overly complex neural network structures. An information-theoretic method based on calculating information transmission is applied to training data to obtain a priori knowledge that is useful for prestructuring (reducing the complexity of) neural networks. The method is applied to a continuous system, and it is shown that such prestructuring reduces the training time and enhances generalization capability.
View full article

Wholes and Parts in General Systems Methodology

Martin Zwick

Reconstructability analysis (RA) decomposes wholes, namely data in the form either of set-theoretic relations or multivariate probability distributions, into parts, namely relations or distributions involving subsets of variables. Data is modeled and compressed by variable-based decomposition, by more general state-based decomposition, or by the use of latent variables. Models, which specify the interdependencies among the variables, are selected to minimize error and complexity. 
View full article [ pdf | postscript ]

State-Based Reconstructability Modeling for Decision Analysis

Michael S. Johnson and Martin Zwick

Reconstructability analysis (RA) is a method for detecting and analyzing the structure of multivariate categorical data. Jones and his colleagues extended the original variable-based formulation of RA to encompass models defined in terms of system states (Jones 1982; Jones 1985; Jones 1985; Jones 1986; Jones 1989). In this paper, we demonstrate that Jones' previous work comprises two separable ideas: the "g to k" transformation and state-based modeling. We relate the concept of state-based modeling to established variable-based RA methods (Klir 1985; Krippendorff 1986), and demonstrate that state-based modeling, when applied to event and decision tree models, is a valuable adjunct to the variable-based sensitivity analyses commonly employed in risk and decision modeling. Examples are provided to illustrate the approach, and issues associated with the interpretation of state-based sensitivity analyses are explored. 
View full article [ pdf | postscript ]

Prestructuring Neural Networks for Pattern Recognition Using Extended Dependency Analysis

George G. Lendaris, Thaddeus T. Shannon and Martin Zwick

We consider the problem of matching domain-specific statistical structure to neural-network (NN) architecture. In past work we have considered this problem in the function approximation context; here we consider the pattern classification context. General Systems Methodology tools for finding problem-domain structure suffer exponential scaling of computation with respect to the number of variables considered. Therefore we introduce the use of Extended Dependency Analysis (EDA), which scales only polynomially in the number of variables, for the desired analysis. Based on EDA, we demonstrate a number of NN pre-structuring techniques applicable for building neural classifiers. An example is provided in which EDA results in significant dimension reduction of the input space, as well as capability for direct design of an NN classifier.
View full article [ pdf | postscript ]

Complexity Reduction in State-Based Modeling

Martin Zwick

For a system described by a relation among qualitative variables (or quantitative variables "binned" into symbolic states), expressed either set-theoretically or as a multivariate joint probability distribution, complexity reduction (compression of representation) is normally achieved by modeling the system with projections of the overall relation. To illustrate, if ABCD is a four variable relation, then models ABC:BCD or AB:BC:CD:DA, specified by two triadic or four dyadic relations, respectively, represent simplifications of the ABCD relation. Simplifications which are lossless are always preferred over the original full relation, while simplifications which lose constraint are still preferred if the reduction of complexity more than compensates for the loss of accuracy.

State-based modeling is an approach introduced by Bush Jones, which significantly enhances the compression power of information-theoretic (probabilistic) models, at the price of significantly expanding the set of models which might be considered. Relation ABCD is modeled not in terms of the projected relations which exist between subsets of the variables but rather in terms of a set of specific states of subsets of the variables, e.g., (Ai, Bj, Ck), (Cl, Dm), and (Bn). One might regard such state-based, as opposed to variable-based, models as utilizing an "event"- or "fact"-oriented representation. In the complex systems community, even variable-based decomposition methods are not widely utilized, and these state-based methods are still less widely known. This talk will compare state- and variable-based modeling, and will discuss open questions and research areas posed by this approach.
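For a loopless variable-based model such as ABC:BCD, the maximum-uncertainty reconstruction has a closed form, so the information lost by the decomposition can be computed directly; models with loops (e.g., AB:BC:CD:DA) and state-based models require iterative fitting not shown here. A minimal sketch on an invented random distribution:

```python
import numpy as np

rng = np.random.default_rng(3)
p = rng.random((2, 2, 2, 2))
p /= p.sum()                              # observed joint distribution over variables A, B, C, D

p_abc = p.sum(axis=3, keepdims=True)      # projection onto ABC
p_bcd = p.sum(axis=0, keepdims=True)      # projection onto BCD
p_bc  = p.sum(axis=(0, 3), keepdims=True) # shared margin BC

q = p_abc * p_bcd / p_bc                  # maximum-uncertainty reconstruction for model ABC:BCD

loss_bits = float(np.sum(p * np.log2(p / q)))   # information lost by the decomposition
print(f"model ABC:BCD loses {loss_bits:.4f} bits relative to the full ABCD relation")
```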
View full article

Complexity and the Decomposability of Relations

Martin Zwick

A discrete multivariate relation, defined set-theoretically, is a subset of a Cartesian product of sets which specify the possible values of a number of variables. Where three or more variables are involved, the highest order relation, namely the relation between all the variables, may or may not be decomposable without loss into sets of lower order relations which involve subsets of the variables. In a completely parallel manner, the highest order relation defined information-theoretically, namely the joint probability distribution involving all the variables, may or may not be decomposable without loss into lower-order distributions involving subsets of the variables. Decomposability analysis, also called "reconstructability analysis," is the specification of the losses suffered by all possible decompositions.

The decomposability of relations, defined either set- or information- theoretically, offers a fundamental approach to the idea of "complexity" and bears on all of the themes prominent in both the new and the old "sciences of complexity." Decomposability analysis gives precise meaning to the idea of structure, i.e., to the interrelationship between a whole and its parts, where these are conceived either statically or dynamically. It specifies the structuring and distribution and the amount of information needed to describe complex systems. It is partially predictive of chaotic versus non-chaotic dynamics in discrete dynamic systems. It provides a framework for characterizing processes of integration and differentiation which are involved in the diachronics of self-organization. 
View full article

Resolution of Local Inconsistency in Identification

Doug Anderson and Martin Zwick

This paper reports an algorithm for the resolution of local inconsistency in information-theoretic identification. This problem was first pointed out by Klir as an important research area in reconstructability analysis. Local inconsistency commonly arises when an attempt is made to integrate multiple data sources, i.e., contingency tables, which have differing common margins. For example, if one has an AB table and a BC table, the B margins obtained from the two tables may disagree. If the disagreement can be assigned to sampling error, then one can arrive at a compromise B margin, adjust the original AB and BC tables to this new B margin, and then obtain the integrated ABC table by the conventional maximum uncertainty solution.

The problem becomes more complicated when the common margins themselves have common margins. The algorithm is an iterative procedure which handles this complexity by sequentially resolving increasingly higher dimensional inconsistencies. The algorithm is justified theoretically by maximum likelihood arguments. It opens up the possibility of many new applications in information theoretic modeling and forecasting. One such application, involving transportation studies in the Portland area, will be briefly discussed. 
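A minimal sketch of the simplest case described above: two hypothetical contingency tables AB and BC whose B margins disagree, a compromise B margin (here just an average, whereas the paper's algorithm is justified by maximum-likelihood arguments and iterates over higher-dimensional inconsistencies), and the conventional maximum-uncertainty integration into ABC.

```python
import numpy as np

# Hypothetical contingency tables (counts) sharing variable B but with differing B margins.
ab = np.array([[30., 10.],
               [20., 40.]])     # rows: A states, columns: B states
bc = np.array([[45., 15.],
               [25., 35.]])     # rows: B states, columns: C states

p_ab, p_bc = ab / ab.sum(), bc / bc.sum()

# Compromise B margin: a simple average here; the paper derives it from maximum-likelihood arguments.
b_compromise = 0.5 * (p_ab.sum(axis=0) + p_bc.sum(axis=1))

# Adjust each table so that its B margin equals the compromise margin.
p_ab = p_ab * (b_compromise / p_ab.sum(axis=0))[np.newaxis, :]
p_bc = p_bc * (b_compromise / p_bc.sum(axis=1))[:, np.newaxis]

# Conventional maximum-uncertainty integration: p(a,b,c) = p(a,b) * p(c | b).
p_abc = p_ab[:, :, np.newaxis] * (p_bc / p_bc.sum(axis=1, keepdims=True))[np.newaxis, :, :]
print(p_abc.sum())              # 1.0: a proper joint distribution over A, B, C
```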
View full article

Structure and Dynamics of Cellular Automata

Martin Zwick and Hui Shu

Reconstructability analysis is a method to determine whether a multivariate relation, defined set- or information-theoretically, is decomposable with or without loss (reduction in constraint) into lower ordinality relations. Set-theoretic reconstructability analysis (SRA) is used to characterize the mappings of elementary cellular automata. The degree of lossless decomposition possible for each mapping is more effective than the lambda parameter (Walker & Ashby, Langton) as a predictor of chaotic dynamics. Complete SRA yields not only the simplest lossless structure but also a vector of losses of all decomposed structures. This vector subsumes lambda, Wuensche's Z parameter, and Walker & Ashby's "fluency" and "memory" parameters within a single framework, and is a strong but still imperfect predictor of the dynamics: less decomposable mappings more commonly produce chaos. The set-theoretic constraint losses are analogous to information distances in information-theoretic reconstructability analysis (IRA). IRA captures the same information as SRA, but allows lambda, fluency, and memory to be explicitly defined. 
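Langton's lambda parameter, used above as the baseline predictor, is simply the fraction of neighborhood configurations that map to a non-quiescent state; for an elementary CA it can be read off the rule number. The paper's RA-based measures are not reproduced in this sketch.

```python
def lambda_parameter(rule_number, quiescent=0):
    """Langton's lambda for an elementary CA rule: the fraction of the eight
    neighborhood configurations that map to a non-quiescent state."""
    outputs = [(rule_number >> n) & 1 for n in range(8)]
    return sum(o != quiescent for o in outputs) / 8

for rule in (30, 90, 110, 184, 250):
    print(rule, lambda_parameter(rule))
```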
View full article

Control Uniqueness in Reconstructability Analysis

Martin Zwick

When the reconstructability analysis of a directed system yields a structure in which a generated variable appears in more than one subsystem, information from all of the subsystems can be used in modeling the relationship between generating and generated variables. The conceptualization and procedure proposed here is discussed in relation to Klir's concept of control uniqueness. 
View full article [ pdf | postscript ]

Information-Theoretic Mask Analysis of Rainfall Time-Series Data

Martin Zwick, Hui Shu, and Roy Koch

This study explores an information-theoretic/log-linear approach to multivariate time series analysis. The method is applied to daily rainfall data (4 sites, 9 years), originally quantitative but here treated as dichotomous. The analysis ascertains which lagged variables are most predictive of future rainfall and how season can be optimally defined as an auxiliary predicting parameter. Call the rainfall variables at the four sites A...D, and collectively, Z, the lagged site variables at t-1, E...H, at t-2, I...L, etc., and the seasonal parameter, S. The best model, reducing the Shannon uncertainty, u(Z), by 22%, is HGFSJK Z, where the independent variables, H through K, are given in the order of their predictive power and S is dichotomous with unequal winter and summer lengths. 
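The figure of merit in this analysis is the percentage reduction in the Shannon uncertainty u(Z) achieved by the lagged predictors. The sketch below shows that calculation on an invented single-site series with simple day-to-day persistence; the actual study used four sites, multiple lags, and a seasonal parameter.

```python
import numpy as np
from collections import Counter

def H(symbols):
    """Shannon uncertainty (bits)."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Hypothetical dichotomized daily rainfall with simple day-to-day persistence.
rng = np.random.default_rng(4)
rain = np.zeros(3000, dtype=int)
for t in range(1, 3000):
    rain[t] = rng.random() < (0.6 if rain[t - 1] else 0.2)

z, lag1, lag2 = rain[2:], rain[1:-1], rain[:-2]     # today's value and two lagged predictors
predictors = list(zip(lag1, lag2))

h_z = H(z)
h_z_given = sum((predictors.count(k) / len(z)) * H([zi for zi, ki in zip(z, predictors) if ki == k])
                for k in set(predictors))
print(f"u(Z) reduced by {100 * (h_z - h_z_given) / h_z:.1f}%")
```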
View full article [ pdf | postscript ]

Set-Theoretic Reconstructability of Elementary Cellular Automata

Martin Zwick and Hui Shu

Set-theoretic reconstructability analysis is used to characterize the structures of the mappings of elementary cellular automata. The minimum complexity structure for each ECA mapping, indexed by parameter sigma, is more effective than the lambda parameter of Langton as a predictor of chaotic dynamics. 
View full article [ pdf | postscript ]

On Matching ANN Structure to Problem Domain Structure

George G. Lendaris, Martin Zwick and Karl Mathia

To achieve reduced training time and improved generalization with artificial neural networks (ANN, or NN), it is important to use a reduced complexity NN structure. A "problem" is defined by constraints among the variables describing it. If knowledge about these constraints could be obtained a priori, this could be used to reduce the complexity of the ANN before training it. Systems theory literature contains methods for determining and representing structural aspects of constrained data (these methods are herein called GSM, general systems method). The suggestion here is to use the GSM model of the given data as a pattern for modularizing a NN prior to training it. The present work assumes the GSM model for the given problem context has been determined (represented here in the form of Boolean functions of known decompositions). This means that certain information is available about constraint among the system variables, and is used to develop a modularized NN. The modularized NN and an equivalent general NN (full interconnect, feed-forward NN) are both trained on the same data. Various predictions are offered: 1) The general NN and the modularized NN will both learn the task, but the modularized NN will learn it faster. 2) If trained on an (appropriate) subset of possible inputs, the modularized NN will perform better generalization than the general NN. 3) If trained on a non-decomposable function of the same variables, the general NN will learn the task, but the modularized NN will not. All of these predictions are verified experimentally. Future work will explore more decomposition types and more general data types. 
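One simple way to picture the prestructuring idea: if the GSM model says the output is a composition of sub-functions of disjoint input subsets, the input-to-hidden connections that cross module boundaries can be removed before training. The sketch below does this with a connectivity mask; the decomposition, layer sizes, and mask are invented for illustration, and training is not shown.

```python
import numpy as np

def forward(x, w1, w2, mask):
    """One hidden layer; `mask` zeroes the input-to-hidden connections that the
    assumed decomposition says are unnecessary."""
    h = np.tanh(x @ (w1 * mask))
    return np.tanh(h @ w2)

rng = np.random.default_rng(5)
n_in, n_hidden = 4, 6

# Suppose the decomposition says the target is f(g1(x0, x1), g2(x2, x3)).
# Give each sub-function its own bank of hidden units and cut the cross connections.
mask = np.zeros((n_in, n_hidden))
mask[0:2, 0:3] = 1.0            # module 1: inputs x0, x1 -> hidden units 0-2
mask[2:4, 3:6] = 1.0            # module 2: inputs x2, x3 -> hidden units 3-5

w1 = rng.normal(size=(n_in, n_hidden))
w2 = rng.normal(size=(n_hidden, 1))

x = rng.normal(size=(8, n_in))
print(forward(x, w1, w2, mask).shape)                           # (8, 1)
print(f"free input-to-hidden weights: {int(mask.sum())} of {n_in * n_hidden}")
```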
View full article

An Information Theoretic Framework for Exploratory Multivariate Market Segmentation Research

Jamshid C. Hosseini, Robert R. Harmon and Martin Zwick

State-of-the-art market segmentation often involves simultaneous consideration of multiple and overlapping variables. These variables are studied to assess their relationships, select a subset of variables which best represent the subgroups (segments) within a market, and determine the likelihood of membership of a given individual in a particular segment. Such information, obtained in the exploratory phase of a multivariate market segmentation study, leads to the construction of more parsimonious models. These models have less stringent data requirements while facilitating substantive evaluation to aid marketing managers in formulating more effective targeting and positioning strategies within different market segments. This paper utilizes the information-theoretic (IT) approach to address several issues in multivariate market segmentation studies. A marketing data set analyzed previously is employed to examine the suitability and usefulness of the proposed approach [12]. Some useful extensions of the IT framework and its applications are also discussed. 
View full article

Image Reconstruction from Projections

Martin Zwick and E. Zeitler

Several methods of image reconstruction from projections are treated within a unified formal framework to demonstrate their common features and highlight their particular differences. This is done analytically (ignoring computational factors) for the following techniques: the Convolution method, Algebraic reconstruction, Back-projection and the Fourier-Bessel approach. 
View full article [ pdf | postscript ]


Artificial Life / Theoretical Biology

Levels of Altruism

Martin Zwick and Jeffrey A. Fletcher

The phenomenon of altruism extends from the biological realm to the human sociocultural realm. This paper sketches a coherent outline of multiple types of altruism of progressively increasing scope that span these two realms and are grounded in an ever-expanding sense of "self." Discussion of this framework notes difficulties associated with altruisms at different levels. It links scientific ideas about the evolution of cooperation and about hierarchical order to perennial philosophical and religious concerns. It offers a conceptual background for inquiry into societal challenges that call for altruistic behavior, especially the challenge of environmental and social sustainability.

View full article


The Evolution of Altruism: Game Theory in Multilevel Selection and Inclusive Fitness

Jeffrey A. Fletcher and Martin Zwick

Although the prisoner's dilemma (PD) has been used extensively to study reciprocal altruism, here we show that the n-player prisoner's dilemma (NPD) is also central to two other prominent theories of the evolution of altruism: inclusive fitness and multilevel selection. An NPD model captures the essential factors for the evolution of altruism directly in its parameters and integrates important aspects of these two theories such as Hamilton's rule, Simpson's paradox, and the Price covariance equation. The model also suggests a simple interpretation of the Price selection decomposition and an alternative decomposition that is symmetrical and complementary to it. In some situations this alternative shows the temporal changes in within- and between-group selection more clearly than the Price equation. In addition, we provide a new perspective on strong vs. weak altruism by identifying their different underlying game structures (based on absolute fitness) and showing how their evolutionary dynamics are nevertheless similar under selection (based on relative fitness). In contrast to conventional wisdom, the model shows that both strong and weak altruism can evolve in periodically formed random groups of non-conditional strategies if groups are multigenerational. An integrative approach based on the NPD helps unify different perspectives on the evolution of altruism. 
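For reference, the Price equation mentioned above, in its standard multilevel form (textbook notation, with i indexing groups, w_i the group's fitness, and p_i the within-group frequency of altruists; not necessarily the paper's notation):

```latex
\bar{w}\,\Delta\bar{p}
  \;=\; \underbrace{\operatorname{Cov}(w_i,\,p_i)}_{\text{between-group selection}}
  \;+\; \underbrace{\operatorname{E}\!\left(w_i\,\Delta p_i\right)}_{\text{within-group change}}
```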
View full article

What's Wrong with Inclusive Fitness?

Jeffrey A. Fletcher, Martin Zwick, Michael Doebeli and David Sloan Wilson

No abstract available
View full article

Unifying the Theories of Inclusive Fitness and Reciprocal Altruism

Jeffrey A. Fletcher and Martin Zwick

Inclusive fitness and reciprocal altruism are widely thought to be distinct explanations for how altruism evolves. Here we show that they rely on the same underlying mechanism. We demonstrate this commonality by applying Hamilton's rule, normally associated with inclusive fitness, to two simple models of reciprocal altruism: one, an iterated prisoner's dilemma model with conditional behavior; the other, a mutualistic symbiosis model where two interacting species differ in conditional behaviors, fitness benefits, and costs. We employ Queller's generalization of Hamilton's rule because the traditional version of this rule does not apply when genotype and phenotype frequencies differ or when fitness effects are nonadditive, both of which are true in classic models of reciprocal altruism. Queller's equation is more general in that it applies to all situations covered by earlier versions of Hamilton's rule but also handles nonadditivity, conditional behavior, and lack of genetic similarity between altruists and recipients. Our results suggest changes to standard interpretations of Hamilton's rule that focus on kinship and indirect fitness. Despite being more than 20 years old, Queller's generalization of Hamilton's rule is not sufficiently appreciated, especially its implications for the unification of the theories of inclusive fitness and reciprocal altruism. 
View full article | View appendix

Strong Altruism Can Evolve in Randomly Formed Groups

Jeffrey A. Fletcher and Martin Zwick

Although the conditions under which altruistic behaviors evolve continue to be vigorously debated, there is general agreement that altruistic traits involving an absolute cost to altruists (strong altruism) cannot evolve when populations are structured with randomly formed groups. This conclusion implies that the evolution of such traits depends upon special environmental conditions or additional organismic capabilities that enable altruists to interact with each other more than would be expected with random grouping. Here we show, using both analytic and simulation results, that the positive assortment necessary for strong altruism to evolve does not require these additional mechanisms, but merely that randomly formed groups exist for more than one generation. Conditions favoring the selection of altruists, which are absent when random groups initially form, can naturally arise even after a single generation within groups, and even as the proportion of altruists simultaneously decreases. The gains made by altruists in a second generation within groups can more than compensate for the losses suffered in the first, and in this way altruism can ratchet up to high levels. This is true even if altruism is initially rare, migration between groups allowed, homogeneous altruist groups prohibited, population growth restricted, or kin selection precluded. Until now, random group formation models have neglected the significance of multigenerational groups, even though such groups are a central feature of classic "haystack" models of the evolution of altruism. We also explore the important role that stochasticity (effectively absent in the original infinite models) plays in the evolution of altruism. The fact that strong altruism can increase when groups are periodically and randomly formed suggests that altruism may evolve more readily and in simpler organisms than is generally appreciated.
View full article

Hamilton's Rule Applied to Reciprocal Altruism

Jeffrey A. Fletcher and Martin Zwick

Reciprocal altruism and inclusive fitness are generally considered alternative mechanisms by which cooperative, altruistic traits may evolve. Here we demonstrate that very general versions of Hamilton's inclusive fitness rule (developed by Queller) can be applied to traditional reciprocal altruism models such as the iterated Prisoner's Dilemma. In this way we show that both mechanisms rely fundamentally on the same principle: the positive assortment of helping behaviors. We discuss barriers to this unified view, including phenotype/genotype differences and non-additive fitness (or utility) functions that are typical of reciprocal altruism models. We then demonstrate how Queller's versions of Hamilton's rule remove these obstacles. 
View full article

Reconstructability Analysis Detection of Optimal Gene Order in Genetic Algorithms

Martin Zwick and Stephen Shervais

The building block hypothesis implies that genetic algorithm efficiency will be improved if sets of genes that improve fitness through epistatic interaction are near to one another on the chromosome. We demonstrate this effect with a simple problem, and show that information-theoretic reconstructability analysis can be used to decide on optimal gene ordering. 
View full article

Altruism, the Prisoner's Dilemma, and the Components of Selection

Jeffrey A. Fletcher and Martin Zwick

The n-player prisoner's dilemma (PD) is a useful model of multilevel selection for altruistic traits. It highlights the non-zero-sum interactions necessary for the evolution of altruism as well as the tension between individual and group-level selection. The parameters of the n-player PD can be directly related to the Price equation as well as to a useful alternative selection decomposition. Finally, the n-player PD emphasizes the expected equilibrium condition of mutual defection in the absence of higher levels of organization and selection.
View full article

N-Player Prisoner's Dilemma in Multiple Groups: A Model of Multilevel Selection

Jeffrey A. Fletcher and Martin Zwick

Simulations of the n-player Prisoner's Dilemma (PD) in populations consisting of multiple groups reveal that Simpson's paradox (1951) can emerge in such game-theoretic situations. In Simpson's paradox, as manifest here, the relative proportion of cooperators can decrease in each separate group, while the proportion of cooperators in the total population can nonetheless increase, at least transiently. The increase of altruistic behavior exhibited in these simulations is not based on reciprocal altruism (Trivers 1971), as there are no strategies (e.g. Tit-for-Tat) conditional on other players' past actions, nor does it depend on kin selection via inclusive fitness (Hamilton 1964), as there are no genomes. This model is very general in that it can represent both biological and social non-zero sum situations in which utility (fitness) depends upon both individual and group behavior. The two parameters of the PD in this model, which determine the gain in individual utility for defection and the dependence of utility on collective cooperation, are respectively analogous to within-group and between-group selective forces in multilevel selection theory. This work is more fully described in Fletcher and Zwick (2000).

The notion that a system (group) does better when it achieves cooperation among its parts (individuals), often against the self-interest of those parts, goes beyond just biological systems undergoing natural selection. It is applicable to hierarchical systems across a variety of fields. The non-zero sum nature of aggregation is general and optimization by subsystems often results in sub-optimization at a higher level. The PD is often used to model such non-zero sum situations. Like Simpson's paradox, the PD involves an anomaly of composition: individually-rational strategies, when aggregated, give a deficient collective outcome.

As Sober and Wilson (1998) have demonstrated, Simpson's paradox (even if not always identified as such) is important in understanding multilevel selection. These authors show (pp. 18-26) that this paradox can be derived from simple fitness functions for altruists and non-altruists in two populations. These functions amount to an n-player PD (see Appendix A), although Sober and Wilson do not call attention to this fact. In this paper and in Fletcher and Zwick (2000), we make the connection between the PD and Simpson's paradox explicit. Our main finding is that Simpson's paradox emerges transiently, but for a wide range of PD parameter values, when a minimal group structure is imposed on an n-player PD. This result is produced in a model which involves an implicit competition between two groups and a simple n-player PD in each. The model is based on only two parameters which correlate with the within-group and between-group selection components in multilevel selection theory. 
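A one-generation numerical sketch of the Simpson's paradox effect described above, with invented parameters (two groups of 100 containing 20 and 80 altruists, baseline fitness 10, shared benefit 5, cost 1); the full model in the paper runs many generations with regrouping.

```python
base, b, c = 10.0, 5.0, 1.0        # baseline fitness, shared benefit, altruist's cost (invented)

def offspring(a, n):
    """One generation of an n-player PD inside a group of size n containing a altruists.
    Each altruist pays c and distributes b among the other n - 1 members."""
    w_alt = base - c + b * (a - 1) / (n - 1)
    w_non = base + b * a / (n - 1)
    return a * w_alt, (n - a) * w_non              # expected offspring of altruists, non-altruists

groups = [(20, 100), (80, 100)]                    # (altruists, group size)
share_before = sum(a for a, n in groups) / sum(n for _, n in groups)

after = [offspring(a, n) for a, n in groups]
for (a, n), (alt, non) in zip(groups, after):
    print(f"group: altruist share {a / n:.3f} -> {alt / (alt + non):.3f}")   # falls in each group

total_alt = sum(alt for alt, _ in after)
total = sum(alt + non for alt, non in after)
print(f"whole population: {share_before:.3f} -> {total_alt / total:.3f}")    # yet rises overall
```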
View full article [ pdf | postscript ]

Simpson's Paradox Can Emerge from the N-Player Prisoner's Dilemma: Implications for the Evolution of Altruistic Behavior

Jeffrey A. Fletcher and Martin Zwick

Simulations of the n-player Prisoner's Dilemma in multiple populations reveal that Simpson's paradox can emerge in such game-theoretic situations. The relative proportion of cooperators can decrease in each separate sub-population, while the proportion of cooperators in the total population can nonetheless increase, at least transiently. Factors that determine the longevity of this effect are under investigation. The increase of altruistic behavior exhibited in these simulations is not based on reciprocal altruism, as there are no strategies conditional on other players' past actions, nor does it depend on kin selection via inclusive fitness, as there are no genes. This model is very general in that it can represent both biological and social non-zero sum situations in which utility (fitness) depends upon conditions at different hierarchical levels. The two parameters of the prisoner's dilemma in this model, which determine the gain in individual utility for defection and the dependence of utility on collective cooperation, are respectively analogous to within-group and between-group selective forces in multilevel selection theory. 
View full article [ pdf | postscript ]

Effect of Environmental Structure on Evolutionary Adaptation

Jeffrey A. Fletcher, Mark A. Bedau and Martin Zwick

This paper investigates how environmental structure, given the innate properties of a population, affects the degree to which this population can adapt to the environment. The model we explore involves simple agents in a 2-d world which can sense a local food distribution and, as specified by their genomes, move to a new location and ingest the food there. Adaptation in this model consists of improving the genomic sensorimotor mapping so as to maximally exploit the environmental resources. We vary environmental structure to see its specific effect on adaptive success. In our investigation, two properties of environmental structure, conditioned by the sensorimotor capacities of the agents, have emerged as significant factors in determining adaptive success: (1) the information content of the environment which quantifies the diversity of conditions sensed, and (2) the expected utility for optimal action. These correspond to the syntactic and pragmatic aspects of environmental information, respectively. We find that the ratio of expected utility to information content predicts adaptive success measured by population gain and information content alone predicts the fraction of ideal utility achieved. These quantitative methods and specific conclusions should aid in understanding the effects of environmental structure on evolutionary adaptation in a wide range of evolving systems, both artificial and natural. 
View full article [ pdf | postscript ]

Structure and Dynamics of Cellular Automata

Martin Zwick and Hui Shu

Reconstructability analysis is a method to determine whether a multivariate relation, defined set- or information-theoretically, is decomposable with or without loss (reduction in constraint) into lower ordinality relations. Set-theoretic reconstructability analysis (SRA) is used to characterize the mappings of elementary cellular automata. The degree of lossless decomposition possible for each mapping is more effective than the lambda parameter (Walker & Ashby, Langton) as a predictor of chaotic dynamics. Complete SRA yields not only the simplest lossless structure but also a vector of losses of all decomposed structures. This vector subsumes lambda, Wuensche's Z parameter, and Walker & Ashby's "fluency" and "memory" parameters within a single framework, and is a strong but still imperfect predictor of the dynamics: less decomposable mappings more commonly produce chaos. The set-theoretic constraint losses are analogous to information distances in information-theoretic reconstructability analysis (IRA). IRA captures the same information as SRA, but allows lambda, fluency, and memory to be explicitly defined. 
View full article

Dependence of Adaptability on Environmental Structure in a Simple Evolutionary Model

Jeff Fletcher, Martin Zwick and Mark Bedau.

This paper concerns the relationship between the detectable and useful structure in an environment and the degree to which a population can adapt to that environment. We explore the hypothesis that adaptability will depend unimodally on environmental variety, and we measure this component of environmental structure using the information-theoretic uncertainty (Shannon entropy) of detectable environmental conditions. We define adaptability as the degree to which a certain kind of population successfully adapts to a certain kind of environment, and we measure adaptability by comparing a population's size to the size of a non-adapting, but otherwise comparable, population in the same environment. We study the relationship between adaptability and environmental structure in an evolving artificial population of sensorimotor agents that live, reproduce, and die in a variety of environments. We find that adaptability does not show a unimodal dependence on environmental variety alone, although there is justification for preserving our unimodal hypothesis if we consider other aspects of environmental structure. In particular, adaptability depends not just on how much structural information is detectable in the environment, but also on how unambiguous and valuable this information is, i.e., whether the information accurately signals a difference that makes a difference. How best to measure and integrate these other components of environmental structure remains unresolved. 
View full article [ pdf with figures at the end | postscript without figures | zip archive of postscript with figures ]

Global Optimization Studies on the 1-D Phase Problem

Martin Zwick, Byrne Lovell and Jim Marsh

The Genetic Algorithm (GA) and Simulated Annealing (SA), two techniques for global optimization, were applied to a reduced (simplified) form of the phase problem (RPP) in computational crystallography. Results were compared with those of "enhanced pair flipping" (EPF), a more elaborate problem-specific algorithm incorporating local and global searches. Not surprisingly, EPF did better than the GA or SA approaches, but the existence of GA and SA techniques more advanced than those used in this study suggests that these techniques still hold promise for phase problem applications. The RPP is, furthermore, an excellent test problem for such global optimization methods.
View full article [ pdf | postscript ]

Set-Theoretic Reconstructability of Elementary Cellular Automata

Martin Zwick and Hui Shu

Set-theoretic reconstructability analysis is used to characterize the structures of the mappings of elementary cellular automata. The minimum complexity structure for each ECA mapping, indexed by parameter sigma, is more effective than the lambda parameter of Langton as a predictor of chaotic dynamics. 
View full article [ pdf | postscript ]

Variance and Uncertainty Measures of Population Diversity Dynamics

Mark Bedau, Martin Zwick and Alan Bahm

We define variance and uncertainty measures of population diversity. Both measures have precise decompositions that we can exploit in analysis of evolutionary dynamics. We discuss how these measures are related and how they can be observed in artificial and natural evolving systems. 
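As a rough illustration (not the paper's decompositions), the sketch below computes an uncertainty (Shannon entropy) measure and a variance measure of diversity per locus for an invented population of genomes, summed over loci.

```python
import numpy as np

def diversity_measures(pop):
    """pop: integer array (individuals x loci) of allele values.
    Returns (uncertainty, variance) diversity, each summed over loci."""
    uncertainty = 0.0
    for locus in pop.T:
        _, counts = np.unique(locus, return_counts=True)
        freqs = counts / counts.sum()
        uncertainty += float(-(freqs * np.log2(freqs)).sum())   # Shannon entropy of allele frequencies
    return uncertainty, float(pop.var(axis=0).sum())            # per-locus allele variance, summed

rng = np.random.default_rng(6)
mixed_pop = rng.integers(0, 4, size=(200, 10))                  # alleles spread evenly
converged_pop = np.where(rng.random((200, 10)) < 0.9, 2,        # mostly a single allele
                         rng.integers(0, 4, size=(200, 10)))

print(diversity_measures(mixed_pop))        # high diversity by both measures
print(diversity_measures(converged_pop))    # low diversity by both measures
```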
View full article [ pdf | postscript ]

Diversity Dynamics in Static Resource Models

Mark Bedau, Matt Giger, Martin Zwick

We define three information-theoretic methods for measuring genetic diversity and compare the dynamics for these measures in simple evolutionary models consisting of a population of agents living, reproducing, and dying while competing for resources. The models are "static resource models," i.e., the distribution of resources is constant for all time. Simulation of these models shows that (i) focusing the diversity measures on used alleles and loci especially highlights the adaptive dynamics of diversity, and (ii) even though resources are static, the evolving interactions among the agents makes the effective environment for evolution dynamic. 
View full article [ pdf | postscript ]

Dynamics of Diversity in an Evolving Population

Mark Bedau, F. Ronneburg and Martin Zwick

We propose a family of measures of population diversity in evolving systems, and observe the dynamics of these quantities in the context of a particular model: a two-dimensional world with organisms competing for resources and evolving by changes in their movement strategy. We measure the dependence of diversity upon two model parameters: selection and mutation rate.
View full article

Application of the Genetic Algorithm to a Simplified Form of the Phase Problem

B. Lovell and Martin Zwick

The Genetic Algorithm, a technique for global optimization which simulates evolutionary adaptation, is applied to a simplified form of the "phase problem" in theoretical crystallography. Results are compared with those of a problem-specific algorithm. 
View full article

Improving Crystallographic Macromolecular Images: The Real-Space Approach

A.D. Podjarny, T. N. Bhat and Martin Zwick

Macromolecular crystallography is a unique tool for imaging the structures of proteins and nucleic acids. Images are obtained from the Fourier transform of the diffraction pattern of the crystal by use of X-ray, neutron, and/or electron scattering. When X-ray and neutron scattering are used, only diffraction amplitudes are experimentally measured, and phases have to be obtained. Multiple isomorphous replacement (MIR) has been the technique of choice for solving this phase problem in the determination of most macromolecular structures. Unfortunately, the method is extremely time consuming, especially when compared with the solution techniques available for small molecules; moreover, structure solution by MIR, even after many years of work, is hardly guaranteed. These drawbacks have stimulated efforts to enhance MIR as a phasing technique. The methods discussed in this paper (without exhaustive coverage, owing to space limitations) have so far been used to refine and/or to extend MIR phases, and also to open up the possibility of ab initio phase determination.

Following the early fundamental work of Karle & Hauptman (34, 35) and Sayre (60), reciprocal-space direct methods were applied to solve the structure of the majority of small molecules (via widely used packages, e.g. MULTAN and SHELX). These methods are used to derive phases statistically from the atomic character of the density. The extension of these methods to macromolecular crystallography is beyond the scope of this review.

Macromolecules present a more difficult problem. The diffraction data are rarely obtained at high enough resolution for the application of the atomicity constraint. Also, the accuracy of the phase predictions by reciprocal-space direct methods decreases with the size of the molecule. However, there are other a priori physical constraints applicable to macromolecular density functions, e.g., continuity and solvent flatness. These constraints are more readily expressed in real space than in reciprocal space. Procedures that exploit such physical constraints in real space are commonly known as density modification (DM) methods. These techniques do not merely consist of real-space imposition of a priori physical constraints, but also include reciprocal-space steps of comparable importance. These mixed real- and reciprocal-space DM algorithms are the main subject of this review. 
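
Schematically, and only as an assumption-laden toy (a one-dimensional cell, solvent flattening as the sole real-space constraint), a density-modification cycle alternates between the two spaces as follows:

    import numpy as np

    def density_modification(f_obs, phases0, solvent_mask, n_cycles=50):
        # f_obs        : observed structure-factor amplitudes (1-D toy cell).
        # phases0      : starting (e.g., MIR) phases, in radians.
        # solvent_mask : boolean array, True where the map should be flat solvent.
        phases = phases0.copy()
        for _ in range(n_cycles):
            # Reciprocal -> real space: build a map from current amplitudes and phases.
            rho = np.fft.ifft(f_obs * np.exp(1j * phases)).real
            # Real-space constraint: flatten the density in the solvent region.
            rho[solvent_mask] = rho[solvent_mask].mean()
            # Real -> reciprocal space: keep the new phases, reimpose observed amplitudes.
            phases = np.angle(np.fft.fft(rho))
        return phases
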
View full article


Systems Theory and Philosophy

Systems Theory and the Metaphysics of Composition

Ideas from systems theory – recursive unity and emergent attributes – are applied to the metaphysical and meta-metaphysical debates about the ontological status of composites.  These ideas suggest the rejection of both extremes of universalism and nihilism, favoring instead the intermediate position that some composites exist in a non-trivial sense – those having unity and emergent novelty – while others do not.  Systems theory is egalitarian: it posits that what exist are systems, equal in their ontological status.  Some systems are fundamental, but what exists is not merely the fundamental, and the fundamental is not merely the foundational.  The status of composites raises non-trivial issues, but mereology – and metaphysics in general – would benefit from substantive interaction with scientifically interesting questions.
View full article

Is the Materialist Neo-Darwinian Conception of Nature False?
Martin Zwick

This paper assesses the main argument of Thomas Nagel's recent book, Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature is Almost Certainly False. The paper agrees with Nagel that, as an approach to the relation between mind and matter and the mystery of subjective experience, neutral monism is more likely to be true than either materialism or idealism. It disagrees with Nagel by favoring a version of neutral monism based on emergence rather than on a reductive pan-psychism. However, the paper invokes a reductive view with respect to information (as opposed to psyche), and posits a hierarchy of types of information that span the domains of matter, life, and mind. Subjective experience is emergent, but also continuous with informational phenomena at lower levels.
View full article

Freedom as a Natural Phenomenon
Martin Zwick

The phenomenon of “freedom” in the natural world – and indirectly the question of free will – is explored using systems-theoretic concepts that link the idea of freedom to ideas about autonomy and agency.  The focus is on living systems in general, and on living systems that have cognitive subsystems more specifically.  After touching on the relevance to freedom of determinism vs. randomness, the paper examines four types of freedom: (i) independence from fixed materiality, (ii) activeness that is unblocked and holistic, (iii) internal rather than external determination, and (iv) regulation by an informational subsystem.  These types of freedom are not all-or-nothing but matters of degree.
View full article

Complexity Theory and Political Change: Talcott Parsons Occupies Wall Street
Martin Zwick

Complexity theory can assist our understanding of social systems and social phenomena. This paper illustrates this assertion by linking Talcott Parsons' model of societal structure to the Occupy Wall Street movement. Parsons' model is used to organize ideas about the underlying causes of the recession that currently afflicts the US. While being too abstract to depict the immediate factors that precipitated this crisis, the model is employed to articulate the argument that vulnerability to this type of event results from flaws in societal structure. This implies that such crises can be avoided only if, in Parsons' terms, structural change occurs in the relations between polity, economy, community, and culture. The Occupy movement has called attention to the need for such fundamental change.
View full article

Levels of Altruism

Martin Zwick and Jeffrey A. Fletcher

The phenomenon of altruism extends from the biological realm to the human sociocultural realm. This paper sketches a coherent outline of multiple types of altruism of progressively increasing scope that span these two realms and are grounded in an ever-expanding sense of "self." Discussion of this framework notes difficulties associated with altruisms at different levels. It links scientific ideas about the evolution of cooperation and about hierarchical order to perennial philosophical and religious concerns. It offers a conceptual background for inquiry into societal challenges that call for altruistic behavior, especially the challenge of environmental and social sustainability.

View full article


Personal Knowledge and the Human Sciences

This paper conceptualizes spiritual disciplines as sciences.  It uses this conceptualization to probe into the similarities and differences between modern science and religious tradition, and into the cultural significance and possible future impact of the "new religions." The paper draws upon the ideas of Michael Polanyi as a possible bridge between science and religion, and proposes that these ideas are relevant not only to the major Western religions, but to Eastern and non-mainstream Western religions as well. Imagining science as a spiritual path, or gnosis, would challenge an exclusivist understanding of scientific knowledge, and suggest the relevance of such knowledge to wisdom. Interpreting the spiritual disciplines as inner sciences might help strengthen and purify religious practice, and lead also to a critique of science and a new conception of its possibilities. These unconventional perspectives provide a novel basis for a dialogue between science and religion. However, since there are many differences between "inner" and "outer" research, the metaphor of spiritual disciplines as sciences is limited; if taken too literally, it will obscure more than it illuminates. 
View full article

Symbolic Structures as Systems: On the Near Isomorphism of Two Religious Symbols

Martin Zwick

Many symbolic structures used in religious and philosophical traditions are composed of “elements” and relations between elements. Similarities between such structures can be described using the systems theoretic idea of “isomorphism.” This paper demonstrates the existence of a near isomorphism between two symbolic structures: the Diagram of the Supreme Pole of Song Neo-Confucianism and the Kabbalistic Tree of medieval Jewish mysticism. These similarities are remarkable in the light of the many differences that exist between Chinese and Judaic thought, which also manifest in the two symbols. Intercultural influence might account for the similarities, but there is no historical evidence for such influence. An alternative explanation would attribute the similarities to the ubiquitousness of religious-philosophical ideas about hierarchy, polarity, and macrocosm-microcosm parallelism, but this does not adequately account for the similar overall structure of the symbols. The question of how to understand these similarities remains open.

This article is a longer version of the paper immediately below, originally published in Religion East and West, and also includes some systems-theoretic ideas. 
View full article

The Diagram of the Supreme Pole and the Kabbalistic Tree: On the Similarity of Two Symbolic Structures

Martin Zwick

This paper discusses similarities of both form and meaning between two symbolic structures: the Diagram of the Supreme Pole of Song Neo-Confucianism and the Kabbalistic Tree of medieval Jewish mysticism. These similarities are remarkable in the light of the many differences that exist between Chinese and Judaic thought, which also manifest in the two symbols. Intercultural influence might account for the similarities, but there is no historical evidence for such influence. An alternative explanation would attribute the similarities to the ubiquitousness of religious-philosophical ideas about hierarchy, polarity, and macrocosm-microcosm parallelism, but this does not adequately account for the similar overall structure of the symbols. The question of how to understand these similarities remains open. 
View full article

Holism and Human History

Martin Zwick

This paper uses a systems-theoretic model to structure an account of human history. According to the model, a process, after its beginning and early development, often reaches a critical stage where it encounters some limitation. If the limitation is overcome, development does not face a comparable challenge until a second critical juncture is reached, where obstacles to further advance are more severe. At the first juncture, continued development requires some complexity-managing innovation; at the second, it needs some event of systemic integration in which the old organizing principle of the process is replaced by a new principle. Overcoming the first blockage sometimes occurs via a secondary process that augments and blends with the primary process, and is subject in turn to its own developmental difficulties.

Applied to history, the model joins together the materialism of Marx and the cultural emphasis of Toynbee and Jaspers. It describes human history as a triad of developmental processes which encounter points of difficulty. The ‘primary’ process began with the emergence of the human species, continued with the development of agriculture, and reached its first critical juncture after the rise of the great urban civilizations. Crises of disorder and complexity faced by these civilizations were eased by the religions and philosophies that emerged in the Axial period. These Axial traditions became the cultural cores of major world civilizations, their development constituting a ‘secondary’ process that merged with and enriched the first. This secondary process also eventually stalled, but in the West, the impasse was overcome by a ‘tertiary’ process: the emergence of humanism and secularism and – quintessentially – the development of science and technology. This third process blended with the first two in societal and religious change that ushered in what we call ‘modernity.’ Today, this third current of development falters, and inter-civilizational tension also afflicts the secondary stream. Much more seriously, the primary process has reached its second and critically hazardous juncture – the current global environmental-ecological crisis. System integration via a new organizing principle is needed on a planetary scale. 
View full article (html) (pdf) View slides of a talk on this article (pdf)

A Conversation on Theodicy

Martin Zwick

The author engages himself in a conversation on theodicy: the conundrum of how it can be that (a) Evil exists, and yet (b) God is beneficent, and (c) God is omnipotent. 
View full article [ pdf | html ]

Spinoza and Gödel: Causa Sui and Undecidable Truth

Martin Zwick

Spinoza distinguishes between causation that is external - as in A's causing B where A is external to B - and causation that is internal where C causes itself (causa sui) without any involvement of anything external to C. External causation is easy to understand, but self-causation is not. This note explores an approach to self-causation based upon Gödelian undecidability and draws upon ideas from an earlier study of Gödel's proof and the quantum measurement problem. 
View full article

Systems Metaphysics: A Bridge from Science to Religion

Martin Zwick

'Systems theory' is familiar to many as the scientific enterprise that includes the study of chaos, networks, and complex adaptive systems. It is less widely appreciated that the systems research program offers a world view that transcends the individual scientific disciplines. We do not live, as some argue, in a post-metaphysical age, but rather at a time when a new metaphysics is being constructed. This metaphysics is scientific and derives from graph theory, information theory, non-linear dynamics, decision theory, game theory, generalized evolution, and other transdisciplinary theories. These 'systems' theories focus on form and process, independent of materiality; they are thus relevant to both the natural and social sciences and even to the humanities and the arts. Concerned more with the complex than the very small or very large, they constitute a metaphysics that is centered in biology, and thus near rather than far from the human scale.

Systems metaphysics forges a unity of science based on what is general instead of what is fundamental; it is thus genuinely about everything. It counters the nihilism of narrow interpretations of science by affirming the link between fact and value and the reality of purpose and freedom in the natural world. It offers scientific knowledge that is individually useful as a source of insight, not merely societally useful as a source of technology. With the new world view that it brings, systems metaphysics contributes to the recovery of cultural coherence. It builds a philosophical bridge between science and religion that is informed by our understanding of living systems. It suggests a secular theodicy in which imperfection is lawful yet perfecting is always possible, and uses this perspective to analyze religions as systems. It provides scientific conceptions of traditional religious ideas and common ground for dialog between the world religions. 
View full article [ pdf | html ]

A Review of Systems: New Paradigms for the Human Sciences

Martin Zwick

This essay is a selective review of Systems: New Paradigms for the Human Sciences, edited by Gabriel Altmann and Walter A. Koch (Berlin: Walter de Gruyter, 1998). It is selective because it is impossible to engage such a varied collection of systems-theoretic essays in a review of reasonable length. To invoke a relevant dialectical idea: the characteristic strength of any system is often also its characteristic weakness. One strength and weakness of the systems field is its great diversity, and this diversity is reflected in this volume by the range of subjects addressed in its 27 articles. I will not attempt what the editors themselves have declined to undertake, namely an integrating overview, nor will I offer brief remarks on many articles. Instead, I want to comment in detail on just three articles which bear on my own interests. I do not mean to suggest that these articles are more valuable or central to systems theory than the others.

After discussing the three articles, I will conclude by adding a few general remarks and by listing the authors and titles of the essays in the book, so that readers might be alerted to items of potential interest. From my study of the three articles I discuss and from my skimming of several other articles, I strongly recommend this book to systems researchers, especially researchers interested in the human sciences. 
View full article [ pdf | postscript ]

Understanding Imperfection

Martin Zwick

Because of their inherent abstraction, systems ideas are not themselves sufficient for gaining scientific knowledge or solving practical problems, but they can be a source of insights into the universality of imperfection, insights which can contribute to a new scientific world view. Systems theory offers a metaphysics, or more precisely an ontology, of imperfection. Through it, we can heed Spinoza's injunction, "Not to lament, not to curse, but to understand." 
View full article [ pdf | postscript ]

An Informal Review of The Crisis of Global Capitalism: A Letter to George Soros

Martin Zwick

Dear Mr. Soros,

I would like to bring to your attention some systems-theoretic ideas which are relevant to the point of view you present in The Crisis of Global Capitalism. From my perspective, your book, especially Part I: Conceptual Framework, is in both orientation and content an essay in systems theory. My connection to what you have written is still more direct. I'm working on a book which integrates systems ideas and theories around the theme of "imperfection." This is close to the centrality of "fallibility" in your framework, since "imperfection" is to ontology what "fallibility" is to epistemology.

I want to do three things in this letter: A. discuss the relationship between a focus on "fallibility" and one on "imperfection" (and argue for the latter); B. share some thoughts about the "open society" idea; and C. take up additional technical matters (beyond those covered in A and B). 
View full article

Complexity Theory and Systems Theory

Martin Zwick

I use the label, "complexity theory," for the research program which studies nonlinear dynamics, "complexity," "complex adaptive systems," "artificial life," etc., and whose intellectual Mecca in the United States is the Santa Fe Institute. I use the label, "systems theory," for the research program which crystallized after World War II under the names of "general systems theory" and "cybernetics," and which subsumed such postwar scientific developments as information theory, game theory, feedback control theory, and the beginnings of computer science and artificial intelligence. The central thesis of this paper is that complexity theory is a continuation and revitalization of systems theory. I demonstrate the validity of this assertion in two steps. First, I describe the essential properties of the research program of systems theory, so that the underlying unity in the diverse manifestations of this program is evident. Second, I show that complexity theory shares in these properties, and thus continues this research program. (While complexity theory is systems theory's predominant contemporary manifestation, the "classical" system tradition, more strongly and explicitly rooted in the aspirations and literatures of general systems theory and cybernetics, also continues.) To many people this assertion may be obvious, but from my discussions with researchers in systems theory or complexity theory and from my preliminary encounters with relevant work in the philosophy and sociology of science, this proposition is far from being even widely recognized, not to speak of being generally accepted.

The paper makes extensive use of a characterization of systems theory made by Mario Bunge which applies equally well to complexity theory. Bunge described systems theory as an attempt to construct an "exact and scientific metaphysics." The attempt to construct such a metaphysics represents a fundamental rejection of the possibility and desirability of a sharp demarcation separating science and metaphysics. At the very least, metaphysics can serve as a heuristic for science, but systems theory holds out a more radical promise: the recovery of metaphysics via its scientific reconstitution.

Such a metaphysics would be less abstract than mathematics but more abstract than the theories of specific scientific disciplines. It would be "stuff-free" (materiality-independent) and only "vicariously" testable. It would represent an attempt to develop a "theory of everything" on an altogether different basis than the way such theories are conceived of in theoretical physics. A systems theoretic TOE, were one available, would genuinely unify the sciences, and not merely offer the illusory unity of a cascade of promised inter-theoretic reductions all the way down to elementary particle physics. Of course, a systems theoretic TOE is not currently available, but ample materials for constructing one are already at hand.

Towards an Ontology of Problems

Martin Zwick

Systems theory offers a language in which one might formulate a metaphysics--or more specifically an ontology--of problems. This proposal is based upon a conception of systems theory shared by von Bertalanffy, Wiener, Boulding, Rapoport, Ashby, Klir, and others, and expressed succinctly by Bunge, who considered game theory, information theory, feedback control theory, and the like to be attempts to construct an "exact and scientific metaphysics."

Our prevailing conceptions of "problems" are concretized yet also fragmented and in fact dissolved by the standard reductionist model of science, which cannot provide a general framework for analysis. The idea of a "systems theory," however, suggests the possibility of an abstract and coherent account of the origin and essence of problems. Such an account would constitute a secular theodicy.

This claim is illustrated by examples from game theory, information processing, nonlinear dynamics, optimization, and other areas. It is not that systems theory requires as a matter of deductive necessity that problems exist, but it does reveal the universal and lawful character of many problems which do arise. 
View full article [ pdf | postscript ]

An Information Theory Approach to Measuring Industrial Diversification

Mohsen Attaran and Martin Zwick

The proposed entropy measure provides a flexible and analytically powerful measure of industrial diversity. The rectangular distribution (uniform distribution) of economic activities used as a comparative norm with the entropy measure is objective and conceptually consistent with the intuitive notion of diversification as the absence of concentration. Furthermore, the entropy measure can be decomposed to allow for identification of some important inter-industry diversification patterns which may not be at all apparent merely from examining the single-unit total entropy measure of diversity.

This technique was useful not only in providing an overall index of diversity over time for the US, but also, through its decomposition properties, in analyzing the nature of such dispersal. The decomposition properties have permitted the analysis of economic concentration and structural changes, both within and between groups of sectors, which appeared to offer a useful extension of regional analysis.
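
A small numeric sketch of the entropy index and its between-group/within-group decomposition (the sector shares and grouping are invented for illustration):

    import numpy as np

    def entropy(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Hypothetical employment shares for six sectors, grouped into two broad industries.
    shares = np.array([0.30, 0.20, 0.10, 0.15, 0.15, 0.10])
    groups = [np.array([0, 1, 2]), np.array([3, 4, 5])]

    total_H = entropy(shares)
    group_shares = np.array([shares[g].sum() for g in groups])
    between_H = entropy(group_shares)
    within_H = sum(gs * entropy(shares[g] / gs) for g, gs in zip(groups, group_shares))
    print(total_H, between_H + within_H)   # the decomposition reproduces the total
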
View full article [pdf]

Entropy and Other Measures of Industrial Diversification

Mohsen Attaran and Martin Zwick

This study demonstrates that entropy is a useful measure for comparing industrial diversity either among regions or for a particular region over time. This measure allows not only examination of changes in diversity over time, but also, through its decomposition properties, an analysis of the nature of such changes.
For the purpose of illustration, employment diversity indices were computed using the entropy method for the state of Oregon from 1972 to 1984. The entropy measure was disaggregated into its between-set and within-set elements to express the extent and pattern of dispersal between and within different groups of industries.
View full article [pdf]

The Effect of Industrial Diversification on Employment and Income: A Case Study

Mohsen Attaran and Martin Zwick

One of the major outcomes of the depression of the thirties was a drive toward diversification of industrial activity in many areas of this country. Diversification became an important policy objective because of the belief that specialization was a dangerous liability that could lead to instability of income and periodic high unemployment. This study undertakes an investigation of the various aspects of economic diversity to determine whether support can be found for some of the generally held assumptions regarding its value. These assumptions are tested with data from the counties of Oregon for the ten-year period, 1972-1981. Such an investigation should provide insight into the patterns of growth and sources of cyclical instability of the units (counties) during the period of study. This, in turn, may offer both a conceptual and an historical perspective for decision-makers responsible for formulating policies for economic recovery.
View full article [pdf]

Incompleteness, Negation, Hazard: On The Precariousness Of Systems

Martin Zwick

An account is offered of the dialectical tensions which afflict systems of widely differing type, "contradictions" which cannot be fully or permanently resolved, and from which follow the lawfulness of both hazard and opportunity.

INTRODUCTION

Mario Bunge (1973) has provided a deep and succinct characterization of systems and cybernetics theories, e.g., information theory, game theory, automata theory, and the like, as attempts to construct an exact and scientific metaphysics. These theories can be considered "metaphysical" in their generality, "exact" in being mathematical, and "scientific" in having a close connection with specific theories in one or more scientific disciplines. This view is in fundamental agreement with the goals of general systems theory and/or cybernetics as expressed by Boulding, von Bertalanffy, Wiener, Ashby, and others.

This paper develops the outlines of a metaphysics of "problems," an account of the nature and origin of those difficulties which afflict many different kinds of systems, difficulties which reflect contradictions* intrinsic to being and to becoming which can never be completely resolved. Such difficulties are lawful and ubiquitous. This analysis serves as a necessary corrective to the tendency of systems thought to assume or to overemphasize the stability and internal harmony of systems and to neglect dysfunction, conflict, and change. What is outlined here is an entity-based metaphysics which takes the existence of entities to be intrinsically precarious.

This essay is a synthetic effort, and constraints of space make it impossible to "unpack" the technical and philosophical allusions of the narrative. An expanded version of this paper which details specific connections to the sources listed in the bibliography and to other works in the systems literature will be published elsewhere. The present text seeks to demonstrate that a coherent ontology is implicit in systems-theoretic ideas by casting these ideas into the form of a metaphysical discourse.

* The word "contradiction" is used in its dialectical and not logical meaning, i.e., to denote the coexistence of opposing forces, needs, tendencies, etc. No distinction is made in this paper between "concrete" and "conceptual" systems. Emphasis on the former is intended, and terms which properly belong only to the domain of the latter are used metaphorically.
View full article [ pdf | postscript ]

Information, Constraint and Meaning

Martin Zwick

Despite the familiar and correct disclaimer that information theory (Shannon and Weaver, 1949) does not concern the semantic level of communication, the technical definition of information nonetheless bears directly and importantly on the subject of meaning. Meaning, at least in one sense of the word, is the recognition of constraint and is based on isomorphism of structure. Constraint reduces information, yet information is also the very substrate of meaning. Meaning is thus the union of the informative and the intelligible (Moles, 1958), the reconciliation of this dialectical opposition being achievable in several different ways. 
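
One way to make the first claim concrete, as an illustration rather than the paper's formalism, is to measure constraint as the drop from maximum entropy:

    import numpy as np

    def entropy(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Four possible symbols: an unconstrained (uniform) source vs. a constrained one.
    uniform = np.array([0.25, 0.25, 0.25, 0.25])
    constrained = np.array([0.70, 0.15, 0.10, 0.05])

    h_max = entropy(uniform)         # 2.0 bits
    h = entropy(constrained)         # about 1.32 bits
    print("information:", h, "constraint:", h_max - h)
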
View full article [ pdf | postscript ]

Entropy Measures in Input-Output Analysis

Martin Zwick and Abbas Heiat

Applications of Shannon's entropy measure to the matrices of technical and interdependence coefficients, to the final demand vector, and to other aspects of input-output tables are proposed. These entropy measures serve as indices of different types of economic diversity. The relevance of such indices for economic planning and for analyses of economic structural complexity and development is discussed.
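
For illustration only (the figures, and the particular indices chosen, are invented here rather than taken from the paper), two such entropy measures for a hypothetical three-sector table:

    import numpy as np

    def entropy(p):
        p = np.asarray(p, dtype=float)
        p = p / p.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Hypothetical three-sector economy.
    final_demand = np.array([40.0, 35.0, 25.0])
    A = np.array([[0.10, 0.30, 0.20],    # technical coefficients a_ij: input from
                  [0.25, 0.05, 0.30],    # sector i per unit of output of sector j
                  [0.15, 0.20, 0.10]])

    print("final-demand diversity:", entropy(final_demand))
    print("mean input-mix diversity:", np.mean([entropy(col) for col in A.T]))
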
View full article [pdf]

Some Analogies of Hierarchical Order in Biology and Linguistics

Martin Zwick

The ubiquity of hierarchical order is obvious, and the obvious is hard to explain, but a number of workers [1] have suggested the possibility of constructing a theory (or cluster of theories), rooted in such disciplines as thermodynamics, information theory, topology, and logic, which might reveal the underlying unity of a wide variety of branching and multi-level systems. It is the purpose of this paper to contribute to both the empirical and theoretical aspects of this discussion, by examining levels of structure and function in molecular biology and linguistics, and by developing, from parallelisms between these two areas, a hierarchical model of possibly greater generality. 
View full article [ pdf | postscript ]

Dialectics and Catastrophe

Martin Zwick

The Catastrophe Theory of René Thom and E. C. Zeeman suggests a mathematical interpretation of certain aspects of Hegelian and Marxist dialectics. Specifically, the three ‘classical’ dialectical principles, (1) the transformation of quantity into quality, (2) the unity and struggle of opposites, and (3) the negation of negation, can be modeled with the seven ‘elementary catastrophes’ given by Thom, especially the catastrophes known as the ‘cusp’ and the ‘butterfly’. Far from being empty metaphysics or scholasticism, as critics have argued, the dialectical principles embody genuine insights into a class of phenomena, insights which can now be expressed within a precise mathematical formalism. This fact does not, however, support the claim that these principles, possibly modified or supplemented, constitute the laws of motion for human thought and for natural and social processes - or even just the last of these. 
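
The cusp referred to here can be written down directly. The sketch below, with illustrative parameter values not taken from the paper, follows a local minimum of the standard cusp potential V(x) = x^4/4 + a*x^2/2 + b*x while the control b is slowly increased; the occupied branch eventually disappears and the state jumps discontinuously, a 'quantity into quality' transition (reversing the sweep would show hysteresis).

    import numpy as np

    def settle(a, b, x0, steps=5000, rate=0.01):
        # Follow a local minimum of V(x) = x**4/4 + a*x**2/2 + b*x by gradient descent.
        x = x0
        for _ in range(steps):
            x -= rate * (x**3 + a * x + b)   # dV/dx
        return x

    a = -1.0                                 # inside the cusp region: two competing minima
    x = settle(a, -1.0, 0.0)                 # start on the branch favored at b = -1
    states = []
    for b in np.linspace(-1.0, 1.0, 41):     # slowly vary the 'quantitative' control b
        x = settle(a, b, x)                  # re-settle from the previous state
        states.append(round(x, 2))
    print(states)   # changes smoothly, then jumps to the other branch near b ~ 0.4
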
View full article (pdf)

Requisite Variety and the Second Law

Martin Zwick

Although the Law of Requisite Variety (LRV) speaks directly about entropy (of a set of disturbances to a system, and of the states and effects of a regulator), the relation of Ashby's principle to the Second Law of Thermodynamics does not appear to have been commented on. In this paper, it is shown that, when regulation is viewed as a temporal process, the LRV can be interpreted as a statement of, and, in fact, a consequence of, the Second Law. In essence, the regulator reduces the variety (entropy) of the system being regulated by a compensatory increase of variety (entropy) within itself. The total change of entropy in regulator plus system cannot, however, be negative.

Yet, while the LRV is a statement of the Second Law, it is one which casts the classical interpretations of the concepts of entropy and neg-entropy in a new light. Specifically, the LRV appears as a principle opposite, or more precisely, complementary to what might be called the "neg-entropy principle" of Schrödinger, Bertalanffy, and others. These two principles set out alternative strategies for survival for an open system. To counter the tendency of internal order to degrade, a system may ingest neg-entropy from and/or excrete entropy into its surroundings (Schrödinger et al.). Or it may reduce entropy by shifting it, as it were, to a regulator subsystem (Ashby). Entropy has both "negative" and "positive" attributes--disorder and variety, respectively; so, too, has neg-entropy, which can imply rigidity as well as order. 
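
In information-theoretic terms, the LRV says roughly that the entropy of the regulated outcomes cannot fall below the entropy of the disturbances minus the entropy of the regulator, H(E) >= H(D) - H(R). A toy numeric check (the regulation 'game' below is invented):

    import numpy as np

    def H(counts):
        p = np.asarray(counts, dtype=float)
        p = p / p.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Toy regulation game: disturbance d is uniform over 8 states (3 bits of variety);
    # the regulator can choose among only 4 responses (2 bits); outcome = (d - r) mod 8.
    disturbances = np.arange(8)
    responses = np.array([0, 2, 4, 6])                       # limited repertoire
    best_r = np.array([responses[np.argmin((d - responses) % 8)] for d in disturbances])
    outcomes = (disturbances - best_r) % 8

    _, counts = np.unique(outcomes, return_counts=True)
    print(H(np.ones(8)), H(np.ones(4)), H(counts))   # 3.0, 2.0, 1.0: H(E) = H(D) - H(R)
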
View full article

Quantum Measurement and Gödel's Proof

Martin Zwick

The measurement problem in quantum mechanics has the character of a fundamental incompleteness within that theory similar to the incompleteness of the axiomatic systems in mathematics, discovered and elaborated by Gödel and others. The difficulty of describing the measurement process by the time-dependent Schrödinger equation may reflect the limitations of formal language, and quantum theory may thus require a formalism consisting of two levels of description, one for the dynamics and one for measurement, levels whose relationship resembles that of a calculus and meta-calculus. 
View full article [ pdf | postscript ]

Fuzziness and Catastrophe

Martin Zwick, Daniel G. Schwartz, and George G. Lendaris

In a recent short note, Flondor has alluded to a possible linkage of fuzzy set theory and catastrophe theory. We consider several features of catastrophe theory, namely the properties of discontinuous jumps, hysteresis, and divergence in the "cusp catastrophe," and the role of the bias factor in the "butterfly catastrophe," which have affinities to and suggest possible extensions of fuzzy set ideas. Certain functions extensively considered in catastrophe theory lend themselves in some cases to interpretation as membership functions. The use of such functions may be of interest for the characterization of linguistic descriptions which are time-varying and encompass both discrete and fuzzy distinctions. 
View full article