Seminar Archive 2010

FALL 2010 SCHEDULE



DATE: December 3, 2010

LOCATION: Harder House, Room 104

PRESENTER: Christof Teuscher

TITLE: "Random automata networks: why playing dice is not a vice"

ABSTRACT: Random automata networks consist of a set of simple compute nodes interacting with each other. In this generic model, one or multiple model parameters, such as the node interactions and/or the compute functions, are chosen at random. Random Boolean Networks (RBNs) are a particular case of discrete dynamical automata networks where both time and states are discrete. While traditional RBNs are generally credited to Stuart Kauffman (1969), who introduced them as simplified models of gene regulation, Alan Turing proposed unorganized machines as early as 1948. In this talk I will start with Alan Turing's early work on unorganized machines, which form a subset of RBNs. I will show how Turing's original ideas tie into our current research, which is mainly driven (1) by the need to understand and engineer information processing in unstructured machines and (2) by the need for alternative computing and manufacturing paradigms in computer engineering. I will then give an overview of our recent research findings in the area of learning, adaptation, generalization, and damage spreading in RBNs.
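
A minimal sketch of a classical RBN, for readers unfamiliar with the model: each node receives K randomly chosen inputs and a random Boolean function, and all nodes update synchronously in discrete time. The network size, connectivity, and seeds below are arbitrary illustrative choices, not parameters from the talk.

    import random

    def make_rbn(n=10, k=2, seed=1):
        """Build a random Boolean network: each node gets k random inputs and a
        random Boolean function (a lookup table over the 2**k input patterns)."""
        rng = random.Random(seed)
        inputs = [rng.sample(range(n), k) for _ in range(n)]
        tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
        return inputs, tables

    def step(state, inputs, tables):
        """Synchronous update: every node reads its inputs from the current state
        and looks up its next value in its Boolean function table."""
        new_state = []
        for ins, table in zip(inputs, tables):
            idx = 0
            for source in ins:
                idx = (idx << 1) | state[source]
            new_state.append(table[idx])
        return new_state

    if __name__ == "__main__":
        inputs, tables = make_rbn(n=10, k=2)
        rng = random.Random(42)
        state = [rng.randint(0, 1) for _ in range(10)]
        for t in range(15):
            print(t, "".join(map(str, state)))
            state = step(state, inputs, tables)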

BIO: Christof Teuscher currently holds an assistant professor position in the Department of Electrical and Computer Engineering (ECE) with joint appointments in the Department of Computer Science and the Systems Science Graduate Program. He also holds an Adjunct Assistant Professor appointment in Computer Science at the University of New Mexico (UNM). Dr. Teuscher obtained his M.Sc. and Ph.D. degrees in computer science from the Swiss Federal Institute of Technology in Lausanne (EPFL) in 2000 and 2004, respectively. His main research interests include emerging computing architectures and paradigms, biologically-inspired computing, complex & adaptive systems, and cognitive science. Teuscher has received several prestigious awards and fellowships. For more information visit: http://www.teuscher-lab.com/christof

[return to top]

 


 

DATE: November 19, 2010

LOCATION: Harder House, Room 104

PRESENTER: M. Jahi Chappell

TITLE: "The 'Goldilocks Hypothesis' — a political ecology of the land-sparing/wildlife-friendly farming debate"

ABSTRACT: Proposals for biodiversity conservation as related to the dominant form of human land use, agriculture, have broadly coalesced around two paradigms: "Land sparing" and "Wildlife-friendly farming." Neither paradigm is sufficiently grounded in the more complex socioeconomic realities of the food system with regard to another paramount problem of our time: widespread malnutrition. However, the "land sparing" paradigm's simplistic approach to food, policy, and ecosystem dynamics is arguably more egregiously out of sync with current knowledge. The talk will present a conceptual view of food systems, hunger, and biodiversity conservation, with the goal of generating discussion on how to systematically integrate the different and sometimes clashing perspectives different academic areas bring to this debate. Without more sophisticated, integrative, and value-explicit assessments and models, and the recognition of the need for discontinuous change, we risk intensifying a system known to have severe consequences for biodiversity alongside notable failures to reduce hunger, or alternatively, modifying it in ways that ameliorate neither problem.

BIO: Jahi Chappell is assistant professor of Environmental Science and Justice in the School of Earth and Environmental Sciences and faculty affiliate of the Center for Social and Environmental Justice at Washington State University Vancouver. His research on the political ecology of sustainable development focuses on the issues of food security, agroecology, conservation biology, and social and environmental justice. Practically speaking, this involves study of the design, development and implementation of food and conservation policies at various scales, with a specific focus on how one may influence (and hopefully support) the other. To do this, he applies tools from diverse areas, from political science, sociology, anthropology, science and technology studies and economics to metapopulation theory, theoretical biology, agroecology, and conservation and community ecology. Rather than attempting to specialize in all of these areas simultaneously, he specializes in synthesizing their approaches and research and in making their perspectives mutually intelligible. His work thus vitally depends on the cultivation and maintenance of collaboration with a diverse group of scholars and practitioners.

His research to date has focused on the ecological implications of the unprecedented successes of the city of Belo Horizonte, Brazil in addressing food security. He plans to continue and expand his research of Brazilian food policy systems, as well as to begin study on the effects and implications of urban agriculture on food security and biodiversity in the Portland-Vancouver area.

[return to top]

 


 

DATE: November 12, 2010

LOCATION: Harder House, Room 104

PRESENTER: Kjersten Bunker Whittington

TITLE: "The influence of network structure on sex disparities in scientific collaboration: commercial innovation in the life sciences"

ABSTRACT: Previous research demonstrates that individuals’ network positions in their surrounding social structure of relations influence the extent of their output and performance. The unique situation of minority groups complicates the relationship, however, as issues of status, legitimacy, and marginality influence the flow and interpretation of information and resources. While several scholars have addressed differences in male and female networks in the workplace, the association between macro-level work arrangements and the micro-level interaction mechanisms of minority groups is unclear. Greater insight into stratification processes can be gained by studying how organizational forms affect the way men’s and women’s networks are structured in the workplace. In this research I explore how the contrasting contexts of work in hierarchical versus horizontal settings operate at the network level to produce differences in productivity between male and female workers. I examine twenty years of collaborative inventor relations, built from a national sample of life science organizations which include pharmaceutical companies, public research organizations, Research I universities, and science-based firms. The results show that men and women scientists demonstrate different network characteristics, but the magnitude and predictive power of these differences vary across work settings. The results have implications for structural influences on sex differences in network relationships, and provide evidence that flatter, more horizontally-distributed organizational forms may provide more advantaging “opportunity structures” for women life scientists as compared with those in the academic science hierarchy and elsewhere.

BIO: Kjersten Bunker Whittington is an assistant professor of sociology at Reed College. Her current research investigates whether and how the durable gender inequality in science careers is affected by the recent changing boundaries between universities and firms, and the increasing trend to commercialize basic research in academia. Whittington also studies formal organizations and the science economy. With collaborators, she is engaged in research that examines the contingent role of inter-organizational network structure and regional clustering in influencing innovative output among science-based firms.

[return to top]

 


 

DATE: November 5, 2010

LOCATION: Harder House, Room 104

FACILITATOR: Joshua Hughes

TOPIC: "The limits of control, or how I learned to stop worrying and love regulation"

ABSTRACT: When we want to solve a problem, we talk about how we might manage or regulate—control it. Control is a a central concept in systems science, along with system, environment, utility, and information. With his information-theoretic Law of Requisite Variety, Ashby proved that to control a system we need as much variability in our regulator as we have in our system (“only variety can destroy variety”), something like a method of control for everything we want to control. For engineered systems, this appears to be the case (at least sometimes). But what about for social systems? Does a group of humans behave with the same level of variability as a machine? Not usually. And when control is applied to a human system, in the form of a new law or regulation, individuals within it may deliberately change their behavior. A machine's behavior may also change when a control is applied to it—think of how emissions equipment affects the performance of an automobile (less pollution, but less power too)—but the machine doesn't (typically) adapt. People do. Does this pose a difficulty if we want to employ Ashby's law to solve a control problem in a human system? Or could our ability to adapt provide an advantage?
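
For reference, the law mentioned above is often stated in information-theoretic form (a standard textbook formulation, included here as background): with H denoting entropy (log-variety), D the disturbances, R the regulator's responses, and E the essential outcomes,

    H(E) \;\geq\; H(D) - H(R)

so driving the outcome variety H(E) toward zero requires at least as much variety in the regulator as in the disturbances it must absorb.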

Ashby acknowledged that for very large systems regulation is more difficult, and many social systems are very large. With limited resources we may not be able to control for all the variety and possible disturbances in a very large system, and therefore we must make choices. We can leave a system unregulated; we can reduce the amount of the system we want to control; we can increase control over certain forms of variety and disturbances; or we could find constraint or structure in the system's variety and disturbances—in other words, create better, more accurate models of our system and its environment.

Creating better models has always been a driving force in the development of systems science. Conant and Ashby proved that “every good regulator of a system must be a model of that system” in a paper of the same name. Intuitively this makes sense: if we have a better understanding of the system—a better model—we should be better able to control the system. But how well are we able to model human systems? For example, how well do we model intersections? Think about your experience in a car or on a bike at a downtown intersection during rush hour. Now think about that same intersection from the perspective of a pedestrian late in the evening. Did the traffic signals control the intersection in an efficient manner under both conditions? What if we consider all the downtown intersections, or the entire Portland-area traffic system? What about even larger systems? How well can we model the U.S. health care system? What is the chance that in a few thousand pages of new controls a few of them will cause some unforeseen consequence? How well do we understand the economy? Enough to create a law limiting CEO compensation? Might just one seemingly straightforward control lead to something unforeseen?

So what level of understanding must we have of a system, i.e., how well must we be able to model it, before we regulate it? We must still react to and manage, as best we can, a man-made or natural disaster, even when we may know very little about it at the start. Our ability to adapt is critical in these situations. But at the same time, with our ability to adapt we can also (with the proper resources) circumvent the intent of regulations or use regulations to protect or increase our influence: consider “loopholes” in the tax code, or legislation with which large corporations can easily comply but which causes great difficulties for smaller businesses.

No matter what problem we have, it's important to understand what limits our ability to control and how controls may cause new and different problems; this will be the general focus of this seminar. A brief overview of Ashby's Law of Requisite Variety, along with a conceptual example, will be presented.

BIO: Joshua Hughes is a third-year, core-option Ph.D. student and graduate assistant in the PSU Systems Science Graduate Program. He is working on research with George Lendaris on contextual reinforcement learning and experience-based identification and control, and he has recently collaborated with Martin Zwick on a paper showing how the panarchy adaptive cycle can be formalized using the cusp catastrophe. He is interested in information theory, cybernetics, reconstructability analysis, neural networks, fuzzy logic, catastrophe theory, game theory, and many other things.

[return to top]

 


 

DATE: October 29, 2010

LOCATION: Harder House, Room 104

PRESENTER: Alexander Dimitrov

TITLE: "Neural systems analysis through quantization with an information-based distortion function"

ABSTRACT: Methods based on Rate Distortion theory have been successfully used to cluster stimuli and neural responses in order to study neural codes at a level of detail supported by the amount of available data. They approximate the joint stimulus-response distribution by quantizing paired stimulus-response observations into smaller reproductions of the stimulus and response spaces. An optimal quantization is found by maximizing an information-theoretic cost function subject to both equality and inequality constraints, in hundreds to thousands of dimensions. This analytical approach has several advantages over other current approaches:

  • it yields the most informative approximation of the encoding scheme given the available data (i.e., it gives the lowest distortion, by preserving the most mutual information between stimulus and response classes),
  • the cost function, which is intrinsic to the problem, does not introduce implicit assumptions about the nature or linearity of the encoding scheme,
  • the maximum entropy quantizer does not introduce additional implicit constraints to the problem,
  • it incorporates an objective, quantitative scheme for refining the codebook as more stimulus/response data becomes available,
  • it does not need repetitions of the stimulus under mild continuity assumptions, so the stimulus space may be investigated more thoroughly.

Here the method is applied to the study of neural sensory representation. The application of this approach to the analysis of biological sensory coding involved a further restriction of the space of allowed quantizers to a smaller family of parametric distributions. We show that, for some cells in this system, a significant amount of information is encoded in patterns of spikes that would not be discovered through analyses based on linear stimulus-response measures.
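
As a rough illustration of the quantization idea only (a greedy simplification, not the authors' optimization procedure), one can merge response classes step by step, always choosing the merge that preserves the most stimulus-response mutual information; the joint distribution below is a made-up toy example.

    import numpy as np

    def mutual_information(p):
        """Mutual information (bits) of a joint distribution p[x, y]."""
        p = p / p.sum()
        px = p.sum(axis=1, keepdims=True)
        py = p.sum(axis=0, keepdims=True)
        nz = p > 0
        return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

    def quantize_responses(p_joint, n_classes):
        """Greedily merge response columns, at each step choosing the pair whose
        merge keeps the most mutual information with the stimulus."""
        groups = [[j] for j in range(p_joint.shape[1])]
        while len(groups) > n_classes:
            best = None
            for a in range(len(groups)):
                for b in range(a + 1, len(groups)):
                    trial = groups[:a] + groups[a+1:b] + groups[b+1:] + [groups[a] + groups[b]]
                    q = np.stack([p_joint[:, g].sum(axis=1) for g in trial], axis=1)
                    mi = mutual_information(q)
                    if best is None or mi > best[0]:
                        best = (mi, a, b)
            _, a, b = best
            merged = groups[a] + groups[b]
            groups = [g for i, g in enumerate(groups) if i not in (a, b)] + [merged]
        return groups

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        p = rng.random((4, 8))          # toy joint distribution: 4 stimuli x 8 responses
        p /= p.sum()
        print("full I(S;R):", round(mutual_information(p), 3), "bits")
        groups = quantize_responses(p, 3)
        q = np.stack([p[:, g].sum(axis=1) for g in groups], axis=1)
        print("quantized to 3 classes, I(S;R_q):", round(mutual_information(q), 3), "bits")
        print("response classes:", groups)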

BIO: Alex Dimitrov's main research interests involve the study of neural information processing, neural coding and information representation in biological systems using branches of applied probability (information theory, signal processing theory, multivariate statistics, stochastic differential equations), dynamical systems theory, group theory, optimization, operations research, and differential geometry. His current research concentrates on three basic aspects related to these issues: developing analytical tools and quantitative approaches to characterizing the neural representation of sensory stimuli; studying the statistical properties of natural sensory signals and their relations to biological sensory systems; and studying structure/function relations in biophysical models of neural systems. These research directions are flexible and are easily adaptable to new collaborations and research environments.

[return to top]

 


 

DATE: October 22, 2010

LOCATION: Harder House, Room 104

PRESENTER: Robert Costanza

TITLE: "Understanding, modeling and valuing ecosystem services"

ABSTRACT: Ecosystem services (ES) are the direct and indirect contributions of ecosystems (in combination with other inputs) to human well-being. An ES-based approach can assess the trade-offs inherent in managing humans embedded in ecological systems. Evaluating trade-offs requires both an understanding of the biophysical magnitudes of ES changes that result from human actions, as well as an understanding of their impact on human well-being, broadly conceived. This talk discusses the state of the art of ES assessment, valuation, and modeling, including the potential of integrated ecological economic modeling. Valuation is about assessing trade-offs – not necessarily about trades (exchanges) in markets for money. Since ecosystem services are largely public goods, market exchanges are not (and should not be) present. This does not mean that trade-offs are not present. Conversely, expressing trade-offs in money does not imply that market exchanges are possible or desirable. Finally, the appropriate uses of economic incentives in managing ecosystem services are discussed. Because many ecosystem services are public goods (non-rival and non-excludable) they cannot (or should not) be privatized – a prerequisite for trading in conventional markets. The solution is to recognize the value of these public goods and modify market and other incentives to communicate that value to private decision-makers. Systems such as ecological taxes and subsidies, government mediated systems of payment for ecosystem services (PES - like the system in Costa Rica), and common asset trusts are some of the tools that are useful for incorporating the value of ecosystem services into private decision-making.

BIO: Dr. Costanza's research has focused on the interface between ecological and economic systems, particularly at larger temporal and spatial scales. This includes landscape-level spatial simulation modeling; analysis of energy and material flows through economic and ecological systems; valuation of ecosystem services, biodiversity, and natural capital; and the analysis and correction of dysfunctional incentive systems.

He is the author or co-author of over 400 scientific papers and 22 books; and his work has been cited in more than 6,000 scientific articles. Reports on his work have appeared in several outlets including Newsweek, Time, The Economist, The New York Times, Science, Nature, National Geographic, and National Public Radio.

[return to top]

 


 

DATE: October 15, 2010

LOCATION: Harder House, Room 104

PRESENTER: Herman Migliore

TITLE: "How a system engineer starts..."

ABSTRACT: Dr. Migliore will review systems engineering as a process for developing products, processes, and services and suggest views that encourage systems thinking. As an example, he will focus on the beginning of the development process, the fuzzy front end, and discuss a method, ConOps, for getting started, using examples from PSU's master's program.

BIO: Herman Migliore has nearly forty years of experience in engineering design, application of computational mechanics in design, development of design methodologies, and design education. Since 1997, he has been director of the systems engineering program at Portland State University, an online master's program intended for experienced, practicing engineers. As director, he has participated in many projects that apply systems engineering to a wide variety of areas for small, medium, and large industry and government sponsors.

[return to top]

 


 

DATE: October 8, 2010

LOCATION: Harder House, Room 104

PRESENTER: Wayne Wakeland

TITLE: A systems model of prescription opioid abuse, addiction, and overdose

ABSTRACT: A dramatic rise in the use of pharmaceutical opioids to treat pain, and the associated opioid abuse and addiction, has created a substantial public health problem in the United States. Effective tools and interventions are needed to identify policies to reduce opioid abuse, addiction, and overdose deaths. A system dynamics model is used to identify policy interventions that will reduce the prevalence of adverse outcomes attributed to pharmaceutical opioids. Results suggest that it will be difficult to minimize negative outcomes without adversely affecting the degree to which chronic pain patients can access pharmaceutical treatment, and also indicate the importance of the metric(s) chosen for evaluating effectiveness.
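
The abstract does not give the model's structure, but the flavor of a stock-and-flow system dynamics formulation can be conveyed with a deliberately oversimplified sketch; every stock, flow, and parameter value below is hypothetical and for illustration only.

    # Oversimplified system dynamics sketch (hypothetical structure and parameters,
    # not the model presented in the talk).
    def simulate(years=20, dt=0.25,
                 initiation=50_000,      # new nonmedical users per year (hypothetical)
                 escalation_rate=0.03,   # fraction of users becoming addicted per year
                 quit_rate=0.15,         # fraction of users quitting per year
                 recovery_rate=0.10,     # fraction of addicted entering recovery per year
                 overdose_rate=0.005):   # fatal overdoses among the addicted per year
        users, addicted, deaths = 500_000.0, 100_000.0, 0.0   # initial stocks (hypothetical)
        t = 0.0
        while t < years:
            escalation = escalation_rate * users
            d_users = initiation - quit_rate * users - escalation
            d_addicted = escalation - recovery_rate * addicted - overdose_rate * addicted
            deaths += overdose_rate * addicted * dt
            users += d_users * dt
            addicted += d_addicted * dt
            t += dt
        return users, addicted, deaths

    if __name__ == "__main__":
        u, a, d = simulate()
        print(f"after 20 years: users={u:,.0f} addicted={a:,.0f} cumulative deaths={d:,.0f}")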

BIO: Wayne Wakeland earned a B.S. in Engineering and a Master of Engineering from Harvey Mudd College in 1973, and a Ph.D. in Systems Science from PSU in 1977. Wayne began a career in industry, and taught computer modeling and simulation courses at PSU in the evening. Eventually, Wayne became an Associate Professor of Systems Science at PSU with a continued focus on computer simulation methods. His research emphasizes sustainable systems and management, health system dynamics, fishery dynamics, criminal justice system simulation, and biomedical dynamics. Since 2007, Wayne has also taught systems thinking at the Bainbridge Graduate Institute.

[return to top]

 


 

 

 

SPRING 2010 SCHEDULE

 



DATE: May 28, 2010, Noon

LOCATION: Harder House, Room 104

PRESENTER: Yiannis Laouris

TITLE: "Systems science strikes back; absolutely essential in our struggle to respond to 21st century's complex societal and technological challenges"

ABSTRACT: In this seminar, I will try to share my views concerning the role and responsibility of systems and complexity science(s) in a world that is becoming increasingly complex and unstable.

My team pioneers the application of a branch of systems science known as the Structured Dialogic Design process (SDDSM). It has been tested in the Cyprus peace movement and in many pan-European networks. Together with Dr. Aleco Christakis’s group, we are currently exploring how to scale up the SDDSM process to accelerate social change. The most recent experiments involve: (1) an island-wide project in Cyprus with 10 almost parallel SDDSM dialogues (with the participation of about 300 elected representatives) aiming to reform local governance; and (2) three pan-European dialogues, with the participation of a wide range of stakeholders, aiming to identify the most influential research and industrial domains under Budget Line Challenge 7: ICT for Independent Living (Inclusion and Governance; Accessible and Assistive ICT; Embedded Accessibility of Future ICT). In the first case, the results will be used to design nation-wide training programs to address whatever was identified as a “need” in the dialogues. In the second case, the European Commission will use the results to define the priority areas for the next Framework Program calls.

Future challenges include: (1) further development of the science; (2) refinement of the complexity index; (3) scaling up to enable synchronous participation of 100 – 1000 people; and (4) modeling how multiple minds collaborate to solve a complex problem while achieving a shared understanding of it and being mobilized to work collaboratively towards its resolution.

Depending on time constraints, my research interests at the interface of systems science and the brain sciences, networking sciences, and socio-technical systems will also be discussed.

In the last part of the seminar, I would like to open a dialogue and share ideas about how PSU’s Systems Science program can evolve, mobilize resources, increase its impact, and become an international leader.

BIO: Yiannis Laouris is Senior Scientist and Chair of the Cyprus Neuroscience-Technology Institute (CNTI), which employs about 20 full-time scientists and currently implements more than 15 Europe-wide projects (as Coordinator). CNTI focuses on the interface of science and society. He is Director of CyberEthics (Safer Internet Awareness Node and Hotline) and National Representative for various COST Actions (276: Information & Knowledge Management for Integrated Media Communication; 219ter: Accessibility for All to Services and Terminals for Next Generation Networks; 2102: Cross-Modal Analysis of Verbal and Non-verbal Communication).

Laouris was born in Cyprus in 1958. He is a medical graduate of Leipzig University, Germany (“very good”), completed a PhD in Neurophysiology at the Karl-Ludwig Institute (summa cum laude), and earned an MS in Systems and Industrial Engineering (GPA 4.0) at the University of Arizona. Together with cyberneticians/systems physiologists Schwartze, Henatsch, Windhorst and Stuart, for over fifteen years he applied linear/non-linear digital processing to biological signals from experimental animals to study brain signals. Almost twenty years ago, he made the decision to partly interrupt his academic career and engage in socially responsible projects that contribute towards positive social change. In the ’90s, he founded CYBERKIDS (an international chain of computer learning centers), which used a systemic approach to “transcend” a country’s educational and political life and move the new generation a decade ahead. Its curriculum (a new learning theory based on an educationally relevant and socially responsible approach) received seven international awards for innovation and social responsibility.

Laouris has about fifty papers in peer-reviewed journals, half of which are in neuroscience, a quarter in applied systems science and peace, and the rest on IT and children and the neuroscience of learning. He has contributed chapters to about twenty books and has given over 120 conference papers and presentations. He mainly publishes in Brain Research, Experimental Brain Research, Neuroscience, Journal of Neurophysiology, Behavioural and Brain Sciences, World Futures, Int. J. Applied Systemic Studies and Systemic Practice and Action Research.

[return to top]


 

 

DATE: May 21, 2010, Noon

LOCATION: Harder House, Room 104

PRESENTER: Cliff Joslyn

TITLE:"Hierarchical Systems Theory"

ABSTRACT: Systems Science builds on a number of foundational concepts (for example, "order", "organization", "complexity", "emergence", and "control") to elucidate an interdisciplinary view of systems of different types. "Hierarchy" is an example of a particularly important such concept, as it underlies virtually all schemes for understanding the management or evolution of complexity, through hierarchical relationships between phenomena at different temporal or spatial scales, or at different levels of aggregation.

Systems scientists commonly invoke levels of gradation, hierarchy, and related ideas in their models. And certainly hierarchical trees, and sometimes lattices, are common structures in formal modeling in general. But hierarchy itself is rarely the focus of explicit mathematical representation, nor are adequate means to represent and measure hierarchical phenomena in the abstract regularly used.

There is, in fact, a mathematical theory of hierarchy, a branch of combinatorics called "order theory" or "lattice theory". It provides concrete representations of levels of hierarchical structure, tools to measure and manipulate them, and a complete sense of general hierarchy beyond simple trees.

Ordered structures (especially lattices) have been long studied in combinatorics and general algebra, and some restricted aspects have been applied in a few particular areas of physics and computer science. But as I set out a number of years ago to try to use hierarchical systems theory to mine large semantic taxonomies which underlie formal ontologies (as used in the modern Semantic Web movement), some very simple surprises revealed themselves.

In this talk we will review these concepts in the context of my work in the development of a metric space of hierarchical relations, and of their application in the computational semiotics of ontology alignment and clustering on the one hand, and information-theoretical view discovery in multidimensional databases on the other.
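
As a concrete, toy-sized illustration of treating a taxonomy as an ordered structure (the tiny taxonomy and the Jaccard-style comparison below are illustrative choices, not the metric developed in this work), one can store the covering relation of a hierarchy and compare nodes by the overlap of their ancestor sets:

    # Toy taxonomy as a partial order given by its covering ("is-a") relation.
    # The distance used here is an illustrative set overlap, not Joslyn's metric.
    PARENTS = {
        "dog": {"mammal"}, "cat": {"mammal"}, "mammal": {"animal"},
        "sparrow": {"bird"}, "bird": {"animal"}, "animal": set(),
    }

    def up_set(node):
        """All ancestors of a node (its principal up-set), including the node itself."""
        seen, stack = set(), [node]
        while stack:
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                stack.extend(PARENTS[n])
        return seen

    def distance(a, b):
        """Jaccard-style distance between the ancestor sets of two nodes."""
        ua, ub = up_set(a), up_set(b)
        return 1.0 - len(ua & ub) / len(ua | ub)

    if __name__ == "__main__":
        print(sorted(up_set("dog")))                # ['animal', 'dog', 'mammal']
        print(round(distance("dog", "cat"), 3))     # closer: share 'mammal' and 'animal'
        print(round(distance("dog", "sparrow"), 3)) # farther: share only 'animal'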

BIO: Cliff Joslyn is the Chief Scientist for Knowledge Sciences in the National Security Directorate of the Pacific Northwest National Laboratory. He has previously been Team Leader for Knowledge and Information Systems Sciences in the Computer Science Division at the Los Alamos National Laboratory from 1996-2007, and an NRC Research Associate at NASA’s Goddard Space Flight Center from 1994-1996. Dr. Joslyn holds a BA with High Honors in Cognitive Science and Mathematics from Oberlin College, and an MS and PhD in Systems Science from SUNY Binghamton. As an interdisciplinary information scientist and mathematical systems theorist, Dr. Joslyn's research interests include applied order theory, knowledge discovery in databases, theoretical cybernetics, ontology management and knowledge systems, computational semiotics, qualitative modeling and simulation, and generalized information theory. He has been active as a leader and researcher in many government computer science research efforts in such areas as lattice theoretical approaches to the integration of large semantic hierarchies for homeland defense, scenario modeling for threat anticipation, knowledge representation for bioinformatics, multidimensional database analysis for law enforcement, qualitative modeling for spacecraft diagnosis, knowledge systems development for integrated modeling for nuclear non-proliferation, and generalized information theory for engineering decision support and risk analysis.

[return to top]


 

 

DATE: May 14, 2010, Noon

LOCATION: Harder House, Room 104

PRESENTER: Vivek Shandas

TITLE: "The hydro-ecology of everyday life: assessing the social and environmental determinants of water use in the Portland region"

ABSTRACT: Driven in part by the imminent threats of population growth and climate destabilization, recent studies suggest that urban areas face severe water scarcity, with some areas in Australia and the United States already instituting moratoria on water use. While water managers traditionally avoid such crises by developing demand forecasts based on population estimates, technological developments, and weather predictions, their analyses are often at a regional scale with aggregate measures of water consumption. To date, there exists limited empirical evidence about how urban spatial structure and concomitant socio-demographic and temperature characteristics mutually interact to affect water demand at the scale of individual land uses. In this presentation, we use geographic information systems and statistical techniques to assess the role of social and biophysical factors as they impact water use. At the regional scale, our results suggest that specific thresholds of density can improve water conservation efforts, and at the parcel scale, several sociodemographic and structural attributes, including lot and building size, help explain over 75% of water use behavior. In addition, our results suggest a strong and significant relationship between urban heat and water consumption. Based on our results, we develop future water use scenarios and provide recommendations to water managers and land use planning bureaus to improve urban water management during alternative climate scenarios.
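
A minimal sketch of the kind of parcel-level regression described, using synthetic data and hypothetical predictors (the abstract does not give the actual model specification):

    import numpy as np

    # Synthetic stand-ins for parcel-level predictors (hypothetical, illustration only).
    rng = np.random.default_rng(0)
    n = 500
    lot_size  = rng.normal(8000, 2000, n)   # sq ft
    bldg_size = rng.normal(1800, 400, n)    # sq ft
    density   = rng.normal(10, 3, n)        # dwelling units per acre
    july_temp = rng.normal(30, 2, n)        # degrees C
    noise     = rng.normal(0, 20, n)

    # Fabricated "true" relationship so the fit has something to recover.
    water_use = 0.01 * lot_size + 0.05 * bldg_size - 2.0 * density + 3.0 * july_temp + noise

    X = np.column_stack([np.ones(n), lot_size, bldg_size, density, july_temp])
    beta, *_ = np.linalg.lstsq(X, water_use, rcond=None)
    pred = X @ beta
    r2 = 1 - ((water_use - pred) ** 2).sum() / ((water_use - water_use.mean()) ** 2).sum()
    print("coefficients:", np.round(beta, 3))
    print("R^2:", round(r2, 3))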

BIO: Vivek Shandas is an assistant professor in the Toulan School of Urban Studies and Planning, and a Research Associate in the Center for Urban Studies at Portland State University. His research interests focus on three areas: (1) the impact of urban development patterns on water quality; (2) drivers of human behavior and decision making; and (3) effectiveness of interdisciplinary approaches in higher education. He teaches graduate courses in geographic information systems, environmental planning, and research methods in urban studies. Vivek has an undergraduate degree in biology (BS), graduate degrees in economics (MS) and environmental policy (MS), and completed his doctoral work in Urban Design and Planning at the University of Washington (Seattle). He publishes widely in social and natural science journals, and serves as a technical advisor to several local and state organizations.

[return to top]


 

 

DATE: May 7, 2010, Noon

LOCATION: Harder House, Room 104

PRESENTER: Anthony Rufolo

TOPIC: Costs of and behavioral responses to congestion-pricing systems

ABSTRACT: Fuel taxes are an important source of funds for roads. However, increasing fuel efficiency and the potential for alternative fueled vehicles have raised questions about the long-term viability of this revenue source. Oregon conducted an experiment to evaluate the potential to replace fuel taxes with mileage fees. The findings from that experiment will be presented along with some discussion of current research on the cost associated with implementing different types of mileage fees.
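
The revenue problem alluded to here is easy to see with a back-of-the-envelope calculation; the tax rate and fuel-economy figures below are round illustrative numbers, not Oregon's actual values.

    # Illustrative numbers only: how rising fuel economy erodes per-mile fuel-tax revenue.
    tax_per_gallon = 0.30          # dollars (hypothetical)
    for mpg in (20, 30, 40, 60):
        revenue_per_mile = tax_per_gallon / mpg
        print(f"{mpg} mpg -> fuel tax of {revenue_per_mile * 100:.2f} cents/mile")
    # A flat mileage fee of, say, 1.5 cents/mile would be independent of fuel economy.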

BIO: Dr. Rufolo is a Professor of Urban Studies and Planning at Portland State University, where he specializes in State and Local Finance, Transportation, Urban Economics, and Regional Economic Development. He has a B.S. in Economics from M.I.T. and a Ph.D. in Economics from UCLA. Prior to joining the faculty at Portland State in 1980, he spent six years as an Economist and Senior Economist with the Federal Reserve Bank of Philadelphia. Dr. Rufolo’s research has appeared in such journals as the National Tax Journal, Transportation Research, Transportation Research Record, The Journal of Urban Economics, Land Economics and The Journal of Public Economics, and he is co-author of a textbook on public finance. Dr. Rufolo has practical experience with local economic development and finance issues in addition to his research and teaching. His experience with government forecasting and budgeting includes: Advisory Council (chair) to the (Oregon) Legislative Task Force On Comprehensive Revenue Restructuring, 2008-2009; Blue Ribbon Commission on Cost Allocation, Oregon Department of Transportation, 1996; (Oregon) Governor's Council of Economic Advisors 1983-1994; City of Beaverton Budget Committee 1989-1995 (chair 1992-1994); Advisory Committee on the Budget for Tri-Met (the Portland transit system) 1991-1995 (chair 1994-95); and the Investment Advisory Committee for the city of Portland since 1992.

[return to top]


 

 

DATE: April 30, 2010, Noon

LOCATION: Harder House, Room 104

PRESENTERS: Todd Duncan and James Butler

TITLE: "What makes a meaningful universe?"

ABSTRACT: A common line of thinking says that although we feel subjectively that our thoughts and actions matter in some way, this perception is an illusion. According to this view, an honest look around at the universe shatters this myth and reveals that our lives are ultimately meaningless. If we are to be hard-nosed realists, limiting ourselves to scientific, evidence-based reasoning, then we must accept that human existence is an inconsequential accident of no ultimate significance in the grand scheme of things. Is this attitude really justified by the evidence? We'll explore this question by taking a step back and asking what properties a hypothetical "meaningful universe" might have if we had complete freedom to set it up from scratch.

BIOS:

Todd Duncan is a cosmologist whose work is guided by the theme of better understanding how our immediate human experiences connect to a cosmic perspective that gives them meaning. He combines a research background in physics with experience teaching science concepts to a wide range of audiences. He’s the author of An Ordinary World: The Role of Science in Your Search for Personal Meaning, and coauthor of Your Cosmic Context: An Introduction to Modern Cosmology. Todd received his undergraduate degree in physics from the University of Illinois, an M. Phil. from Cambridge University as a Churchill Scholar, and a doctorate in astrophysics from the University of Chicago where he was an NSF and McCormick Fellow. He joined the faculty of the Center for Science Education at Portland State University in 1997 to pursue his interest in interdisciplinary "big questions" research and its application to science education. In 1998 he founded the Science Integration Institute as a forum for exploring what it means to be human in the universe as understood by modern science. He is currently director of the Science Integration Institute and adjunct faculty in the Center for Science Education at PSU and the Physics Dept. at Pacific University.

James Butler is an experimental physicist whose career has focused on teaching at the undergraduate level and involving students in nonlinear optics research. He is co-author of twenty-four publications and professional conference presentations (many with undergraduate students) and he has been principal investigator for over $650,000 in grants from agencies such as the National Science Foundation, Research Corporation, M.J. Murdock Charitable Trust, and Naval Research Laboratory (NRL). James received a B.S. in physics from Eastern Oregon University and both an M.S. and Ph.D. in physics from Lehigh University. He joined the faculty at the United States Naval Academy in 1999 as an Assistant Professor of Physics to pursue his passions of teaching and experimental optics research. Shortly thereafter James began collaborating with colleagues at NRL in order to investigate the nonlinear optical properties of materials used for protection from high intensity laser damage. In 2004, James joined the faculty at Pacific University as an Associate Professor of Physics where he has played an active role in the development and implementation of innovative teaching methods and has continued his nonlinear optics research with undergraduate students and NRL scientists. James is currently Director of Undergraduate Research and Chair of the Physics Department at Pacific University.

[return to top]


 

 

DATE: April 23, 2010, 1:30 pm

LOCATION: Symposium on Smart Grid Development, Keynote Lecture
University Place, Willamette Room, 310 SW Lincoln Street

PRESENTER: Ganesh Kumar Venayagamoorthy

TITLE: "Smart Grid: the need for advanced computational methods and intelligence"

ABSTRACT: The modern electric power grid with renewable energy resources is a complex adaptive system under semi-autonomous distributed control. It is spatially and temporally complex, non-convex, nonlinear and non-stationary, with many uncertainties. The integration of plug-in hybrid and electric vehicles increases the complexity and challenges to the various controllers at all levels of the power grid. Charging a large number of electric vehicles randomly or simultaneously without an intelligent infrastructure will increase the load on the electric grid, causing adverse effects and increasing the cost of electric vehicle usage. Intelligent scheduling of vehicles for charging and dynamic load forecasting will become of vital importance. On the other hand, electric vehicles with the use of vehicle-to-grid technology (V2G), information technology and advanced computational methods can provide short-term real and reactive power support to overcome the drawback of the intermittent nature of wind and solar power resources. In addition, V2G technology can make the electric grid efficient, reliable, distributed, clean and interoperable. This talk will present the potentials and promises of advanced computational methods and intelligence for the smart grid.
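
One small example of the kind of scheduling problem mentioned is "valley filling," in which deferrable EV charging is pushed into the hours of lowest baseline load; the load curve, fleet energy, and capacity limit below are invented for illustration.

    # Toy "valley-filling" scheduler: place deferrable EV charging in the lowest-load hours.
    baseline = [60, 55, 52, 50, 51, 55, 65, 75, 80, 78, 76, 75,
                74, 73, 74, 76, 82, 90, 95, 92, 85, 78, 70, 64]   # MW by hour (invented)
    ev_energy_mwh = 120.0        # total fleet charging energy to schedule (hypothetical)
    max_charge_per_hour = 15.0   # MW of charging capacity available per hour (hypothetical)

    schedule = [0.0] * 24
    remaining = ev_energy_mwh
    slice_mwh = 1.0
    # Repeatedly add a small slice of charging to whichever hour currently has the lowest total load.
    while remaining > 0:
        totals = [b + s for b, s in zip(baseline, schedule)]
        candidates = [h for h in range(24) if schedule[h] + slice_mwh <= max_charge_per_hour]
        h = min(candidates, key=lambda hh: totals[hh])
        add = min(slice_mwh, remaining)
        schedule[h] += add
        remaining -= add

    peak_before = max(baseline)
    peak_after = max(b + s for b, s in zip(baseline, schedule))
    print("peak load before/after scheduling:", peak_before, round(peak_after, 1), "MW")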

BIO: Ganesh Kumar Venayagamoorthy received his Ph.D. degree in electrical engineering from the University of KwaZulu Natal, Durban, South Africa, in Feb. 2002. Currently, he is an Associate Professor of Electrical and Computer Engineering, and the Founder and Director of the Real-Time Power and Intelligent Systems (RTPIS) Laboratory at Missouri University of Science and Technology (Missouri S&T). He was a Visiting Researcher with ABB Corporate Research, Sweden, in 2007. His research interests are in the development of advanced computational algorithms for real-world applications, including power systems stability and control, smart grid applications, sensor networks and signal processing. He has published two edited books, five book chapters, and over eighty refereed journal papers and 270 refereed conference proceeding papers. He has been involved in approximately US $7 million of competitive research funding. Dr. Venayagamoorthy is a recipient of several awards, including a 2007 US Office of Naval Research Young Investigator Program Award, a 2004 US National Science Foundation CAREER Award, the 2010 Innovation Award from the Academy of Science of St. Louis, the 2008 IEEE St. Louis Section Outstanding Educator Award, the 2006 IEEE Power Engineering Society Walter Fee Outstanding Young Engineer Award, the 2005 IEEE Industry Applications Society (IAS) Outstanding Young Member Award, and the 2003 International Neural Network Society (INNS) Young Investigator Award. He is a Fellow of the Institution of Engineering and Technology (IET), UK and the South African Institute of Electrical Engineers, a Senior Member of the IEEE and INNS, and a Member of the American Society for Engineering Education and the INNS Board of Governors. He is on the editorial board of the new IEEE Transactions on Smart Grid.

[return to top]


 

 

DATE: April 16, 2010, Noon

LOCATION: Harder House, Room 104

PRESENTER: Martin Zwick

TITLE: "Holism and human history"

ABSTRACT: This paper uses a systems-theoretic model to structure an account of human history. According to the model, a process, after its beginning and early development, often reaches a critical stage where it encounters some limitation. If the limitation is overcome, development does not face a comparable challenge until a second critical juncture is reached, where obstacles to further advance are more severe. At the first juncture, continued development requires some complexity-managing innovation; at the second, it needs some event of systemic integration in which the old organizing principle of the process is replaced by a new principle. Overcoming the first blockage sometimes occurs via a secondary process that augments and blends with the primary process and is subject in turn to its own difficulties.

Applied to history the model joins together the materialism of Marx and the cultural emphasis of Toynbee and Jaspers. It describes human history as a triad of developmental processes which encounter points of difficulty. The 'primary' process began with the emergence of the human species, continued with the development of agriculture, and reached its first critical juncture after the rise of the great urban civilizations. Crises of disorder and complexity faced by these civilizations were eased by the religions and philosophies that emerged in the Axial period. These Axial traditions became the cultural cores of major world civilizations, their development constituting a 'secondary' process that merged with and enriched the first. This secondary process also eventually stalled, but in the West the impasse was overcome by a 'tertiary' process: the emergence of humanism and secularism and--quintessentially--the development of science and technology. This third process blended with the first two in societal and religious change that ushered in what we call 'modernity.' Today, this third current of development falters, and inter-civilizational tension also afflicts the secondary stream. Much more seriously, the primary process has reached its second and critically hazardous juncture--the current global environmental-ecological crisis. System integration via a new organizing principle is needed on a planetary scale.

This paper was prepared for "Cosmos, Nature, and Culture: A Transdisciplinary Conference," July 18-21, 2009, in Phoenix, AZ, a program of the Metanexus Institute.

BIO: Martin Zwick was awarded his Ph.D. in Biophysics at MIT in 1968, and joined the Biophysics Department faculty of the University of Chicago in 1969. Initially working in crystallography and macromolecular structure, his interests shifted to systems theory and methodology, the field now known as the study of chaos, complexity, and complex adaptive systems. Since 1976 he has been teaching and doing research in the Systems Science PhD Program at Portland State University; during the years 1984-1989 he was director of the program.

His main research areas are information-theoretic modeling, machine learning, theoretical biology, game theory, and systems theory and philosophy. Scientifically, his focus is on applying systems theory and methodology to the natural and social sciences, most recently to biomedical data analysis, the evolution of cooperation, and sustainability. Philosophically, his focus is on how systems ideas relate to classical and contemporary philosophy, how they offer a bridge between science and religion, and how they can help us understand and address societal problems.

[return to top]


 

 

DATE: April 9, 2010, Noon

LOCATION: Harder House, Room 104

PRESENTER: Diana Fisher

TITLE: "Teaching high school mathematics using system dynamics models"

ABSTRACT: There is a serious push from the national level to get all high school students through second-year algebra. This has proven to be extremely difficult using current methods. Algebra, pre-calculus, and calculus have as an overarching goal a deeper understanding of how functions behave. The use of System Dynamics models to supplement traditional representations has been useful, both in providing a visual representation of the traditional functions studied and in allowing more realistic applications for student assignments. Additionally, a year-long System Dynamics modeling course has shown that high school students can research, build, and explain the complex behavior found in many different types of systems. The student work is impressive and offers a window into an extremely promising, hands-on, student-centered learning methodology that makes study more realistic, more interesting, and more powerful. This approach appeals to a broad audience of students: those for whom equations are too much of an abstraction to allow them to retain needed function characteristics, and those for whom the current approach does not offer the ability to extend their strong analytical skills (as can be accomplished by studying complex interactions between functions). System Dynamics modeling provides a cutting-edge approach to help teachers address the national educational standards in math, science, health, economics, social science, technology, sustainability, and 21st Century Skills. This talk will give a brief overview of the types of lessons used in high school advanced algebra and modeling classes, and present a videotaped presentation of one student's System Dynamics modeling project.

BIO: Diana Fisher received both a bachelor's and a master's degree in mathematics, the first from the University of Texas El Paso (1969), the second from the University of Montana (1976). She has taken classes in computer science at Oregon State University (1983). She will receive a Graduate Certificate in System Dynamics from Worcester Polytechnic Institute in May. She has been accepted into the Ph.D. Systems Science program at PSU and will start her coursework in the fall of 2010. She has been a teacher of mathematics for about 40 years, and a teacher of computer science and system dynamics modeling for the past 20 years. She wrote and directed two National Science Foundation grants: CC-STADUS (Cross-Curricular Systems Thinking and Dynamics using STELLA) (1993-1997) and CC-SUSTAIN (Cross-Curricular Systems Using STELLA, Training and Inservice) (1997-2001). She was awarded the Presidential Award for Excellence in Mathematics Teaching representing the state of Oregon in 1995, and was first place co-winner of Intel's Innovation in Teaching award in 1996. She has published five books, the first three in computer programming, published in the 1980's by Computer Science Press, and the last two within the last 10 years on the teaching of System Dynamics Modeling in mathematics, published by isee systems, inc. She has put together a website highlighting some of her students' work to help convince people that the SD approach to problem solving is a powerful learning methodology for students. The website can be accessed at: http://www.ccmodelingsystems.com/

[return to top]

 


 

 

DATE: April 2, 2010, Noon

LOCATION: Harder House, Room 104

PRESENTER: James McNames

TITLE: "Reconstructability analysis of elementary cellular automata"

ABSTRACT: Clinical trials in Parkinson's disease and other movement disorders currently depend on rating scales that require subjective visual assessment of movement impairment. Dr. McNames has been collaborating closely with the Parkinson's Center of Oregon to develop objective measures of movement impairment. These have the potential to provide more precise measures that make it easier to distinguish the effects of new therapies from the placebo response in blinded, randomized controlled clinical trials. However, this is a challenging modeling application because there is no gold standard and data is extremely expensive to collect. This talk will give a summary of this ongoing work and of some of the key lessons learned over the last two years.

BIO: James McNames received a B.S. degree in electrical engineering from California Polytechnic State University, San Luis Obispo, CA, in 1992. He received M.S. and Ph.D. degrees in electrical engineering from Stanford University, Stanford, CA, in 1995 and 1999, respectively.

He has been with the Electrical and Computer Engineering Department at Portland State University, Portland, OR since 1999, where he is currently an Associate Professor. He has published over 100 peer-reviewed journal and conference papers. His primary research interest is statistical signal processing with applications to biomedical engineering with a long-term goal of deploying technologies that use closed-loop control to improve health care.

He founded the Biomedical Signal Processing (BSP) Laboratory (bsp.pdx.edu) in fall 2000. The mission of the BSP Laboratory is to advance the art and science of extracting clinically significant information from physiologic signals. Members of the BSP Laboratory primarily focus on clinical projects in which the extracted information can help physicians or medical devices make better critical decisions and improve patient outcome.

[return to top]

 


 

 

WINTER 2010 SCHEDULE

 


DATE: March 12, 2010, Noon

LOCATION: Harder House, Room 104

PRESENTER: Martin Zwick

TITLE: "Reconstructability analysis of elementary cellular automata"*

ABSTRACT: Reconstructability analysis is a method to determine whether a multivariate relation, defined set- or information-theoretically, is decomposable with or without loss (reduction in constraint) into lower ordinality relations. Set-theoretic reconstructability analysis (SRA) is used to characterize the mappings of elementary cellular automata. The degree of lossless decomposition possible for each mapping is more effective than the λ parameter (Walker & Ashby, Langton) as a predictor of chaotic dynamics.

Complete SRA yields not only the simplest lossless structure but also a vector of losses of all decomposed structures, indexed by a parameter τ. This vector subsumes λ, Wuensche’s Z parameter, and Walker & Ashby’s “fluency” and “memory” parameters within a single framework, and is a strong but still imperfect predictor of the dynamics: less decomposable mappings more commonly produce chaos. The set-theoretic constraint losses are analogous to information distances in information-theoretic reconstructability analysis (IRA). IRA captures the same information as SRA, but allows λ, fluency, and memory to be explicitly defined.
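
For context, the λ parameter mentioned above is simple to compute for an elementary CA: with 0 taken as the quiescent state, it is the fraction of the eight neighborhood patterns whose output is 1. A minimal sketch (the rule numbers are just familiar examples):

    def eca_lambda(rule):
        """Langton's lambda for an elementary CA rule: the fraction of the 8
        neighborhood patterns whose output is 1 (taking 0 as the quiescent state)."""
        outputs = [(rule >> i) & 1 for i in range(8)]
        return sum(outputs) / 8.0

    for rule in (0, 30, 90, 110, 184, 255):
        print(f"rule {rule:3d}: lambda = {eca_lambda(rule):.3f}")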

*This talk is being given to fill in an unexpected vacant slot in our seminar series. It is a repeat of a talk that was given in the past, but not everyone has heard it, and these results are still unpublished. (But apologies to anyone taking the seminar course who has already heard it!) -MZ

BIO: Martin Zwick was awarded his Ph.D. in Biophysics at MIT in 1968, and joined the Biophysics Department faculty of the University of Chicago in 1969. Initially working in crystallography and macromolecular structure, his interests shifted to systems theory and methodology, the field now known as the study of chaos, complexity, and complex adaptive systems. Since 1976 he has been teaching and doing research in the Systems Science PhD Program at Portland State University; during the years 1984-1989 he was director of the program.

His main research areas are information theoretic modeling, machine learning, theoretical biology, game theory, and systems theory and philosophy. Scientifically, his focus is on applying systems theory and methodology to the natural and social sciences, most recently to biomedical data analysis, the evolution of cooperation, and sustainability. Philosophically, his focus is on how systems ideas relate to classical and contemporary philosophy, how they offer a bridge between science and religion, and how they can help us understand and address societal problems.

[return to top]

 

 


 

DATE: March 5, 2010

LOCATION: Harder House, Room 104

PRESENTER: Patrick Roberts

TITLE: "Computational pharmacology: simulating circuits of the brain for drug development"

ABSTRACT: The pharmaceutical industry is approaching unsustainable research costs to develop new drug therapies for mental disease because of the high failure rate in clinical trials. These failures are due to limitations of pre-clinical studies in animal models that fail to predict the efficacy of new drugs in human subjects. The gap between pre-clinical trials and clinical trials is particularly difficult in complex mental diseases such as schizophrenia because of the complex dynamics of the brain and the multiple chemical pathways that drugs can affect.

However, many biological mechanisms associated with schizophrenia are now understood, and computational power and methods have reached the point for practical modeling of pathologies of schizophrenia. Numerical models can combine the information from animal studies of brain circuitry with data from human clinical trials of drug actions. Furthermore, complex interactions of multiple receptor targets can be predicted by a biophysical model of brain function.

This presentation will introduce numerical models of neuronal microcircuitry that are associated with symptoms of schizophrenia. The emphasis will be on the dynamics of these neural systems and how their dynamics are modified by antipsychotic drugs. Unlike the current state-of-the-art methods of estimating therapeutic efficacy, the computational platform yields a significant increase in the predictive correlation with data from clinical trials.
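
As a toy taste of the kind of numerical modeling involved (a textbook single leaky integrate-and-fire neuron, with a "drug effect" crudely represented as a scaling of synaptic drive; this is not the biophysical platform described in the talk):

    def lif_spike_count(input_current, drug_scale=1.0, t_max=1.0, dt=1e-4,
                        tau=0.02, v_rest=-65e-3, v_thresh=-50e-3, v_reset=-65e-3, r_m=1e7):
        """Leaky integrate-and-fire neuron; drug_scale crudely scales synaptic drive."""
        v = v_rest
        spikes = 0
        for _ in range(int(t_max / dt)):
            i_syn = drug_scale * input_current
            dv = (-(v - v_rest) + r_m * i_syn) / tau
            v += dv * dt
            if v >= v_thresh:
                spikes += 1
                v = v_reset
        return spikes

    if __name__ == "__main__":
        for scale in (0.8, 1.0, 1.2):   # hypothetical "antagonist", baseline, "agonist"
            print(f"input scaling {scale}: {lif_spike_count(2e-9, scale)} spikes/s")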

BIO: Dr. Roberts received his BA in physics at Reed College ('83) and his PhD in elementary particle physics at Gothenburg University in Sweden. Since completing his PhD in 1993, he has focused on neuroscience research in both academia and industry, on projects ranging from electrosensory processing in electric fish to pharmaceutical research for mental illness. He is presently employed at In Silico Biosciences and holds adjunct positions in Biomedical Engineering at Oregon Health & Science University and in the Systems Science Program at Portland State University.

[return to top]

 


 

DATE: February 26, 2010

LOCATION: Harder House, Room 104

PRESENTER: Dan Hammerstrom

TITLE: "An update on biologically inspired computing: The DARPA SyNAPSE program and Hierarchical Temporal Memories"

ABSTRACT: This presentation provides an update on biologically inspired computation. In particular, it focuses on two important developments in this area, the DARPA SyNAPSE program (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) and the HTM (Hierarchical Temporal Memory) being developed by Numenta.

The SyNAPSE Program’s ultimate goal is to build a low-power, compact electronic chip combining novel analog circuit design and a neuroscience-inspired architecture that can address a wide range of cognitive abilities: perception, planning, decision making and motor control. According to DARPA program manager Todd Hylton, “Our research progress in this area is unprecedented. No suitable electronic synaptic device that can perform critical functions of a biological brain like spike-timing-dependent plasticity has ever before been demonstrated or even articulated.”
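
Spike-timing-dependent plasticity, mentioned in the quotation, has a standard pair-based textbook form that can be sketched in a few lines; the amplitudes and time constants below are typical illustrative values, not those of any SyNAPSE device.

    import math

    def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
        """Standard pair-based STDP rule: weight change as a function of
        (post-spike time - pre-spike time) in milliseconds."""
        if dt_ms > 0:    # pre before post: potentiation
            return a_plus * math.exp(-dt_ms / tau_plus)
        elif dt_ms < 0:  # post before pre: depression
            return -a_minus * math.exp(dt_ms / tau_minus)
        return 0.0

    for dt in (-40, -10, -1, 1, 10, 40):
        print(f"dt = {dt:+4d} ms -> dw = {stdp_dw(dt):+.4f}")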

The HTM algorithm is the work of Jeff Hawkins and Dileep George. Jeff (Palm Pilot inventor) founded the Redwood Neuroscience Institute, from which has emerged a synthesis of a number of existing and new ideas of cortical operation. The models have worked so well that he has now spun out a company, Numenta, Inc.

HTMs use a unique combination of the following ideas:

- A hierarchy in space and time to share and transfer learning;
- Slowness of time, which, combined with the hierarchy, enables efficient learning of intermediate levels of the hierarchy;
- Learning of causes by using time continuity and actions;
- Models of attention and specific memories;
- A probabilistic model specified in terms of relations between a hierarchy of causes; and
- Belief Propagation in the hierarchy to use temporal and spatial context for inference.
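
As a point of reference for that last item, the following toy Python sketch, which is not Numenta's implementation, shows the upward pass of belief propagation in a two-level hierarchy: each child node converts its local evidence into a likelihood message over the parent's "causes," and the parent combines the messages with its prior. All node counts, probability tables, and observations below are illustrative assumptions.

    # Toy illustration (not Numenta's code) of the upward pass of belief
    # propagation in a two-level hierarchy of causes.
    import numpy as np

    n_causes = 3
    prior = np.full(n_causes, 1.0 / n_causes)             # parent's prior over causes

    # P(child observation | parent cause), one row per cause, chosen arbitrarily
    child_models = [
        np.array([[0.8, 0.2], [0.1, 0.9], [0.5, 0.5]]),   # child 1: 2 possible symbols
        np.array([[0.7, 0.3], [0.3, 0.7], [0.1, 0.9]]),   # child 2
    ]
    observations = [0, 1]          # child 1 saw symbol 0, child 2 saw symbol 1

    belief = prior.copy()
    for model, obs in zip(child_models, observations):
        belief *= model[:, obs]    # upward likelihood message from this child
    belief /= belief.sum()         # normalize to a posterior over the causes

    print("posterior over parent causes:", np.round(belief, 3))

A full HTM also learns the probability tables from data and propagates beliefs downward to supply temporal and spatial context; the sketch shows only the basic message-combining step.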

BIO: Dan Hammerstrom received the BS degree from Montana State University, the MS degree from Stanford University, and the PhD degree from the University of Illinois. He was an Assistant Professor in the Electrical Engineering Department at Cornell University from 1977 to 1980.

In 1980 he joined Intel in Oregon, where he participated in the development and implementation of the iAPX-432, the i960, and iWarp. In 1988 he founded Adaptive Solutions, Inc., which specialized in high performance silicon technology (the CNAPS chip set) for image processing and pattern recognition. He is now a Professor in the Electrical and Computer Engineering Department and Associate Dean for Research in the Maseeh College of Engineering and Computer Science at Portland State University.

Prof. Hammerstrom holds joint appointments in the IDE (Information, Computation, and Electronics) Department at Halmstad University in Halmstad, Sweden, and in the Biomedical Engineering Department at Oregon Health & Science University.

[return to top]

 


 

DATE: February 19, 2010

LOCATION: Harder House, Room 104

PRESENTER: Miguel Figliozzi

TITLE: "Vehicle routing problems in congested urban areas"

ABSTRACT: This talk will discuss vehicle routing problems in congested urban areas. Modeling approaches, data collection issues, and solution algorithms to solve real-world problems will be described and analyzed.
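
As a concrete, if simplified, illustration of the problem class rather than of the presenter's algorithms, the following Python sketch builds a single-vehicle route with a nearest-neighbor heuristic whose travel times vary with the time of day to mimic urban congestion. The congestion function, coordinates, and start time are all illustrative assumptions.

    # Illustrative sketch only: nearest-neighbor routing with time-dependent
    # travel times standing in for congestion.
    import math

    def travel_time(a, b, depart):
        """Euclidean distance scaled by a congestion factor that peaks mid-day."""
        dist = math.dist(a, b)
        congestion = 1.0 + 0.5 * math.sin(math.pi * (depart % 24) / 24)  # 1.0 .. 1.5
        return dist * congestion

    def route(depot, customers, start_time=8.0):
        """Greedily visit the customer with the smallest current travel time."""
        now, here = start_time, depot
        remaining, order = list(customers), []
        while remaining:
            nxt = min(remaining, key=lambda c: travel_time(here, c, now))
            now += travel_time(here, nxt, now)
            here = nxt
            order.append(nxt)
            remaining.remove(nxt)
        return order, now

    order, finish = route(depot=(0, 0), customers=[(2, 3), (5, 1), (1, 6), (4, 4)])
    print("visit order:", order, "finish time:", round(finish, 2))

Real-world formulations add capacities, time windows, and stochastic travel times, which is where the modeling and algorithmic challenges discussed in the talk arise.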

BIO: Dr. Miguel Andres Figliozzi's main research areas are transportation and logistics systems modeling and optimization. He joined Portland State University in August 2007. His previous academic appointment was at the University of Sydney School of Business. Dr. Figliozzi works in partnership with local, regional, state, and federal transportation agencies, and he is a member of the Transportation Research Board Network Modeling Committee and Transportation Research Board Network Intermodal Terminal Design Committee.

[return to top]

 


 

DATE: February 12, 2010

LOCATION: Harder House, Room 104

PRESENTER: Lars Holmstrom

TITLE: "Efficient encoding of vocalizations in the auditory midbrain"

ABSTRACT: An important question in sensory neuroscience is what coding strategies and mechanisms are used by the brain to detect and discriminate among behaviorally relevant stimuli. To address the noisy response properties of individual neurons, sensory systems often utilize broadly tuned neurons with overlapping receptive fields at the system's periphery, resulting in homogeneous responses among neighboring populations of neurons. It has been hypothesized that progressive response heterogeneity in ascending sensory pathways is evidence of an efficient encoding strategy that minimizes the redundancy of the peripheral neural code and maximizes information throughput for higher level processing. This hypothesis has been partly supported by the documentation of neural heterogeneity in various cortical structures.

This dissertation will examine whether selective and sensitive responses to behaviorally relevant stimuli contribute to a heterogeneous and efficient encoding in the auditory midbrain. Prior to this study, no compelling experimental framework existed to address this question. Stimulus design methodologies for neuroethological experiments were largely based on token vocalizations or simple approximations of vocalization components. This dissertation describes a novel state-space signal modeling methodology which makes possible the independent manipulation of the frequency, amplitude, duration, and harmonic structure of vocalization stimuli. This methodology was used to analyze four mouse vocalizations and create a suite of perturbed variants of each of these vocalizations. Responses of neurons in the mouse inferior colliculus (IC) to the natural vocalizations and their perturbations were characterized using measures of both spike rate and spike timing. In order to compare these responses to those of peripheral auditory neurons, a data-driven model was developed and fit to each IC neuron based on the neuron's pure tone responses. These models were then used to approximate how peripheral auditory neurons would respond to our suite of vocalization stimuli. Using information theoretic measures, this dissertation argues that selectivity and sensitivity by individual neurons results in heterogeneous population responses in the IC and contributes to the efficient encoding of behaviorally relevant vocalizations.
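
The following Python sketch, using synthetic spike counts rather than the dissertation's data, illustrates the kind of information-theoretic measure involved: a plug-in estimate of the mutual information between stimulus identity and a neuron's spike count. The number of stimuli, firing rates, and trial counts are illustrative assumptions.

    # Schematic example (not the dissertation's pipeline): plug-in mutual
    # information between stimulus identity and simulated spike counts.
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials = 2000
    stimuli = rng.integers(0, 4, size=n_trials)        # 4 vocalization variants
    rates = np.array([2.0, 5.0, 5.5, 9.0])             # mean spike count per stimulus
    counts = rng.poisson(rates[stimuli])                # simulated spike counts

    def mutual_information(x, y):
        """Plug-in MI (in bits) from the empirical joint distribution of x and y."""
        joint = np.zeros((x.max() + 1, y.max() + 1))
        np.add.at(joint, (x, y), 1)
        joint /= joint.sum()
        px = joint.sum(axis=1, keepdims=True)
        py = joint.sum(axis=0, keepdims=True)
        nz = joint > 0
        return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

    print(f"I(stimulus; spike count) ~ {mutual_information(stimuli, counts):.2f} bits")

The dissertation's analyses also use spike timing and population-level comparisons, but the same basic quantity, information carried about the stimulus, underlies the argument for efficient encoding.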

BIO: Lars Holmstrom is a Ph.D. candidate in the Systems Science Graduate Program at Portland State University. His research has primarily been focused on reinforcement learning in artificial neural networks and sensory processing in real neural networks. He is currently working as a software architect for a local biomedical device company and as an environmental consultant responsible for model based estimates of avian mortality risk resulting from wind farm installations.

[return to top]

 


 

DATE: February 5, 2010

LOCATION: Harder House, Room 104

PRESENTER: Will Landecker

TOPIC: "Understanding classification decisions for object detection"

ABSTRACT: Computer vision systems are traditionally tested in the object detection paradigm. In these experiments, a vision system is asked whether or not a specific object--for example an animal--occurs in a given image. A system that often answers correctly is said to be very accurate. In this talk, we will discuss some ambiguity that exists in this measure of accuracy. We will also propose a new measure of object-detection accuracy that addresses some of this ambiguity, and apply this measure to the hierarchical "standard model" of visual cortex.
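
As one generic illustration of how a single accuracy number can be ambiguous, and not necessarily the specific ambiguity examined in the talk, the following Python sketch compares two hypothetical detectors with identical overall accuracy but very different behavior on the images that actually contain the object. All labels and predictions below are invented for the example.

    # Generic illustration: identical accuracy, very different detectors.
    import numpy as np

    labels = np.array([1] * 20 + [0] * 80)               # 20 images contain the animal
    det_a = np.array([1] * 20 + [1] * 10 + [0] * 70)     # finds every animal, 10 false alarms
    det_b = np.array([1] * 10 + [0] * 10 + [0] * 80)     # misses half the animals, no false alarms

    for name, pred in [("A", det_a), ("B", det_b)]:
        acc = (pred == labels).mean()
        recall = pred[labels == 1].mean()                # hit rate on object-present images
        fpr = pred[labels == 0].mean()                   # false-alarm rate
        print(f"detector {name}: accuracy {acc:.2f}, recall {recall:.2f}, false alarms {fpr:.2f}")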

BIO: Will Landecker obtained his B.A. in mathematics from Reed College, and is currently a PhD student in the PSU Computer Science program and a graduate research assistant at Los Alamos National Laboratory. He is conducting his research as a member of Melanie Mitchell's machine vision group. His research focuses on understanding the decisions of machine learning classifiers, particularly as they apply to computer vision systems. This work combines computer vision, theoretical machine learning, and data visualization. Other research interests include music informatics and computational neuroscience.

[return to top]

 


 

DATE: January 29, 2010

LOCATION: Harder House, Room 104

PRESENTER: Barry Oken

TITLE: "Human central system electrophysiology: introduction and signal processing"

ABSTRACT: This seminar will provide a broad overview of analysis methods for brain electrophysiology, covering electroencephalography (EEG) and evoked potentials, with specific examples from both the research and clinical arenas. The overview of EEG will include what it measures, frequency analysis, and independent component analysis; the overview of evoked potentials will introduce what they are and the different types of stimuli, including conventional sensory as well as cognitive, intraoperative monitoring, and transcranial stimulation.
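
For orientation, the following Python sketch, using a synthetic signal rather than clinical EEG, shows the basic frequency-analysis step mentioned above: estimating a power spectrum and summing power in the classical alpha band (8-12 Hz). The sampling rate, recording length, and noise level are illustrative assumptions.

    # Minimal sketch with synthetic data: power spectrum and alpha-band power.
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 250.0                                   # sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)                 # 10 s of data
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)  # 10 Hz rhythm + noise

    spectrum = np.abs(np.fft.rfft(eeg)) ** 2 / t.size
    freqs = np.fft.rfftfreq(t.size, d=1 / fs)

    alpha = (freqs >= 8) & (freqs <= 12)
    print(f"alpha-band share of total power: {spectrum[alpha].sum() / spectrum.sum():.2f}")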

BIO: Barry Oken received a BA degree in math from the University of Rochester in 1974 and an M.D. degree from the Medical College of Wisconsin in 1978. He was a resident in Neurology from 1980 to 1983 at Boston University Medical Center and a Fellow in Electroencephalography and Evoked Potentials at Massachusetts General Hospital from 1983 to 1985. Since 1985, he has been a member of the faculty at Oregon Health & Science University (OHSU), where he is currently Professor in the Departments of Neurology and of Behavioral Neuroscience and medical director of the Clinical Neurophysiology Department. His primary research interests are in cognitive neuroscience, with a focus on age-related changes, using brain physiology as one of the assessment tools, and on interventions that may modify those changes. He has published 150 papers, abstracts, and chapters in a range of fields related to his interests and has had continuous NIH funding for his research for the past 20 years.

[return to top]

 


 

DATE: January 22, 2010

LOCATION: Harder House, Room 104

PRESENTER: Grant Kirby

TITLE: "Reviewing the role of systems analysis in data networks and the possible role for system theories going forward"

ABSTRACT: Data networks have been an important catalyst for business growth in the US. As networks grew more complex in the 1970s and 1980s, it became necessary to apply systems analysis methodologies to their design. Over the last three decades data networks have become increasingly complex, and organizations and governments have become increasingly dependent on their services. The new applications and services being ushered in this next decade may well render current methodologies ineffective. The goal of this talk is to begin a dialogue about how systems theories might make significant contributions to the design of next-generation information systems.

BIO: Grant Kirby is a PhD candidate in the Systems Science program at PSU. He received his MBA from the University of Oregon and his undergraduate degree in computer science from the Oregon Institute of Technology (OIT). Grant is currently a Program Director at OIT, overseeing the Information Technology and Operations Management programs. Before coming to OIT in 2003, he spent twelve years at Intel Corp. as a technical marketing and product marketing engineer.

[return to top]

 


 

DATE: January 15, 2010

LOCATION: Harder House, Room 104

TIME: 12 noon - 1 pm

PRESENTER: Tamara Hayes

TITLE: "Aging-in-place research at ORCATECH: Making sense of the data"

ABSTRACT: The Oregon Center for Aging and Technology (ORCATECH) seeks to facilitate successful aging and reduce the cost of healthcare by establishing the evidence base for technologies supporting aging-in-place research and care. This is done through pilot studies evaluating the role of the technologies, as well as large longitudinal studies in which sensors are placed in the homes of community-dwelling elders to monitor daily patterns of activity, walking speeds, medication adherence, and other behaviors. These sensors collect continuous data that reflect normal variability in behaviors as well as trends that may indicate problematic changes in cognition or mobility. Because data are collected continuously, trends can be identified long before they would become apparent during a typical clinic visit. However, the use of low-cost sensors means that the data are inherently noisy, and extraction of meaningful behavioral data is not trivial. In this talk I will give an overview of the challenges inherent in this approach, and will describe some of the analyses that are proving fruitful.
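
As a simplified illustration of the kind of analysis involved, using synthetic data rather than ORCATECH recordings, the following Python sketch smooths noisy daily walking-speed estimates with a rolling mean and fits a linear trend to flag a gradual decline that an occasional clinic visit might miss. The decline rate, noise level, and alert threshold are illustrative assumptions.

    # Simplified illustration with synthetic data: smooth noisy daily walking
    # speeds and fit a linear trend to detect gradual decline.
    import numpy as np

    rng = np.random.default_rng(1)
    days = np.arange(365)
    true_speed = 1.0 - 0.0005 * days                          # slow decline (m/s) over a year
    measured = true_speed + rng.normal(0, 0.15, days.size)    # noisy in-home sensor estimates

    window = 30
    smoothed = np.convolve(measured, np.ones(window) / window, mode="valid")

    slope, intercept = np.polyfit(days[: smoothed.size], smoothed, 1)
    print(f"estimated trend: {slope * 365:+.3f} m/s per year")
    if slope * 365 < -0.1:
        print("flag: walking speed appears to be declining")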

BIO: Dr. Hayes received her MS in Electrical Engineering from the University of Toronto and her PhD in Behavioral Neuroscience from the University of Pittsburgh. She has worked in both industry and academia, on projects ranging from creating tools for remote management of distributed systems to delivering teledermatology care to rural Oregon. Dr. Hayes' current research interests include the use of technology to deliver health care in the home, with the goal of changing the current paradigm of clinic-centered healthcare to a model that is less costly, more effective, and allows individuals to participate more fully in their own health care. This research entails the use of low-cost unobtrusive sensors in the home for collecting behavioral data related to acute and chronic motor and cognitive changes, and meaningful analysis of these data to assist and inform the patient. Dr. Hayes is Assistant Professor and Associate Department Head in the Division of Biomedical Engineering at Oregon Health and Science University's School of Medicine.

[return to top]

 


 

DATE: January 8, 2010

LOCATION: Harder House, Room 104

TIME: 12 noon - 1 pm

FACILITATOR: Joshua Hughes

TOPIC: "Criticisms of systems science"

A new year often begins with a sense of optimism, but we (ever the contrarians) will begin it with a healthy dose of pessimism. This week's seminar will be a discussion about criticisms of systems science. As Winston Churchill said, "Criticism may not be agreeable, but it is necessary. It fulfills the same function as pain in the human body. It calls attention to an unhealthy state of things." Is the systems project in an unhealthy state? Since its emergence in the 1940s and 1950s, a number of people have believed that to be the case, and a few have issued strong--and long--critiques of the systems view. A few of the most notable have come from R. C. Buck (1956), Ida Hoos (1972), and Robert Lilienfeld (1978). As George Klir notes--his 2001 book Facets of System Science will provide a good deal of the material for our discussion--some of this criticism was ill-conceived and easily refuted; but some was indeed justified, and addressing the "unhealthy" aspects of systems science changed it for the better. No doubt it behooves the systems thinker to be familiar with these criticisms both justified and unjustified: knowing the "justified" criticisms will (hopefully) prevent us from repeating the mistakes of the past and provide us with a deeper understanding of the systems project's development; knowing the "unjustified" criticisms can provide us with an understanding of how the systems field is perceived by those outside it and (perhaps) motivate us to improve the way we communicate our ideas.

To give everyone a head start, here are some of the criticisms from Buck, Hoos, and Lilienfeld (Buck's criticisms are paraphrased by me, and those of Hoos and Lilienfeld by Lars Skyttner (2005)):

R. C. Buck

  • If every system has subsystems and every system has its environment, one can't think of anything or any combination of things that isn't a system; if the concept of "system" can apply to everything, it is logically empty.
  • The fact that the spread of neural impulses, the spread of rumours, and the spread of epidemics can all be described by similar mathematical models is sheer coincidence. "So what?" if these different systems are seen as analogous.
  • If Joan's heart is the system, and Joan is the environment, isn't Joan's heart--being a part of her--also the environment? So which is it?


Ida Hoos

  • The so-called isomorphisms are nothing but tired truisms about the universality of mathematics, i.e. 2 + 2 = 4 prevails whether we consider soap, chickens, or missiles.
  • Superficial analogies may camouflage crucial differences and lead to erroneous conclusions.


Robert Lilienfeld

  • Systems theory is the latest attempt to create a universal myth based on the prestige of science.
  • Systems thinkers have a special weakness for definitions, conceptualizations, and programmatic statements, all of a vaguely benevolent moralizing nature, without concrete or even scientific substance.
  • In the eyes of the "universality" of systems theory all things are systems by virtue of ignoring the specific, the concrete, and the substantive.
  • Systems theory is a theory with applications which have never been really tested.
  • As a theory, systems philosophy is a mixture of speculation and empirical data, neither of them satisfactory. It is an attempt to stretch a set of concepts into metaphysics that extends beyond and above all substantive areas.
  • Systems theory is not a genuine philosophy and is not a science; it is an ideology and must be considered as such.

In addition to these criticisms, I will also present a number of other criticisms of the systems field (here assumed to encompass the "complexity" sciences as well), including some more recent ones from people within the systems field itself (and perhaps a Nobel-prize-winning economist or two).

BIO: Joshua Hughes is a second year, core-option PhD student and graduate assistant in the PSU Systems Science Graduate Program. He is working on research with George Lendaris on contextual reinforcement learning and experience-based identification and control; he is also collaborating with Martin Zwick on a few papers that show how systems theories might provide insights into some contemporary problems. He is interested in information theory, cybernetics, reconstructability analysis, neural networks, fuzzy logic, catastrophe theory, game theory, and many other things.

[return to top]