
The Terrible Tao of Chaotic Career Moves


Myron Cope and his Terrible Towel: Pittsburgh broadcaster or Chaos Theorist?

With a field of study as rich in language and imagery as chaos and fractals, it is inevitable that whole bodies of research will develop that find the theory and results of chaos & fractals applicable in totally improbable situations. It used to be that quantum physics was the leader in this phenomenon, with the Tao of Physics by Fritjof Capra the ur-text that promised a much more balanced outlook on life informed by wave/particle duality. (And I will note that I still have my copy.) Given the history of this text, I need to introduce a new category of post, which I openly steal from my Pittsburgh friends and readers - The Terrible Tao. The T-Tao designation is given to applications of chaos and fractals - and I might as well throw in complexity - to the most unlikely social situations.

My goal here is not to criticize these efforts, because they represent attempts to find models for social behavior that are grounded in a well-established field - chaos and fractals - that just happens to yield a range of behaviors that are remarkably similar to human and institutional behavior. Actually, with many of the articles appearing in journals well outside of the natural sciences, the writing often contains a self-contained expository section on nonlinear dynamics because a general knowledge of chaos and fractal theory on the part of the journal’s audience cannot be assumed. So I am glad that the ideas of chaos and fractals reach a larger audience.

With that said, I often find that the modeling is more a use of chaos and fractals as metaphor - a way to describe human situations with exotic terms such as bifurcation or homoclinic tangle. As a result, I rarely see any predictive value in the modeling, which leaves me no farther along in understanding the situation being modeled.

I try not to pass judgement, though - if I can help it.

Without further ado, then, my first T-Tao goes to Complexity, chaos, and nonlinear dynamics: a new perspective on career development theory, in which Deborah P. Bloch posits…

a theory of career development drawing on nonlinear dynamics and chaos and complexity theories. Career is presented as a complex adaptive entity, a fractal of the human entity. Characteristics of complex adaptive entities, including (a) autopoiesis, or self-regeneration; (b) open exchange; (c) participation in networks; (d) fractals; (e) phase transitions between order and chaos; (f) search for fitness peaks; (g) nonlinear dynamics; (h) sensitive dependence; (i) attractors that limit growth; (j) the role of strange attractors in emergence; and (k) spirituality, are described and then applied to careers.

Read the actual paper - there are excellent discussions of the items listed in this abstract.

And feel free to comment on the applicability of the suggested model…or to nominate future T-Tao awardees.

Categories Chaos Fractals Modeling Understanding & Prediction

Woodstock and Superconductivity


Physicist lowers body temperature to achieve a superconducting state.
I came upon a child of god
He was walking along the road
And I asked him, where are you going
And this he told me
I’m going on down to Yasgur’s farm
- Joni Mitchell

I never made it to Yasgur’s farm - I was just too young in 1969 to head to New York to catch all of the acts that I loved. Later that summer I did make it to the Atlantic City Pop Festival, which was a much-sanitized version. No rain, no mud slides, no babies born…and there was certainly no one making movies or declaiming about the AC Pop Festival Generation.

Many years later I finally made amends for my serious cultural lapse in ‘69 and attended the next Woodstock - the Woodstock of Physics. This 1987 event was a wild affair, as the news of high-temperature superconductivity was just breaking, along with promises of a Jetsons-like future soon to be commonplace. There were thousands at the session in NYC, with most (including this author) watching the presentations via monitors in the corridors, straining to hear every word, to make out every blurred overhead with a hastily sketched graph, waiting to hear what would be the latest (and highest) superconducting temperature…

We are stardust
We are golden
And we've got to get ourselves
Back to the garden

A surprising fact about HTSC is that discoverers Karl Müller and Johannes Bednorz received the Nobel Prize that same year - an amazingly short time for the Nobel committee to name an award winner. (They had first noticed the effect only a year earlier.) Typically it is many years between discovery and award, with other experimenters and theorists demonstrating that the original discovery was both true and truly significant for physics. More amazing, there was no consensus on why HTSC occurs.

And theory has been pretty underwhelming since then.

Now, as the 20th anniversary of the HTSC Woodstock is celebrated, there may be a breakthrough. In a May 30th press release by the CNRS (Centre National de la Recherche Scientifique), an arm of France’s Ministry of Research, there is new experimental evidence that may lead to a true understanding of the HTSC phenomenon. Note that the connection between understanding and modeling is explicitly stated in this passage:

Since the end of the 1980s (Nobel Prize in 1987), researchers have managed to obtain 'high temperature' superconducting materials: some of these compounds can be made superconducting simply by using liquid nitrogen (77 K, or -196 °C). The record critical temperature (the phase transition temperature below which superconductivity occurs) is today 138 K (-135 °C). This new class of superconductors, which are easier and cheaper to use, has given fresh impetus to the race to find ever higher critical temperatures, with the ultimate goal of obtaining materials which are superconducting at room temperature.

However, until now, researchers have been held back by some fundamental questions. What causes superconductivity at microscopic scales? How do electrons behave in such materials?

Researchers at the National Laboratory for Pulsed Magnetic Fields, working together with researchers at Sherbrooke, have observed 'quantum oscillations', thanks to their experience in working with intense magnetic fields. They subjected their samples to a magnetic field of as much as 62 teslas (a million times stronger than the Earth's magnetic field), at very low temperatures (between 1.5 K and 4.2 K). The magnetic field destroys the superconducting state, and the sample, now in a normal state, shows an oscillation of its electrical resistance as a function of the magnetic field. Such an oscillation is characteristic of metals: it means that, in the samples that were studied, the electrons behaved in the same way as in ordinary metals.

The researchers will be able to use this discovery, which has been eagerly awaited for 20 years, to improve their understanding of critical high-temperature superconductivity, which until now had resisted all attempts at modeling it. The discovery has been effective in sorting out the many theories which had emerged to explain the phenomenon, and provides a firm foundation on which to build a new theory. It will make it possible to design more efficient materials, with critical temperatures closer to room temperature.

The experiment described here employs a fascinating, Zen-like approach to investigating a phenomenon. High magnetic fields are used to TURN OFF the superconducting state. In effect, the behavior of electrons when the material is not superconducting holds the key to what they do when the material is superconducting.
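
For reference - my addition, not something stated in the press release - the standard textbook result behind such quantum oscillations is that the resistance oscillates periodically in the inverse magnetic field, with a frequency fixed by an extremal cross-section of the Fermi surface (the Onsager relation):

    F = \frac{\hbar}{2\pi e} \, A_F

Here F is the oscillation frequency (in teslas) and A_F is the extremal cross-sectional area of the Fermi surface in momentum space. Seeing the oscillations at all is thus direct evidence that the electrons occupy a well-defined, metal-like Fermi surface.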

Absence makes the heart grow less conducting? What you don’t see is what you "get"? Let Joni make the call…

Then can I walk beside you
I have come here to lose the smog
And I feel to be a cog in something turning
Well maybe it is just the time of year
Or maybe it's the time of man
I don't know who I am
But you know life is for learning

I should note that there have been other "breakthroughs" in modeling HTSC. For example, in 2005, researchers at the University of Aberdeen announced work that showed the deep connection between lattice structure and superconductivity. Specifically, they found that the HTSC material had a negative thermal expansion - a rare occurrence.

Categories Physics Understanding & Prediction

Chaos and Compatibilism

The FATE city limits sign

Chaos Theory is often cited as a way out of the determinism-free will paradox. The explanation usually goes like this: OK, everything is determined by physical law, but the laws themselves are non-linear, and the output of a particular equation that codifies a law often leads to a pseudo-random process and/or sensitive dependence on initial conditions. Ergo, while we are governed by strict physical laws, our behavior - and by implication our decisions - does not give the appearance of being constrained in any way.

The philosophical position that free will can co-exist with determinism pre-dates chaos theory, with Hume and Hobbes among its early proponents. Known as compatibilism, it stands on its own without chaos theory. Much of the compatibilist argument rests on a careful definition of what it means to act freely; on that definition, speaking of our actions as forced on us by the laws of nature becomes a so-called category error.

The compatibilist definition of free will states that free will is not the ability to choose as an agent independent of prior cause, but as an agent who is not forced to make a certain choice. Determinists argue that all acts that take place are predetermined by prior causes. Because human decision is an act that is not exempt from prior cause, by this definition, some determinists known as hard determinists believe that free will thus becomes an illusion. A compatibilist, or soft determinist, in contrast, will define a free act in a way that does not hinge on causal necessitation. For them, an act is free unless it involves compulsion by another person. Since the physical universe and the laws of nature are not persons, they argue that it is a category error to speak of our actions being forced on us by the laws of nature, and therefore it is wrong to conclude that universal determinism would mean we are never free.

Chaos theory does not really add to this argument - but it does provide an example of a totally deterministic system whose output may be unpredictable. The connection of this system behavior to free will is tenuous at best.
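
To make "deterministic yet unpredictable" concrete, here is a minimal sketch - my own illustration, not anything drawn from the compatibilist literature - of the logistic map, a completely deterministic rule whose iterates nevertheless wander apart from two nearly identical starting points:

    # Logistic map: x_{n+1} = r * x_n * (1 - x_n) - fully deterministic.
    # In the chaotic regime (r = 4), two nearly identical initial conditions
    # diverge until the trajectories are effectively uncorrelated.
    r = 4.0
    x, y = 0.2, 0.2 + 1e-10   # starting points differing by one part in ten billion

    for n in range(60):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        if n % 10 == 0:
            print(f"step {n:2d}: x = {x:.6f}  y = {y:.6f}  |x - y| = {abs(x - y):.2e}")

Every step is dictated by the same simple rule, yet after a few dozen iterations the two runs bear no resemblance to each other.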

A better way to view the relationship between a deterministic system - i.e. us - and free will is provided by John Timpane, an editor at the Philadelphia Inquirer, in his March 2007 article The Future of Free Will:

Sure, we're machines and subject to all the laws that govern everything. No one gets a pass from those. The human mind has a structure and a set of rules - but that produces rather than constrains its ability to make an incredibly large number of novel and creative uses of those rules. Indeed, the more we know about the mind, the more we are stunned by its ability to resist instinct and put the rules to new uses.

For an excellent summary of current trends in the free-will/determinism debate, as well as experimental neurological studies that tend to bolster the compatibilist view, see Dennis Overbye’s Jan 2007 NYTimes piece Free Will: Now You Have It, Now You Don’t. In a telling quote, Overbye points out that Einstein had no problem with a deterministic system leading to only the appearance of free will: “This knowledge of the non-freedom of the will protects me from losing my good humor and taking much too seriously myself and my fellow humans as acting and judging individuals.”

For a more skeptical viewpoint, check out Daniel Bader’s Lyceum blog, where he writes on introspection and psychological determinism. (The FATE city limits sign is from this site.)

Categories Chaos Determinism Philosophy

Mathematics Reveals the Artistry


Brueghel’s Fall of Icarus

Daniel Rockmore writes about the mathematical analysis of art in the June 2006 Chronicle (The Style of Numbers Behind a Number of Styles). In the essay Rockmore describes Richard Taylor’s work in analyzing Jackson Pollock pieces that may be forgeries. (See my post on this topic.)

The Pollock intro is a lead-in to a description of stylometry - the mathematical/scientific analysis of literary texts that attempts to address issues of authorship. (See Bookish Math, an excellent intro to stylometry by Erica Klarreich for Science News Online.) Rockmore then describes a method he developed with co-workers Siwei Lyu and Hany Farid that uses wavelet analysis to determine unique "signatures" of different artists - in effect a stylometry for visual images.

The actual mathematics of the wavelet approach can be found in A Digital Technique for Art Authentication. Here the authors use works by Pieter Bruegel and Perugino to test their model. They expect their "techniques, in collaboration with existing physical authentication, to play an important role in the field of art forensics."
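
The paper’s actual procedure builds statistical models of wavelet coefficients across scales and orientations; the snippet below is only a rough caricature of that idea, written to show the flavor of a visual "signature." The library choice (PyWavelets), the db4 wavelet, and the simple mean/variance statistics are my assumptions for illustration, not the authors’ method.

    import numpy as np
    import pywt

    def wavelet_signature(image, wavelet="db4", levels=3):
        """Crude stylistic 'signature': summary statistics of wavelet detail
        coefficients at several scales and orientations."""
        coeffs = pywt.wavedec2(image, wavelet, level=levels)
        stats = []
        for detail_level in coeffs[1:]:        # skip the coarse approximation
            for band in detail_level:          # horizontal, vertical, diagonal details
                stats.extend([band.mean(), band.std(), np.abs(band).mean()])
        return np.array(stats)

    # Toy usage: two random arrays standing in for scanned drawings
    img_a = np.random.rand(256, 256)
    img_b = np.random.rand(256, 256)
    sig_a, sig_b = wavelet_signature(img_a), wavelet_signature(img_b)
    print("distance between signatures:", np.linalg.norm(sig_a - sig_b))

In the real analysis the signatures come from many small patches of high-resolution scans, and, roughly speaking, authentication hinges on how tightly an artist’s own patches cluster compared with those of imitators.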

The wavelet technique is different from Taylor’s fractal analysis of Pollock’s works, but both are examples of stylometry applied to visual information. Both Taylor and Rockmore are attempting to quantify art, an activity that Rockmore admits strikes some as unsettling, even impossible. According to Taylor, this quantification should be expected: "Both mathematics and art are all about pattern…it would be unusual that you would not apply mathematical analysis to the question."

Rockmore is more explicit about what mathematical categorization of art analysis does not do: "Fractal analysis doesn’t diminish Pollock’s athleticism and movement, nature and turbulence, chaos and beauty; it reveals and amplifies it."

For more on this topic, see Can Mathematical Tools Illuminate Artistic Style?, by Sara Robinson for SIAM.

Categories Art Fractals Mathematics

Visualizing The Core of the Blogosphere


Interactive map of the blogosphere. From Hurst.

There’s been a recent flurry of articles concerning visualization of the web itself, and what such visualization might say about the social networks that live and breathe because of the abilities of the net. This topic is a necessary follow-up, then, to my previous posts on new visualization techniques in web searching.

A conference at UPenn in June 2006 titled The Hyperlinked Society focused on "the effects of digital links on people’s ability to understand and care about their larger society." The following blurb is from the intro page; the program was quite ambitious:

Most internet users know hyperlinks as highlighted words on a web page that take them to certain other sites. But hyperlinks today are quite complex forms of instant connection—for example, tags, API mashups, and RSS feeds. Moreover, media convergence has led to increased instant linking among desktop computers, cell phones, PDAs, MP3 players, digital video recorders, and even billboards. Through these activities and far more, “links” are becoming the basic forces that relate creative works to one another. Links nominate what ideas and actors have the right to be heard and with what priority. Various stakeholders in society recognize the political and economic value of these connections. Governments, corporations, non-profits and individual media users often work to digitally privilege certain ideas over others. Do links encourage people to see beyond their personal situations and know the broad world in diverse ways? Or, instead, do links encourage people to drill into their own territories and not learn about social concerns that seem irrelevant to their personal interests? What roles do economic and political considerations play in creating links that nudge people in one or the other direction?

One of the participants at the conference was Matthew Hurst, director of science and innovation for Nielsen BuzzMetrics, a company that analyzes Internet trends for businesses. He has created a series of maps of links among the most popular blogs, producing a unique view of "the core of the blogosphere." (See the article in the April 2007 Discover for more about web mapping.)

The maps help visualize the links among blogs, using size, color, and proximity to denote high linkage. Hurst’s techniques allow for a different type of analysis of the blogosphere, finding, e.g., that technology and socio-political commentary blogs have the most links (no surprise there).
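
Hurst’s own pipeline isn’t spelled out in the article, but the basic bookkeeping behind a map like this - a directed graph of blogs, with heavily linked blogs drawn larger and pulled toward the center - can be sketched in a few lines. The tiny link list below is invented purely for illustration:

    import networkx as nx

    # Hypothetical blog-to-blog links (who links to whom)
    links = [
        ("techblog", "gadgetblog"), ("politicsblog", "newsblog"),
        ("gadgetblog", "techblog"), ("newsblog", "politicsblog"),
        ("smallblog", "techblog"), ("smallblog", "politicsblog"),
    ]

    G = nx.DiGraph(links)

    # In-degree as a stand-in for "how heavily linked" a blog is;
    # a real map would drive node size and color from a count like this.
    for blog, indeg in sorted(G.in_degree(), key=lambda pair: -pair[1]):
        print(f"{blog:14s} linked by {indeg} blog(s)")

    # Proximity in maps of this kind typically comes from a force-directed layout:
    pos = nx.spring_layout(G, seed=42)   # node -> (x, y) coordinates

The interesting (and fractal-looking) structure only emerges when the graph has many thousands of nodes, which is exactly why the full visualizations are so striking.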

Hurst is one among many who are using different techniques to produce their maps. Most maps have a fractal quality to them because of the multitude of back-and-forth links. Hurst also maintains a blog devoted to mapping and visualization: see Data Mining: Text Mining, Visualization and Social Media.

To see the interactive map at the top of this post in all its linked glory, click here.

Categories Fractals Maps Visualization

Turbulence in Space


Trying to model and understand turbulence is one of the main thrusts of chaos theory. So it may be a good thing or a bad one, depending on where you are chaotically, that more turbulence has been found - this time in deep space.

As reported in APS Physics News for 2006:

If you think chaos is complicated in the case of simple objects (such as our inability to predict the long-term velocities and positions of planets owing to their nonlinear interactions with the sun and other planets) it's far worse for systems with essentially an infinite number of degrees of freedom such as fluids or plasmas under the stress of nonlinear forces. Then the word turbulence is fully justified. Turbulence can be studied on Earth easily by mapping such things as the density or velocity of fluids in a tank. In space, however, where we expect turbulence to occur in such settings as solar wind, interstellar space, and the accretion disks around black holes, it's not so easy to measure fluids in time and space. Now, a suite of four plasma-watching satellites, referred to as Cluster, has provided the first definitive study of turbulence in space. The fluid in question is the wind of particles streaming toward the Earth from the sun, while the location in question is the region just upstream of Earth's bow shock, the place where the solar wind gets disturbed and passes by the Earth's magnetosphere. The waves in the shock-upstream plasma, pushed around by complex magnetic fields, are observed to behave a lot like fluid turbulence on Earth. One of the Cluster researchers, Yasuhito Narita (y.narita@tu-bs.de) of the Institute of Geophysics and Extraterrestrial Physics in Braunschweig, Germany, says that the data is primarily in accord with the leading theory of fluid turbulence, the so called Kolmogorov's model of turbulence. (Narita et al., Physical Review Letters, 10 November)
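
For reference - my gloss, not part of the APS item - the "Kolmogorov model" mentioned above predicts that, in the so-called inertial range, turbulent kinetic energy is spread over wavenumbers k as

    E(k) = C \, \varepsilon^{2/3} \, k^{-5/3}

where ε is the rate of energy dissipation per unit mass and C is a dimensionless constant of order one. It is power-law scaling of this kind that the Cluster team could look for in the fluctuation spectra of the solar-wind plasma.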

Kolmogorov is one of the famous trio Kolmogorov - Arnold - Moser, after whom the KAM theorem is named. Ironically, the KAM theorem shows the existence of quasi-periodic orbits in a chaotic solar system. The idea of stability within turbulence is an archetypal chaos construct.

Categories Chaos Mathematics Understanding & Prediction

Modeling Pandemic Strategies


Spanish Flu in Spokane

Modeling how a disease progresses in a pandemic, and the related modeling of the effects of different strategies for stopping it, are, perhaps next to nuclear-attack modeling, some of the most sensitive mathematics being done today.

Consider the difficulties of determining pandemic-containment strategies by looking to past pandemics.

Efforts of several cities to halt the spread of the 1918 Spanish flu have now been analyzed and modeled by several research teams. One technique that appears promising is "social distancing" - referred to as a non-pharmaceutical intervention (NPI) - a fairly obvious strategy of reducing the potential contact between members of the community by closing schools, churches, stores, etc.

I wrote "fairly obvious" - but is it? There are so many contingencies that affected each city that it is hard to draw conclusions. Consider the report of the studies as described by Maryn McKenna of the Center for Infectious Disease Research & Policy at the University of Minnesota:

But while NPIs make intuitive sense, actual evidence for their ability to block or slow flu transmission has been limited. An Institute of Medicine report released last December concluded that the measures might help in a pandemic but should not be oversold. "It is almost impossible to say that any of the community interventions have been proven ineffective," the report said. "However, it is also almost impossible to say that the interventions, either individually or in combination, will be effective in mitigating an influenza pandemic."

The Lipsitch article, coauthored by Richard J. Hatchett, MD, of the National Institutes of Health, and Carter Mecher, MD, of the Department of Veterans Affairs, analyzes the effect of 19 types of NPIs used in 17 US cities during the fall phase of the 1918 pandemic. The authors found that certain types of NPIs, notably school, church, and theatre closings, were more effective than others, and that cities that imposed NPIs early in the epidemic had peak weekly death rates about 50% lower than those of cities that imposed NPIs later or not at all. But, they found, cities that were able to reduce their rates of illness and death during the onset of the pandemic were at greater risk of experiencing a greater second wave of illness and death once restrictions were relaxed.

That finding is also supported by Ferguson's paper, coauthored with Martin C.J. Bootsma of Utrecht University, which applies a mathematical model to data from 16 cities where both the start date and the end date for NPIs are known. Cities that enacted NPIs early were able to reduce transmission by up to 50% compared with cities that introduced such measures later in their local epidemics. However, the total mortality declined much less than that—from 10% to 30% in the most successful cities—because the interventions blocked transmission so effectively that many residents were still vulnerable to the virus once the controls were lifted. Few cities maintained distancing measures more than 6 weeks, according to the Lipsitch study.

(Read the full article here)

Note that the susceptibility to the second wave of disease is claimed to be due to the lack of exposure during the first wave; i.e., had they been exposed during Round 1, those infected in Round 2 would have been either (a) no longer susceptible, because of the immunity developed upon that first exposure, or (b) sick in Round 1, and possibly dead.
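
The second-wave effect falls right out of even the simplest textbook epidemic model. Below is a bare-bones SIR (susceptible-infected-recovered) sketch with a temporary reduction in transmission; all of the numbers are invented for illustration and are not fitted to any 1918 data:

    # Minimal SIR model with a temporary "social distancing" window.
    # Parameter values are illustrative only - not fitted to 1918 data.

    def run_sir(beta0, gamma, npi_start, npi_end, npi_factor, days=300):
        S, I, R = 0.999, 0.001, 0.0            # fractions of the population
        infected = []
        for day in range(days):
            beta = beta0 * npi_factor if npi_start <= day < npi_end else beta0
            new_inf = beta * S * I
            new_rec = gamma * I
            S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
            infected.append(I)
        return infected

    no_npi   = run_sir(beta0=0.4, gamma=0.2, npi_start=0,  npi_end=0,  npi_factor=1.0)
    with_npi = run_sir(beta0=0.4, gamma=0.2, npi_start=20, npi_end=60, npi_factor=0.4)

    print("peak infected, no intervention:", round(max(no_npi), 3))
    print("peak infected, early NPI      :", round(max(with_npi), 3))
    print("second peak after NPI lifted  :", round(max(with_npi[60:]), 3))

Because the distancing window leaves a large pool of susceptibles behind, lifting it produces a second peak - the same qualitative pattern the Lipsitch and Ferguson papers report in the city data.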

Indeed, some see NPIs as delaying the inevitable - but nevertheless as a procedure that yields precious time:

"The major benefit of delaying transmission is to buy time to develop a vaccine," said Marc Lipsitch, D.Phil., a professor of epidemiology at Harvard School of Public Health and a coauthor of the second paper. "Anything else is just delay."

There are other factors at work as well - e.g., mutation of the virus between Rounds 1 and 2.

The intriguing and important aspect of these studies is the care with which results and predictions must be made - knowledge of social interaction and of virus mutation must both be taken into account when determining the efficacy of past pandemic strategies, and the likelihood of success for similar strategies in future pandemics.

The above image is courtesy of Kenneth Knoll of The Pacific Northwesterner. The photo appears in The Spanish Flu in Spokane at HistoryLink.org - The Online Encyclopedia of Washington State History.

Categories Mathematics Modeling Understanding & Prediction

The Art of Biography - Einstein and Goethe


I recently read an interesting review by J. Parini in the May 11, 2007 Chronicle of Love, Life, Goethe: Lessons of the Imagination From the Great German Poet by John Armstrong. According to Parini, Armstrong’s approach is non-chronological, instead focusing on different thematic "nodes" in Goethe’s life. This style of biography is remarkably similar to that of Jürgen Neffe in his recently translated Einstein: A Biography. Unlike Isaacson’s best-seller Einstein: His Life and Universe, Neffe presents Einstein’s life prismatically, in chapters that go over the same events but with different emphases. (See my review of Isaacson and Neffe.) Here’s Parini writing about Armstrong’s Goethe:

In the case of Goethe, there were many observers, and the factual record is not much in doubt. Hardly anyone crossed his path who did not feel compelled to record an impression. And so biographers have a wealth of material, some of it quite marvelous. Armstrong plucks the choicest bits from that vast record, but refuses to narrate the life in conventional terms. Instead, he picks 10 key words and gathers his work around those nodes: Luck, Love, Power, Art, War, Friendship, Nature, Peace, Happiness, Death. There is an underpinning of chronology here, as one might expect; but the timeline is folded back upon itself, even discarded for long stretches as Armstrong lunges into meditations on the meaning of the life itself in the context of those seminal words.

The image of the "timeline folding back upon itself" is a wonderful way of describing this biography of connected "nodes." Parini goes on to describe the effect of reading such a biography…

Armstrong's approach seems old-fashioned, even anachronistic. But I found myself enthralled by his notions... One can learn a good deal about life as well as art from this book. It will, perhaps, remain an anomalous volume on the shelf of Goethe biographies, but it's the one I plan to reread in future years. I certainly plan to revisit the last section of Armstrong's book, on death — usually the part I skip when reading a biography. For some reason, I love the image of Goethe on his last day, propped in his chair, his mind and body worn out, still writing away — he traced letters with an imaginary pen in the rug that lay across his lap. Would that we knew what he had written.

I feel the same way about Neffe’s Einstein - although it is already a best seller in Germany and much of Europe, where it was released 2 years ago, and will most likely not be considered "anomalous".

Neffe’s work is filled with similar images of Einstein - there’s a real artistry in writing about a life that provides such vivid imagery as well as facts.

Interestingly, both Neffe’s Einstein and Armstrong’s Goethe were published by the same firm: Farrar, Straus and Giroux. I wonder if this is a coincidence, or whether there is some philosophical slant to biography writing being fostered by FSG.

Categories Literature & Poetry Philosophy

How to read a REAL Climate Modeling article


Diagram of a General Circulation Model

Bolstered by the anti-climate-modeling stance of Michael Crichton, there are many out there who claim that climate modeling that predicts global warming is somehow "bad science." I’m not sure many of these folks have ever read a real climate modeling paper (nor, for that matter, have many of those who believe that climate change is occurring).

It is instructive to try to wade through a serious paper that points out the difficulties of modeling on the one hand, but on the other presents very confident predictions from the model it describes.

By chance I recently came across a paper written almost 2 years ago by Jian Yuan, Qiang Fu (Department of Atmospheric Sciences, University of Washington) and Norman McFarlane (Canadian Centre for Climate Modeling and Analysis, Victoria, British Columbia). The name of the article is forbidding: Tests and improvements of GCM cloud parameterizations using the CCCMA SCM with the SHEBA data set.

(Note: GCM is General Circulation Model, or Global Climate Model - the bellwether of climate modeling)

The article describes the wide variability of different models of the Arctic, and how the authors re-formulated cloud interactions, yielding a model whose output is much closer to actual recorded data.

For the uninitiated, trying to read this type of paper seems impossible. You can get a lot from it, though, by reading the abstract, intro, and conclusion. (This is something I do in the Chaos and Fractals course - i.e. have students read dense papers in areas outside of their majors - an essential activity for all scientists.)

This article, and many others like it, point out the organic way in which modeling is done: very rarely does a model work (i.e. match what is observed, or not produce non-physical answers) right out of the box. Instead, there is tinkering - lots of it. Tinkering that is motivated and suggested by physical reality, and sometimes mathematical elegance. The point is that modeling is itself an experimental science, with different researchers adding to a foundational base.

Those who claim that climate science modeling is bad science tend to see these natural discrepancies as some problem with the process of modeling. (Note - I am not describing climate scientists who dispute the models - they are free to make their own models, after all.) This argument is a classic straw-man fallacy. I plan to make this article a required reading for the upcoming Chaos course, to be juxtaposed with an appropriate Crichton piece.

Note: the image in this post is from the Online Encyclopedia

Categories Media Modeling Understanding & Prediction Weather & Climate

Improve Your Home Run Chances - Walk Softly But Swing a Small Stick


From Baseball Physics: Anatomy of a Home Run by Davin Coburn, in the June 2007 issue of Popular Mechanics, come some interesting data and trends on home runs. Some of these are contrary to baseball-lifers’ opinions, but I assume that some players will take heed of the predictions. After all, one thing not in the article - because it is obvious - is the direct relation between salary and home-run prowess.

Check out the article for some interesting graphs that illustrate the following findings:

  • The sweet spot is larger than previously thought
  • Batted ball speed (BBS) is more of a determining factor in home runs than bat weight. This leads to the prediction that increasing swing speed is better than increasing bat weight; a corollary is that lighter bats (31-32 oz) are ideal for pro players. (A simplified collision formula, sketched below, makes the point concrete.)
  • Because of the direction of spin when they reach the plate, curve balls can be hit farther than fast balls, even though fast balls leave the bat traveling faster.

Other interesting facts that can be derived in an introductory physics class are that the average pro swing imparts 4145 pounds of force to the ball, and that the farthest a ball can be hit (with no wind to help it, and no rarefied air such as in Colorado) is approx 475 ft.
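
A commonly used simplification of the bat-ball collision (a standard physics-of-baseball approximation; the specific numbers below are my assumptions, not figures from the Popular Mechanics article) makes the swing-speed point above concrete: batted ball speed weights bat speed by roughly (1 + q) but pitch speed only by q, where q is a "collision efficiency" of about 0.2 for a wood bat struck at the sweet spot.

    # Simplified bat-ball collision model (illustrative numbers only):
    #   BBS = q * pitch_speed + (1 + q) * bat_speed
    # q is the "collision efficiency," roughly 0.2 at the sweet spot of a wood bat.

    def batted_ball_speed(pitch_speed_mph, bat_speed_mph, q=0.2):
        return q * pitch_speed_mph + (1 + q) * bat_speed_mph

    base         = batted_ball_speed(pitch_speed_mph=90, bat_speed_mph=70)
    faster_swing = batted_ball_speed(pitch_speed_mph=90, bat_speed_mph=75)
    faster_pitch = batted_ball_speed(pitch_speed_mph=95, bat_speed_mph=70)

    print(f"baseline BBS          : {base:.1f} mph")
    print(f"+5 mph of bat speed   : {faster_swing:.1f} mph (+{faster_swing - base:.1f})")
    print(f"+5 mph of pitch speed : {faster_pitch:.1f} mph (+{faster_pitch - base:.1f})")

In this toy model an extra mile per hour of bat speed is worth about six times as much batted-ball speed as an extra mile per hour of pitch speed - which is the argument for swinging a lighter bat faster rather than a heavier bat slower.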

Although no physicist is quoted by name in the article, I believe that some, if not all, of the topics discussed come from Alan Nathan’s Physics of Baseball work, described in my earlier post on Willie Mays and Global Warming.

Categories Physics Understanding & Prediction