Emergent Intentionality

Or, My Fancy Rationale for Indulging in Conspiracy Theories.

New Scientist just ran a story on The Lure of Conspiracy Theory. They claim that:

Conspiracy theories can have a valuable role in society. We need people to think “outside the box”, even if there is usually more sense to be found inside the box. The close scrutiny of evidence and the dogged pursuit of alternative explanations are key features of investigative journalism and critical scientific thinking. Conspiracy theorists can sometimes be the little guys who bring the big guys to account – including multinational companies and governments.

I strongly agree with this position, and consider the natural tension between dogged skepticism and flagrant bootstrapping to be a good methodology for fostering creative scientific thought.

But I think the NS story misses an important angle of conspiracy theories that I have been wondering about lately.

The question I have been wondering about is to what extent group behavior can be understood or characterized as conscious/willful/intentional. How much ideology do members of a group need to share before their behavior can be understood (and perhaps predicted) as that of an intentional agent? Is postulating intentionality a useful heuristic for understanding group behavior?

I am not going to follow this idea too far in this post, but this position provides an alternative perspective on theories like the idea that all Peace Corps volunteers are CIA agents, and on why theories like this become so popular. Our cognitive capacities are poorly equipped to perceive complex emergent behaviors, and postulating intentionality may serve as a natural (and useful) strategy for capturing these patterns.
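
To make “complex emergent behaviors” concrete, here is a toy sketch of my own (a Vicsek-style flocking model – my illustration, not anything from the NS story): each agent follows two dumb local rules, yet the group settles on a shared heading that looks, from the outside, like a collective decision.

```python
import math
import random

# Toy Vicsek-style flock: each agent only (1) turns toward the average
# heading of the group and (2) wobbles randomly. No agent has a goal,
# yet the flock settles on a shared direction -- the kind of emergent
# pattern an observer is tempted to explain as a group-level intention.

N, STEPS = 50, 200
headings = [random.uniform(-math.pi, math.pi) for _ in range(N)]

def mean_heading(hs):
    # circular mean, so angles near +/-pi average sensibly
    return math.atan2(sum(math.sin(h) for h in hs) / len(hs),
                      sum(math.cos(h) for h in hs) / len(hs))

for _ in range(STEPS):
    avg = mean_heading(headings)
    headings = [h + 0.1 * math.atan2(math.sin(avg - h), math.cos(avg - h))
                + random.gauss(0, 0.1)
                for h in headings]

# Order parameter: 1.0 means every agent is moving the same way.
order = math.hypot(sum(math.cos(h) for h in headings) / N,
                   sum(math.sin(h) for h in headings) / N)
print(f"group alignment after {STEPS} steps: {order:.2f}")  # approaches 1.0
```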

I personally trace the philosophical genealogy of this idea to Daniel Dennett’s Intentional Stance, but a friend of mine pointed out that this idea can also be found in Madison’s Federalist Paper #10. The main idea behind Dennett’s intentional stance is that we can bracket the deep, hard, ontological questions about the nature of consciousness and simply observe how useful taking the intentional stance is as a heuristic for understanding other people’s behavior. We posit intentionality, which yields reliable predictions about agents (philosophical agents, not the ones working for letter agencies) in the world around us. And we don’t limit the intentional stance to other people, either – we regularly adopt this stance with animals and machines, often to great utility.

For whatever it’s worth, labeling something a conspiracy theory sometimes seems like a pejorative, non-rational critique. Heck, al-Qaeda is a conspiracy theory (and an open source project, according to Bruce Sterling’s SXSW ’07 Rant), but perversely, it’s The Power of Nightmares’ attempt to dispel this fabrication that gets labeled the conspiracy.

But, I really want to live in a universe in which we actually landed on the moon.

We are all dying, sick, and crazy

My visits to the Informedia lab have consistently generated futuristic ideas (and corresponding posts), and my trip this spring was no exception.

This time I was thinking a lot about what kinds of schemas will be employed after their prototype moves beyond watching grandma. What happens when this kind of system is inevitably rigged up to a school or a prison, or fed raw streams from live surveillance cameras?

My money is on the Diagnostic and Statistical Manual of Mental Disorders, an instrument that is arguably becoming the de facto catalog of the full range of human behavior and experience.

In some respects, this progression parallels the notion that nobody dies of old age anymore – they die of heart failure, cancer, or other diseases. And, as the title of this post cheerily states, we are all dying, we are all sick, and we are all crazy.

As crazy as it sounds, the DSM is poised to become the lens through which we interpret all of human behavior. Given its breadth of coverage, I challenge anyone to find me a normal, healthy individual. Its ambition reminds me of William James’ Varieties of Religious Experience, except that in our generation, the full range of human experience has been radically pathologized.

BTW – the folks who brought us Sexual Orientation Disorder are hard at work on V 5.0 of this catalog – and there is a call out for diagnosis suggestions.

Can you keep a dark secret?

The Alchemist in me feels compelled to respond to the excellent documentary that aired on PBS the other week entitled Newton’s Dark Secret. The film profiled Sir Isaac Newton’s fascination with the ancient art/science/craft of Alchemy.

Many of the experts interviewed regarded Newton’s alchemical experiments as shameful, perhaps reflecting more on our modern epistemic prejudices than on Newton. Contemporary experts seem threatened by the prospect that anybody in historical times understood things about the world that we don’t.

Beyond the shame of taking Alchemy seriously, they also considered Newton’s alchemy to be his greatest failure. Failure?!? During the period Newton was practicing alchemy he wrote the Principia Mathematica, and he catapulted his way into the power elite – he was knighted, appointed head of the Royal Society, and earned power, prestige, and wealth beyond his wildest dreams. To this day one of the most respected chairs in physics still bears his name. From this perspective, his alchemical pursuits seem quite successful. Smashingly successful, if you consider this blog’s tagline: “Aurum nostrum non est aurum vulgi” – our gold is not ordinary gold.

The Alchemists understood metaphor, and it was essential to their theory and practice. Why do most modern thinkers insist upon interpreting the craft so literally?

My girlfriend shared a Bahá’í quote on a related subject.

“Should a man try to fly with the wing of religion alone he would quickly fall into the quagmire of superstition, whilst on the other hand, with the wing of science alone he would also make no progress, but fall into the despairing slough of materialism.” — Abdu’l-Bahá, The Fourth Principle

Or, to paraphrase: Religion without Science is superstition; Science without Religion is reductionism.

I have long believed that Alchemy is a framework which seeks to reconcile spiritual integrity with material wealth, or more broadly, science and religion.

Perhaps the ancients were on to something that modern science has truly forgotten. It is tough to challenge Newton’s genius – maybe his alchemical theories deserve a more respectful examination.

“Wait until pictures start getting indexed.”

Well, I called it:

In a class I took with Eben Moglen, I predicted during a discussion that pictures on the internet would soon be indexed:

Re: video cameras (Feb. 11, 2005)

Many people in the class were skeptical.

Well, here it is, less than two years later:

Face Search Engine Raises Privacy Concerns

Of course, there are two standard objections to surveillance “paranoia”:

  1. If I am not breaking the law, why should I care?
  2. There is so much information being gathered; who could possibly sort through it all?

The responses to these objections should be well rehearsed:

  1. Pervasive Omniscient Surveillance will have an impact on the basic fabric of personal, social, and cultural interactions as we currently understand them.
  2. The AI is coming, getting more powerful by the day.
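
As a sketch of how little machinery objection #2’s “who could sort through it all?” now requires, here is what a naive face index looks like using the open-source face_recognition library (my illustration, with hypothetical file names – certainly not the actual engine behind the article above):

```python
import face_recognition

# Build a tiny face index: photo -> list of 128-dimensional face encodings.
# (All file names here are hypothetical placeholders.)
index = {}
for path in ["party.jpg", "protest.jpg", "vacation.jpg"]:
    image = face_recognition.load_image_file(path)
    index[path] = face_recognition.face_encodings(image)

# Query: which indexed photos contain the person in query.jpg?
query = face_recognition.face_encodings(
    face_recognition.load_image_file("query.jpg"))[0]

for path, faces in index.items():
    distances = face_recognition.face_distance(faces, query)
    if any(d < 0.6 for d in distances):  # 0.6 is the library's usual cutoff
        print(f"possible match in {path}")
```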

Wonderful Things

Monday night I went to the ITP’s end-of-semester show. A friend of mine is in the program and I went to check out the scene. ITP, the Interactive Telecommunications Program, is part of the Tisch School of the Arts at NYU. ITP has been around since ’79, and lies somewhere conceptually between the MIT Media Lab and Mary Flanagan. When I visited the MIT Media Lab this summer I began to understand how it was really operating as a pooled R&D lab for corporate interests (with plenty of military funding). I got the vibe that ITP is coming from a different place with different priorities, but I don’t really know the full back story.

Here are some of the highlights of the many, many projects I saw the other night:

Emerging themes continue to suggest that we are indeed embarking on an era that can be described as “The End of Forgetting”, and that epistemology itself is transforming beneath our feet. That is, the way we know what we know, the kinds of things that we know, and our relationship to knowledge are all being transformed by shifts in memory, computational possibilities, simulation, and visualization. Going to a show like this really reinforces these bold predictions.

Another New Kind of Science?

Last weekend’s Cultural Studies conference reminded me of a vicious cycle that many humanities-oriented researchers are being subjected to. Disciplines such as educational research, ethnography, anthropology, cultural studies, and sociology have effectively been colonized by the methodology of the social sciences, and they are being forced to play a numbers game for which they may not be suited.

Many projects striving for credibility are subjected to the tyranny of statistics – forced to transform their qualitative information (interviews, transcripts, first-person accounts) into quantitative information through the process of coding. This reduction forces the data into buckets and creates a significant degree of signal loss, all in the name of a few percentages and pie charts.

Perhaps we have lost sight of the motivation for this reduction – the substantiation of a recognizable narrative account of a phenomenon, in support of an argument. Arguably, the purpose of the number crunching is to provide supporting evidence for a demonstrable narrative. Modern visualization techniques may be able to provide one without all the hassle.

True, this is not the only reason that qualitative data is transformed into quantitative data, but advanced visualization techniques may provide a hybrid form that is more palatable to many of the researchers active in this area while remaining a credible methodology. It seems as if many people are being forced into coding and quantification when they aren’t thrilled to be doing so. The signal loss that coding introduces, all in the name of measurement, might be unnecessary if researchers used data visualization tools that comprehensibly present the data in all of its richness and complexity, rather than boiling it down to chi-squared confidence levels. (And does this false precision actually make any difference? Does a result of 0.44 vs. 0.53 tell a significantly different story?)

In a thought-provoking post on the future of science, Kelly enumerates many of the ways new computing paradigms and interactive forms of communication might transform science. The device that I am proposing here might lead to some of the outcomes Kelly proposes.

For a better idea of the kinds of visualization tools I am imagining, consider some of the visualization work on large email corpora coming out of the MIT Media Lab, or the history flow tool for analyzing wiki collaborations. Even the humble tag cloud could be adapted for these purposes, as the power of words and visualizing the state of the union demonstrate.
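
A tag cloud, in fact, takes only a few lines of code. Here is a minimal sketch of my own (assuming a plain-text interview transcript; the file name is a hypothetical placeholder) that scales each word by its frequency:

```python
import re
from collections import Counter

# Minimal tag-cloud sketch: scale each word's font size by its frequency,
# presenting a transcript in (some of) its richness rather than collapsing
# it into coded categories.
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "that", "is", "it"}

def tag_cloud_html(text, top=50, min_px=10, max_px=48):
    words = [w for w in re.findall(r"[a-z']+", text.lower())
             if w not in STOPWORDS and len(w) > 2]
    counts = Counter(words).most_common(top)
    biggest = counts[0][1]  # frequency of the most common word
    return " ".join(
        '<span style="font-size:%dpx">%s</span>'
        % (min_px + (max_px - min_px) * n // biggest, w)
        for w, n in sorted(counts))  # alphabetical order, sized by count

with open("transcript.txt") as f:
    print(tag_cloud_html(f.read()))
```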

Crucially, tools analogous to Plone’s haystack Product (built on top of the free libots auto-classification/summarizer library) might help do for social science research what auto-sequencing techniques have done for biology (when I was a kid, gene sequences needed to be painstakingly discovered “manually”).
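
I can’t speak to libots’ internals, but the basic move behind extractive summarizers of that generation can be sketched in a few lines of Python (my rough approximation, not libots’ actual algorithm): score each sentence by the rarity of its words and keep the top scorers in their original order.

```python
import math
import re
from collections import Counter

# Extractive-summarizer sketch (an approximation of the technique, not
# libots' actual algorithm): weight each sentence by how rare its words
# are across the document, then keep the highest-scoring sentences.
def summarize(text, n_sentences=3):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    tokenized = [re.findall(r"[a-z']+", s.lower()) for s in sentences]
    doc_freq = Counter(w for toks in tokenized for w in set(toks))
    def score(toks):
        if not toks:
            return 0.0
        return sum(math.log(len(sentences) / doc_freq[w]) for w in toks) / len(toks)
    ranked = sorted(range(len(sentences)),
                    key=lambda i: score(tokenized[i]), reverse=True)
    return " ".join(sentences[i] for i in sorted(ranked[:n_sentences]))

# File name is a hypothetical placeholder for an interview transcript.
print(summarize(open("interview.txt").read()))
```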

The law firms that need to process thousands of documents in discovery and the commercial vendors developing the next generation of email clients are already hip to this problem – when will the sciences catch up?

For any of this to happen, the current academic structure needs to be challenged. The power of journals is already under attack, but professors who already have tenure can take the lead here and pave the way for their students to follow.

Permanent Records

Today I presented last year’s bioport Part II paper to the 2nd annual Cultural Studies conference at Teachers College.

Permanent Records: Personal, Cultural, and Social Implications of Pervasive Omniscient Surveillance

I think the distilled version of this model is far more digestible and accessible than the papers.

One of my co-panelists is doing some really interesting work with urban youth in the Bronx, gathering incredible interview material about these youths’ perceptions of surveillance and their forms of resistance. These stories might help convey the violence of a surveillance society.

The conference format was a bit disappointing. I can barely believe academics still read their papers to each other at conferences – there are so many things that Open Source does right, including knowing how to throw a great conference. Even the variety of presentation formats is an idea that needs to spread – BOFs, lightning talks, presentations, and posters all create different spaces and dynamics for interactions between participants. The traditional model is so intimidating that it seems like many people are discouraged from participating.

More importantly, the social justice issues and governance models that are being explored by F/OSS projects are really important for the Cultural/Critical studies folks to be considering. It is also shocking how disconnected they are from the freeculture movement, and its theoretical roots. Arguably, the freeculture movement is a shadow struggle, mirroring the struggles for sustainability, and against globalization and the logic of capitalism being conducted in the physical world. But, it may also represent the actual ground on which that struggle is being conducted.

“Michael, are you sure you want to do that?”

Pull over, KITT – you’ve just been lapped.

On Monday, November 14th, I attended a presentation by Sebastian Thrun, an AI researcher at Stanford whose team recently won the DARPA Grand Challenge.

The idea behind the Grand Challenge is to accomplish something that seems impossible, along the lines of crossing the Atlantic, the X Prize, etc. DARPA had previously funded cars that drive themselves, but after numerous failures decided to turn the task into a contest and see how far teams would get in a competitive setting. Last year none of the entrants managed to finish the course, but this year five finished, four within the allotted time.

The difference between last year and this year came primarily from improvements in software, not hardware. In fact, once the software has been developed, outfitting a car with the necessary equipment to drive itself – the perceptual apparatus (laser, radar, and video guidance), the GPS, the inertial motion systems, the general-purpose computing servers, and the fly-by-wire control systems – is, by Sebastian’s estimate, comparatively cheap. The robots are already here (some of them killer)!

Wikibases and the Collaboration Index

On October 27th I attended a University Seminar presented by Mark Phillipson. The seminar was lively and well attended, and Mark managed to connect the culture of wikis with their open source roots.

Sometime soon I plan on elaborating on ways in which software, as a form of creative expression, inevitably expresses the values of the creators in the form of features. But right now I want to focus on the taxonomy of educational wiki implementations that Mark has identified since he began working with them.

Here is how Mark divides up the space of educational wikis:

  • Repository/reference – eg Wikipedia
    • A website whose primary function is to create a repository of knowledge on a particular topic.
  • Gateway – eg SaratogaCensus
    • A website whose primary function is to collect, assemble, and present references to external sources
  • Simulation/role playing – eg Holocaust Wiki
    • A “choose-your-own-adventure” style simulation/game environment
  • ‘Illuminated’/mark-up – eg The Romantic Audience Projects
    • An environment that provides tools for detailed exegesis of primary sources; students are instructed to leave the source material unchanged and to place their detailed commentary on supplemental subpages.

I think this taxonomy is accurate, but doesn’t completely capture one of the most interesting educational implications of wikis – the process of creating them.

In particular, I can think of a number of variations on the repository/reference wiki, where the final products might all look similar, but where the “collaboration index” might differ substantially (for more on the popularity of the repository/reference, see “Database as Symbolic Form,” Manovich 2001).
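
One way to operationalize a “collaboration index” (my own sketch, assuming you can pull per-author edit counts from a wiki’s revision history) is the normalized entropy of each author’s share of the edits: 0 for a single-author page, approaching 1 when contributions are evenly spread.

```python
import math

def collaboration_index(edit_counts):
    """Normalized entropy of per-author edit shares.

    0.0 -> one author did everything; 1.0 -> edits spread perfectly evenly.
    edit_counts maps author -> number of edits, e.g. scraped from a wiki's
    revision history (the names below are hypothetical).
    """
    total = sum(edit_counts.values())
    shares = [c / total for c in edit_counts.values() if c > 0]
    if len(shares) < 2:
        return 0.0
    entropy = -sum(p * math.log(p) for p in shares)
    return entropy / math.log(len(shares))

print(collaboration_index({"alice": 40, "bob": 2}))                # ~0.28: near-solo work
print(collaboration_index({"alice": 15, "bob": 14, "carol": 16}))  # ~1.0: genuinely shared
```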

Wikis are a very flexible tool, whose usage can vary from a personal publishing tool, to a simple Content Management System, to a collaborative authoring environment. Additionally, while wiki software doesn’t usually support the enforcement of a strict workflow, policy can be stipulated and adhered to by convention (like in Mark’s class, where the original poems were meant to be left intact).

Consider a few different applications of reference wikis in the classroom:

  • One-way Publishing
    • A simple means for instructors to publish and organize information for their class.
    • Examples include:
      • Instructional handbooks, assignment “databases”, completed examples
  • Collaborative Mini-sites and/or subsections
    • Exercises where individuals or groups work on subsections of a wiki which are combined and referenced within a single larger site
    • Examples include:
      • Students dividing large assignments amongst themselves, each sharing their own results with the group.
      • A site like the social justice wiki where groups of 3-4 students each worked on a reference element of the site.
  • Collaborative Websites
    • Sections of the site where everyone in the community is supposed to be contributing content
    • Examples include:
      • Common Resources, Glossary of Terms, and the larger information architecture and organization of the entire site.
  • Portals and Meta-tasks
    • Due to their flexibility, many wikis end up being repurposed beyond their original conception and begin to serve as portals, where many meta-issues and conversations can take place beyond the assembly of the content itself. These include mundane administrative tasks, like students forming groups, coordinating assignments, taking minutes, and scheduling time.

While the end results of many of these collaborations might all look similar to each other, the differences in the process by which the content is developed may be crucial to capturing part of what is happening with wikis in the classroom.

This analysis probably also has implications relating to the archiving and the use of a wiki environment in a classroom over time. If the act of creating the wiki is central to what the students are supposed to learn from the exercise, then should they start with a fresh wiki every semester? How is the experience different when they are contributing to an existing system (or even have access to prior versions of the project)?

For more on this, see Mark’s comments on CCNMTL’s musings blog.

Serenity Lost

Nothing like a little pulp sci-fi to resonate with a class on emerging tech. I saw Serenity tonight (skip this post until you have seen it, unless you aren’t planning to at all) and was amused at how a central plot line revolved around information that had been covered up by the authorities, and the struggle to disseminate that message.

The simplicity of a single message whose content can change the world, and a single distribution channel from which to broadcast it, is amusing but poignant. I mean, if you could broadcast one message to the world, what would it be? Are these folksonomies helping to filter and distribute this information, or are we just ending up on the same disconnected islands of information we started from?

I am thinking of the disjoint sets of books that liberals and conservatives read, but there must be many other examples – perhaps the entire blogosphere falls into this category. One thing I have realized as I rely more and more on my RSS client is that once I am lost inside of it, if you aren’t syndicating a feed, you don’t exist.
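
The mechanics make the point: from inside the client, the world is literally a list of feed URLs. A minimal sketch with the feedparser library (the URLs are hypothetical placeholders):

```python
import feedparser

# My entire information universe, as seen from inside an RSS client:
# if a source isn't on this list, it doesn't exist.
FEEDS = [
    "http://example.com/politics/rss.xml",
    "http://example.org/tech/atom.xml",
]

for url in FEEDS:
    feed = feedparser.parse(url)
    for entry in feed.entries[:5]:
        print(entry.title, "-", entry.link)
```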

I am quite aware that a full-blown information war is currently underway. The existence (and adoption) of Flickr allows me to laugh at the Bush administration’s attempts to prevent the publication of Katrina’s casualties, but how did this story get swallowed up?

If BitTorrent didn’t exist (or were outlawed) and we could not reclaim the “lost” bandwidth of individual broadband subscribers, large file transfers and exchanges would probably have to be mediated through centralized bandwidth providers like Akamai or Cisco. But this is not quite as simple as centralized vs. decentralized publishing models, since that is only half the equation. The information retrieval needs to happen on the other end, or else you’re screaming into an abyss.

I was once lucky enough to find myself in a conversation with the author of citeulike. I casually inquired as to whether he was planning on releasing the engine that powers his site under an open license. He replied that he would, but that it would be a bad idea: citeulike is supposed to be a service, not a product. Its value is actually diluted the more instances of it there are running. Part of Flickr’s or delicious’s power is their popularity. They are much more effective the more users they have, leaving us once again in a paradoxical quandary, where we need a decentralized, centralized service.

Too many flickrs, and they are all rendered weaker; too few, and we are back in a situation where our information is in danger of being homogenized, controlled, and filtered.
