Giving Chickens Microphones

By now you may have heard of the innovative citizen-driven election monitoring system, Twitter Vote Report (they are getting great press cycles, with purportedly more to come). I actually wrote and submitted the post that appears on a wonderful blog that tracks innovations in data visualization.

This project represents a really innovative use of Twitter as a “just-add-water” (gratis, but not truly free) infrastructure for distributed structured-data collection. It reminded me of a free platform a group at UNICEF is building to collect distributed structured data in the third world (for places w/out easy access to the internet, but with cellular connectivity) – RapidSMS.
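The core idea behind this kind of SMS-based data collection is simple: an incoming text message is treated as a tiny structured form, with a keyword naming the report type and key=value fields carrying the data. As a rough sketch of that pattern (the form names and field syntax here are hypothetical illustrations, not RapidSMS's actual grammar):

```python
import re

# Hypothetical report format, e.g. "SUPPLY site=4 vaccines=120".
# RapidSMS defines its own apps and message handlers; this only
# sketches the general keyword-plus-fields pattern such systems use.
FIELD_RE = re.compile(r"(\w+)=(\w+)")

def parse_report(sms_text):
    """Split an incoming SMS into a form keyword and its fields."""
    keyword, _, rest = sms_text.strip().partition(" ")
    fields = {k.lower(): v for k, v in FIELD_RE.findall(rest)}
    return keyword.upper(), fields

keyword, fields = parse_report("supply site=4 vaccines=120")
```

Once messages are normalized into keyword/field pairs like this, aggregating thousands of reports from the field becomes an ordinary database problem.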

Imagine how many millions of dollars the government would have spent to build a cell-phone enabled election monitoring system (that likely wouldn’t work). Instead, a group of volunteer activists, weaned on the open-source, do-it-yourself culture of code jams, shared repositories, and issue trackers, decided less than a month ago that they could build this themselves on a shoestring.

This is definitely a big deal, and relates closely to a new tier of participatory media which I began to describe at my talk at CCNMTL’s New Media in Education conference this month. It also has everything in the world to do with the TagMaps tool I wrote about last November in my post Crowded Wisdom. Systems are coming online which are helping us synthesize vast volumes of tiny fragments of information into meaningful knowledge.

Twitter Vote Report allows anyone to report voter suppression and problems with specific voting machines, and it also supports tracking wait times, which will be aggregated and mapped on the website.
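What makes this aggregation possible is that reporters embed a little structure in each tweet – a hashtag marking it as a report, plus tags for things like wait times and a ZIP code for location. A minimal sketch of extracting that structure (the tag grammar below is illustrative; the real Vote Report conventions may differ):

```python
import re

# Illustrative grammar: a report might look like
# "#votereport 11217 #wait:45 machine jammed at my polling place"
WAIT_RE = re.compile(r"#wait:(\d+)")
ZIP_RE = re.compile(r"\b(\d{5})\b")

def parse_vote_report(tweet):
    """Pull a ZIP code and wait time (in minutes) out of a tagged tweet."""
    wait = WAIT_RE.search(tweet)
    zipcode = ZIP_RE.search(tweet)
    return {
        "zip": zipcode.group(1) if zipcode else None,
        "wait_minutes": int(wait.group(1)) if wait else None,
    }
```

Individually these tweets are tiny fragments; parsed and pooled by ZIP code, they become the aggregate wait-time maps the project publishes.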

Previously, when a voter had a complaint, he had to go through election officials who might have little incentive to admit a mistake – what tech executive Logan calls “a fox guarding the chickens” scenario.

“What this technology does,” he says, “is give the chickens a microphone.” [Baltimore Sun]

Finally, while we are on the topic, this was a great letter to the next president on what he can/should do with technology. How will President Obama utilize the historically unprecedented social networks he mobilized during his campaign?

Dear Mr. Tech President

For more information on Twitter Vote Report see their press page.

Lost in Controversy

This summer, Bruno Latour was our tour guide – leading the way, not out of The Cave, but beyond the entire Cave System. Along the journey I also learned about a very interesting pedagogical technique intended to take engineering students on a similar journey.

Students at Sciences Po and the École des Mines in Paris, as well as at MIT in Cambridge, are learning to map techno-scientific controversies according to a method which embodies Actor-Network Theory (without all of the heavy theoretical jargon). Past projects can be found at the Mapping Controversies web site, and Bruno Latour himself explains the project and its aspirations in this video.

Many of the possibilities explored in these new media projects are related to a broader question I have been interested in lately concerning the impact that technology is having on epistemology itself. How are technology and new media changing what is knowable and how we go about knowing? I wrote an essay last Spring, The Bionic Social Scientist: Human Sciences and Emerging Ways of Knowing, which begins to explore these questions, and it is wonderful to see more examples of these ideas materializing around us.

The Mapping Controversies pedagogy involves teams of students taking on the role of statistician, investigative journalist, scientist, and webmaster, working to research and represent a controversy. They discover (and depict) that concepts themselves vary depending upon who is speaking about them, and attempt to map these relations and progressions over time.

I can imagine this technique displacing the traditional 5 ‘W’s of journalism – the venerable Who, What, When, Where, & Why needs to be upgraded for a multi-dimensional, post-modern reality. The what varies and depends upon the who, the where, and the when, and without the kinds of research and representations that the Mapping Controversies project is pioneering, we will never adequately capture the multiplicities of whys. I don’t know whether these kinds of representations are intermediate forms of research, or whether one day they will be part of the final product delivered as news to readers, but it is an important question to begin to grapple with.

Right now, the Mapping Controversies sites are somewhat anti-social – they are fixed, one-way communications – but judging from the introductory video, they hope to change this soon. At the moment, each map is also a unique work of art. While it is premature to confine anyone yet to the paradigmatic blinders of conformity, I also think it is imperative for us to begin to imagine and develop a visual vocabulary that we can re/use when representing these kinds of relations.

In the field of information visualization, researchers are beginning to catalog Information Design Patterns that maps like these could build upon. Of course, riffs and variations on these patterns are welcome, where significant and meaningful, but a common starting point will improve the communicative power of these maps. As these patterns solidify, the corresponding implementation patterns can grow along with them, as tools like Ben Fry’s Processing framework (recently ported from Java to JavaScript, which is much more web friendly, and used extensively in MoMA’s Design and the Elastic Mind exhibit) begin to institutionalize the knowledge learned in constructing these maps.

And, of course, all of the code and content used to create these projects should be free and open, so the world can learn and improve on their foundations.

Bruno vs. The Cavemen

This summer I was part of an amazing reading group where we slowed to a crawl and closely read Bruno Latour’s Politics of Nature. When I say we read the book, I mean we literally went around the table and read the book out loud, stopping to discuss difficult passages until we were confident we understood them.

I haven’t taken the time to read a book this closely in ages, and the experience reinforced the age-old adage about finding the universe in a grain of sand. Reading a book that deals with such deep, eternal themes, written by a brilliant theoretician who has himself synthesized and integrated an incredible amount of history, philosophy, and literature, was like glimpsing the entire canon through Latour’s eyes, and well worth the effort.

In this work, Latour performs a root canal on a form of conceptual dualism that has haunted Western thought for millennia. The book revolves around a perplexing circumstance in the world we have constructed for ourselves – how did we end up in a world where one set of propositions (usually known as facts) is authoritative, unassailable, and incontrovertible, while another set of propositions (usually known as values) comprises the kinds of things we are allowed to argue about?

Apart from the challenge of figuring out which of these flawed categories a particular proposition belongs to, the artificial separation between the tasks of constructing the common world and constructing the common good shuts down all possibility of discourse – before we even get a chance to try to arrive at consensus! The institutions of facts and values are so inextricably intertwined that it is folly to erect barriers between these two enterprises.

Latour illustrates his perspective with examples from controversies in the sciences (especially Environmentalism and Political Ecology), but it is trivial to transpose his argument to the great debates between objectivity and subjectivity in Journalism, and to the ways that certain kinds of propositions (‘data’ in many conversations about technology, and ‘revelation’ in conversations about religion) are invoked as trump cards to shut down all debate. Medical “science”, especially psychiatry and brain science, is a horrendous perpetrator of these offenses right now, and the consequences are anything but theoretical. The Onion provides my favorite example illustrating the confusion between facts and values.

Latour’s proposed strategy for re-imagining the Mexican standoff between nature/culture, science/democracy, facts/values, objectivity/subjectivity, necessity/freedom, etc. is to re-tie a metaphysical Gordian knot as an epistemological one. He would like us to consider a dynamically expanding collective of players/concepts, composed of humans and non-humans (the non-humans have spokespeople, whose assertions are speech acts – qualified by the same kinds of language we use to indicate our confidence in any speech act).

Revisiting and reinterpreting Plato’s metaphor of the Cave, Latour traces the West’s tendency to cleanly divide smooth facts from messy values to the flawed idea of aspiring to leave the Cave to grasp/glimpse/experience the Truth. Even if this were attainable, the sojourners would still need to return to the cave, to mediate and relate their experience to those still trapped within. Instead of aspiring to leave the cave, we need to transcend the entire Cave system.

It isn’t completely fair to criticize a book for what it’s missing (no single book can be all things), but it would be great to expand this line of analysis in the future and elaborate on the role of mediation in the current and imagined collective. It seems pretty clear to me that for Latour, the ‘Sciences’ encompass the entire enterprise of Science, including the scientists, the funders, the corporations, the educators, and the scientific journalists. But, there is little in the book that unpacks these relations.

A broader criticism sets an argument that John Durham Peters advanced in Speaking into the Air against Latour’s conception of the Collective. Peters argues that we often view communication as salvation, when in fact a lot of discourse never leads to consensus, and there are perspectives that are mutually incommensurate and irreconcilable. I may be naive to think the Collective that Latour dreams of is a realistic aspiration, though I sure would love to live to participate in it.

I also want to explore the connections between this work and the Death of Environmentalism essay I encountered last year. I think Shellenberger and Nordhaus’ argument is a vivid and direct application of the theory Latour argues in The Politics of Nature.

Ulises Mejias’ work on Networked Proximity is another work which might be fascinating to juxtapose with the dynamically expanding collective (which can be thought of as a network). Ulises’ notion of the para-nodal might be crucial to consider when the collective invokes the power to take things into account.

The End of Digirati Philosophizing

Chris Anderson, the editor-in-chief of Wired published a provocative essay last week that really caught me off-guard:

The End of Theory: The Data Deluge Makes the Scientific Method Obsolete

I have been writing lately about the effects that technology is having on epistemology, namely, what is knowable and how we go about knowing.

But I’ve arrived at very different conclusions than Anderson. I think that our methods for gathering evidence to support a hypothesis are changing – radically – but I certainly do not think that the scientific method (or attitude, or stance, as Piet Hut sometimes puts it) is obsolete. Evolving, for sure, but I hope not in the direction that Anderson claims. Intriguingly, Kevin Kelly – who originally launched Wired – wrote an essay on the future of science that I think is much more thoughtful and prescient.

A cursory examination of the comments posted on his essay makes me wonder if he hasn’t floated a straw-man argument just to be provocative. But after a few conversations with friends and colleagues this week, I believe there is something important and scary in his perspective.

My thinking here is greatly informed by a book I am reading this summer by Bruno Latour – The Politics of Nature. In this book, Latour struggles to reconcile the perennial tensions between nature and democracy, science and politics, facts and values, and ultimately, objectivity and subjectivity. He critiques the veneration of facts as the ultimate authority – reminding us to always consider who gathered those facts and why. His argument is far more nuanced and complex, but I really see its re-enactment in the veneration of data that Anderson naively celebrates.

We must acknowledge that data itself is nothing more than a mediation with reality – and we shouldn’t confuse data with reality itself.  There are many good rebuttals appearing in the comments, but none that I have read point out that Anderson’s characterization denies the politics of instrumentation and data collection – the concepts and constructs that underlie the data, never mind the importance of stories and explanations in our politics and justifications.

This understanding is basic to the psychology of perception as well as the philosophy of science – there is no observation without pre-existing concepts and constructs. The buckets of data we are collecting (and, at least for now, some data is not being collected) are being stored according to organizational schemes – schemes created by humans.

Data isn’t sacred, and it’s folly to regard it as such. We need our models, and the explicit self-awareness that we created them within a particular historical context and theoretical paradigm.

In the wise words of my mentor/advisor, Frank Moretti:

The problem with statistical analysis in the hands of many is that they expect the statistics to yield the truth, and this leads to the mistake of “reporting their findings” in a theory-deprived context. Whenever you are dealing with the human sciences, whether the information is statistical, visual or otherwise, you still have to build a meaningful narrative, which requires that you have a point of view that has either overt or covert theoretical assumptions. Without that you are in danger of reporting your views in what Marcuse calls operational language, a language derived from the tools of discovery rather than a serious point of view.

Tigers and Teachers

Last week I went back to ‘ol Nassau and attended the annual New Media Consortium conference, held this year at my alma mater.

The conference was very engaging, especially since I don’t think I have ever attended an event geared specifically towards the kind of work we do at CCNMTL. Typically, whether it’s developer, librarian, technorati, activist, or academically oriented, our work shares aspects with other attendees’ work, but usually not a similar overarching mission. I was reminded how special our organization’s niche is – we should take pride in our projects and values. I also gained a better understanding of how privileged our situation is.

While no two universities I have ever encountered share the same organizational structure, many now support groups whose primary mission is helping the faculty use new media & technology purposefully. I was astounded at the constraints, and corresponding resourcefulness, these groups exhibit. Most of them have a much smaller staff than ours, and very few actually develop custom software. A WordPress or MediaWiki plugin is about as complicated as many of them can attempt. And yet they forge ahead, scraping together whatever tools they can wrap their minds around – and in the era of mashups, the possibilities are growing daily.

It is interesting to contrast this resourcefulness with corporate, and even non-profit, technical efforts I have been involved with. Many of these groups have gourmet taste in technology, and initiatives are often paralyzed until the right tools are developed. The educators show how far a healthy culture of use can go in trumping system constraints.

Overall, many groups are still working with the faculty to get beyond the allure of the media, and to demand a greater educational return than “mere” excitement and motivation. Critical engagement must go beyond supplemental materials, as it is decidedly difficult to follow through on the promise of a demonstrated educational value. There were many projects that clearly helped the students feel good about their learning, but it is incredibly hard to design a curriculum where these new media objects become a central component in a student’s analysis. In our work we try, and occasionally succeed, to push the faculty to design assignments where the new media elements are an integral part of the critical analysis – where the learners deeply engage with the media, and bring these elements into play as evidence in support of an argument.

These aspirations place the bar quite high, and often require faculty to develop a radically new teaching style. Additionally, none of us learned this way, though we all seem to be convinced these new styles are superior to the ways we were taught. Consequently, there is a great deal of experimentation and research involved in educational technology. It was really great having these kinds of conversations all weekend long – sharing and exchanging perspectives with others grappling with similar concerns.

Some of the highlights I learned about included:

  • Sun’s Wonderland Virtual World – a free-software, enterprise/education-ready virtual world environment, with more of a professional emphasis than Second Life. Of Sun’s 34k employees, 50% or more work remotely or from home on any given day, so collaboration tools are very important for them. The environment supports authentication, allows for any X window to be shared w/in the world, and even has telephony bridging, so users without a client can call in.
  • Emerson’s NEA-funded Digital Lyceum Project, where New Media scholars Eric Gordon, John Freeman, and Aubree Lawrence are investigating the orchestration of attention during a live event. Research like this could help the backchannel transition from distracting to essential – it’s fun to imagine being able to cite or reference the flurry of associations, chats, and Google jockeying that flows by in the stream of consciousness that live events have become.
  • The John Lennon Educational Tour Bus – Wow. Imagine this media studio on wheels pulling up to your school when you were a kid. Three hipster musician/media-mavens tour the country on this bus, sponsored and outfitted by the likes of Apple and Sony – they are rock stars without the responsibility of performing. Students on the bus come aboard without any specific skills, and leave with something they made that day. The bus sports two fully outfitted media workstations, instruments, and even a green room. Buses like this represent an incredible amount of potential, helping students understand they can produce as easily as consume.

I hope in the years to come the bus incorporates a few more Media Fluency lessons (think: MacArthur’s Digital Learning Initiative, John Broughton’s Pop Resources, and David Buckingham’s Journal of Learning and Media) at touchstone moments (“Ah! So all media produced incorporates the producer’s perspective”), a few more lessons on the ethics of sharing (“Hey, how do I share my media with the world, and let others remix it?”), and offers concrete strategies for continuity after the bus pulls away (“I get it – all media is produced on magic buses”)…

Many NMC’ers have drunk deeply from the fountain of Second Life Kool-Aid, and I glimpsed more variations on the educational potential of Virtual Worlds. I didn’t hear too many people riffing on the centrality of realistic memories the environment offers, so this is an idea I certainly need to develop further. I am immensely grateful to the Play As Being community for introducing me to these experiences in a very meaningful context.

Finally, I spent lots of time reminiscing about my undergraduate years. My colleagues and I cracked secret codes, narrowly averted an attack by a giant tiger, revisited the Princeton Record Exchange (where I spent $20 and came home w/ 6 CDs), and lamented the campus’ new density – a building has sprung up in almost every open space I remember.


No more pencils…

Well, summer vacation is finally upon me – now I only need to work full-time.

My first year in my PhD program I found myself thinking a lot about methods. Not all that surprising, given that one day I will have to defend my methods along with my ideas, but a pretty abstract space to be preoccupied with, nonetheless.

This spring I wrote a paper about all the techniques that the Social Sciences really need to be borrowing from industry and the hard sciences:

where I basically finally cashed the promissory note I scribbled 2 years ago. While it was an effort to write, looking back I am glad it now exists, and I really do understand the argument much better than when I started writing it. This is reassuring, since I am keenly aware of how difficult it is to capture people’s attention, and much of my writing will likely go unread. (I think this piece goes well w/ the Fall’s Out of Thin Air: Metaphor, Imagination, and Design in Communications Studies.)

Along the way I also created a little lesson plan around Nirvana’s Lithium & the Abilify commercial for the Teach, Think, Play weekend workshop with David Buckingham. And I presented the ZyprexaKills Campaign (slides, paper) in London at the Politics: Web 2.0 conference.


Mirror, Mirror On the Screen

It’s been a few weeks since I first started experimenting with the Play As Being practice, and ventured into Second Life. I continue to appreciate the performative brilliance of utilizing Second Life as a means to study the nature of consciousness, being, and reality. I am starting to imagine a metaphysical syllabus that incorporates virtual world immersion as an instrument for laying bare the everyday assumptions we make about consensual reality.

While I am learning something about myself as I project my identity into my avatar (it’s almost impossible not to, as veteran SL’ers will attest), I am also learning more about this world and its seductive attraction. Lots of Second Lifers believe that Second Life is just as real as Real Life (which, for mystics, might just mean that both are illusory), but I lean more towards the cautious opinion that Second Life is a mirror, albeit one with a great deal of depth.

Mirrors are quite magical and wonderful (7 years of altered luck, and all that). They can be used to see far and deep – think reflecting telescopes or the Michelson-Morley experiments – but they have also trapped a fair share of Narcissuses in their alluring reflections. So does SL represent the vanity of vanities? Maybe not, but considering that the energy consumption of a typical SL avatar now exceeds the energy consumption of an average real-world Brazilian, it is important that folks consider their time in SL well spent.

One upside of my recent journeys is that I now appreciate the research going on in this area much better. Here are two pieces from the Chronicle of Higher Ed reporting on research going on at Stanford’s Virtual Human Interactions Lab:

The claim that a user’s avatar imprints so strongly on their psyche is much easier for me to understand after spending some time in Second Life. I would have been far more skeptical of these findings if I hadn’t experienced the power of this medium first hand.

These findings and experiences really helped me imagine the potential impact of projects like Virtual Guantanamo (which I haven’t personally visited yet). I can say that when I stumbled across the Virtual World Trade Center I found the location distinctly eerie and spooky. Apparently I’m not alone, as the virtual storefronts on the ground floor are vacant here too. And, as I learned recently at a symposium at the Fashion Institute of Technology, SL is an ideal environment for teaching fashion and design. While SL has its share of casinos and lap dances, places like Rieul’s Zen Garden and the Interfaith Gardens show a real diversity of interests, consistent with the proposition of SL as a mirror.

As for the core experiment – sprinkling the pixie dust of reflection and contemplation throughout my day – I continue to be impressed by how malleable my awareness can be. In Pema’s words: “repetition is a powerful thing.” Over the past few weeks I have also enjoyed poking holes in reality while at the movies and travelling to foreign countries – ideas we have been repeating and playing with regularly in Dakini’s lovely Rieul teahouse.

Solstice Special

I haven’t posted much here lately, but I have been writing. I just finished my first semester as a doctoral student in the Journalism school and completed a flurry of term papers.

These two are from my pro-seminar with Michael Schudson, a class meant to introduce us to the history of the field and the faculty in the program. Our final assignment was to identify gaps in the field, which is a tough one, as all non-existence proofs are — especially in an interdisciplinary field, there will always be a fringe element occupying the gap.

People in the class interpreted the assignment in two ways – some chose to identify gaps, while others actually went out and tried to fill some. I took the opportunity to begin to pre-emptively answer the question I am sure to be challenged with in the years ahead – the ever-daunting methodological question – what on earth am I doing and how am I doing it?

Out of Thin Air: Metaphor, Imagination, and Design in Communication Studies

(and this was the midterm paper which got me thinking in this direction Transcending Tradition: America and the Philosophers of Communication).

I also took a wonderful class this semester at the New School taught by Paolo Carpignano (The Political Economy of Media – here is the syllabus). The class was all about the shifting relations between fabrication and communication, or more colloquially, work and play. We opened with Marx and Arendt and closed with Benkler and boyd. I took the opportunity to capture some of my experiences working on the Plone project before they fade from memory.

Fabricating Freedom: Free Software Developers at Work and Play

I am really glad to be done with the semester and am looking forward to a few weeks of “just” working full time!

Crowded Wisdom

This week I saw a presentation given by a member of the Yahoo!/Berkeley research team.

At the talk, Dr. Naaman demoed this unassuming tool that his group has been working on:

TagMaps (live demo, description)

I am really glad I went to the talk, since the demo helped me understand how sophisticated this tool really is. I had a definite ah-ha moment learning about all the new flavors of semantic information soon to be mined from the massive amounts of memories we are collectively recording.

During the talk I was reminded of this recent essay on Evolution and the Wisdom of the Crowds which explains how counter-intuitive these emergent properties are to our everyday experience. But, this seemingly teleological construction of semantic knowledge naturally emerges from a rich enough system, as the flickr research demonstrates.

To clarify what you are looking at here: no humans tuned or trained the system to teach it which landmarks are significant in these regions. The representation is computed by the aggregate processing of many, many tags. These tags are starting to provide enough information to disambiguate different senses of a word (based on the adjacent tags that are also present). Patterns are also discernible from the spatio-temporal information on these photos, and yearly events (e.g. BYOBW) have been detected and recognized by the system. Formerly unanswerable questions, like “What are the boundaries of the Lower East Side?”, now have a fuzzy answer of a sort, in the form of collective voting.
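The disambiguation idea can be sketched in a few lines. This is a toy illustration of tag co-occurrence voting, not the Yahoo! team’s actual algorithm: each sense of an ambiguous tag is associated with cue tags, and the other tags on the same photos vote for a sense (the senses and cue words here are invented for the example).

```python
from collections import Counter

# Hypothetical cue tags per sense of the ambiguous tag "apple".
SENSE_CUES = {
    "fruit": {"orchard", "pie", "cider"},
    "company": {"iphone", "mac", "store"},
}

def disambiguate(tag, photo_tag_sets):
    """Vote for a sense of `tag` using the tags co-occurring on each photo."""
    votes = Counter()
    for tags in photo_tag_sets:
        if tag not in tags:
            continue  # only photos actually carrying the tag get a vote
        for sense, cues in SENSE_CUES.items():
            votes[sense] += len(cues & tags)  # count overlapping cue tags
    return votes.most_common(1)[0][0] if votes else None

photos = [{"apple", "orchard", "pie"}, {"apple", "iphone"}]
sense = disambiguate("apple", photos)
```

Scaled from two photos to millions, this same counting-and-voting move is what lets the aggregate data converge on word senses, landmarks, and even fuzzy neighborhood boundaries.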

While the UI work here is neat, it pales in comparison to the jaw-dropping Photosynth demo presented at TED this year (though it does beat the pants off the current UI of pink dots on a map, which forces you to paginate over all the matching pictures in batches of 20). The widget is even available as a web service which you can feed your own data into.

But the real work here is going on behind the scenes. It’s being published and presented in CS contexts, just in case anyone thought this “social media” stuff was just for kids.

How flickr helps us make sense of the world: context and content in community-contributed media collections

There is certainly lots to digest here. It’s one thing for an algorithm to decide on the most representative photographs of the Brooklyn Bridge essentially based on popularity (though it’s a shame that avant-garde art photos will be automatically marginalized by this technique), but it’s quite another to imagine other important areas of discourse being regressed to the mean – an odd sort of leveling effect that is likely another manifestation of Jaron Lanier’s Digital Maoism.

The presenter did note that social media designers need to anticipate feedback effects, as when they launch a new tool and users adjust to the new conditions and modify their behavior accordingly (or begin to “game” the system to take advantage of it).

We are a long way from 1960’s AI and its conviction that the world is best modeled and represented as a series of explicit propositions.

Pedagogical Software

Literally. See my post on The Plone Blog:

Plone University
