Wednesday, December 30, 2009

The Decade as Farce and Tragedy

For music, the twenty-first century began in 1999. Jordan Knight, one of the spry, ingratiating kids from the by then odious New Kids on the Block, was attempting to revive his career, and in doing so, notes Tom Ewing of Pitchfork, “pointed most clearly at the future.” Ewing’s thesis is that “the worlds of pop, R&B and mainstream hip-hop were moving ever closer together, and a generation of stars, songwriters, and producers recognized an opportunity to reinvent how pop music sounded—and who listened to it.”

If pop was about to appropriate a new partner in crime and commerce, rock was about to enter its lean years. In a final moment of desperation critics fell over themselves to anoint the New York City band The Strokes the new gods of Rock, but their ironically titled 2001 debut Is This It was only the start of a long argument.

A battle that pitted on the one side defenders of Rock n’ Roll, the Rockists, and on the other the children of ’90s popular music, the Popists, would change the terms of critical discourse. Soon, you could listen to pop music guilt-free. Meanwhile, urban culture became mainstream culture, and white newscasters didn't blush when casually punctuating their telecasts with "bling" or "pimped-out!"

Then a plane flew into a building. The ensuing political vacuum led “the international community” to look to the West, where for much of the decade the only thing they could hear was a huge sucking sound. This lack of leadership infected global politics, a virus that most of the developed world contracted.

Parliamentary and legislative deadlock turned natural ruling parties into incoherent behemoths in Japan and Canada. While trying to join in economic and political union, European countries were fighting false wars to reclaim national identities that had already shifted. The “Muslim Problem” was, and is, primarily about a sclerotic welfare state coming to grips with diminishing returns. Economies weren’t growing fast enough or equitably distributing their gains. People were growing older, and godless -- so Islam seemed very much that rough beast.

And then there were wars. A cosmic struggle between good and evil was irresistible to an unlikely alliance of idealists on both the left and the right, and religious warriors in Central Asia were all too pleased to fight it. Seen more rationally, internal disputes within the Middle East and Central Asia, framed as world-historical in nature, were merely political and geographic squabbles over power.

Further East, India grew, and even further East, China ballooned. China’s economic growth, perhaps partly illusory, may turn out to be more of a problem for China, and the East in general, than for the world. But China as an ideological competitor, an alternate reality for managerial statecraft, will be one more contingency that haunts Francis Fukuyama. History goes on.

The climate was changing, only a lot faster. Green became an ideology, an industry, an epithet. For the poor there was a lot of cheap, fatty food, and obesity. For the well-off there was the moral superiority of eating better. Will and Grace was popular, and then gay marriage.

Men embraced extended adolescence, ultimate fighting, and poorer employment prospects; thankfully there were older women, "Cougars," ready to support their low ambitions and nurse their bruised egos. Because it was making less and less sense, there were a lot of commercials about what it meant to be a man. They didn't help.

People started watching themselves on television, and then started singing on television, and then started dancing on television. People also stopped watching television. They stopped reading newspapers and talking on landlines. What they started doing was, well, texting and blogging. They started looking at pornography on the Internet, perhaps one of the main reasons the Internet grew so quickly, and they started stealing. People stole music, and then they stole movies. It got so crazy that people even started stealing the Internet.

If 10,000 songs on your iPod seemed inconceivable in 2001, 100,000 applications on your iPhone, only nine years later, seemed excessive. There will be a whole generation of children who won’t know what a CD is. In the meantime, phones got smarter and people began to fear that Google was making us stupider. A whole industry sprung up around the idea that, somehow, culture was being “dumbed down,” becoming more “sensationalized.” We ate it up.

If you were paying attention, the social scientists assured us we were getting smarter. Yochai Benkler noticed that we were paying more attention to each other, had more opportunities to criticize and collaborate, and were ultimately sharing more and expanding our cognitive capabilities through larger networks. And then we started talking about networks. The feedback loop was getting tighter, louder.

For the closet Marxist this was an opening. Neoliberal institutions began to falter. Corporate fraud was the "new black." An inflation of information wound up devaluing both the information itself and the public figures that produced it. Radical skepticism was the default position and social trust and cohesion began to erode.

There was a tsunami, some earthquakes, and a hurricane. Suddenly the world was a dystopian nightmare. The African continent, in a word, regressed. Former leaders of the independence movements neglected to appreciate Lord Acton’s maxim that absolute power corrupts absolutely. A counterweight to the world’s industrial powers, the BRICs and the G-77 started speaking with a unified voice after Doha.

A black man became leader of the free world, while the last icon of popular culture died. Our guide through the farce and tragedy of the last decade was a caustic Jewish man with a half-hour show. From time to time he would disrupt the prevailing narrative, forcing media organizations to tacitly acknowledge their ridiculousness. One news organization wore its ridiculousness as a badge of honor and became the obnoxious mouthpiece of a misguided, though perhaps well-intentioned, president. Because we were dispossessed, power so completely a function of larger interests, entertainment was news and news was entertainment. It wasn’t a stretch for TMZ to become a legitimate news organization.

Just as the decade began with a crash, our collective entrepreneurial imagination far outpacing objective reality, it ends with a spectacular financial crash. Again our imagination, through mind-numbingly confused financial innovation, far outpaced material reality. Not only could you turn debt into a financial instrument, you could turn that financial instrument into thousands of other financial instruments. You could trade stocks that you didn’t own. In fact, you still can. Think about that for a moment.

So it was a little cute when former chairman of the Federal Reserve Alan Greenspan announced that he had identified “a flaw” in the system. Who knew self-interest wasn’t reason enough for financial firms to regulate themselves? Who knew! You only had to have been alive in 1998 to internalize that chestnut, which Mr. Greenspan, in fact, was.

How much risk were you willing to take on? If your model for tough-mindedness, discipline, and hard work was Tiger Woods, as it was for many, then there was an unpleasant parallel that capped off the “Aughts.” Choose your colorful pun, but ultimately we all took excessive risks. In short, we were all too human.

So what points to the future today? One word: Women. A subtle and under-appreciated shift in gender dynamics pervaded much of the decade. You could see this in labor market trends and educational attainment. Women started winning all of the careerist foot races and decided to start their lives before the golden seal of marriage. Ms. Bradshaw was a lagging indicator of a larger trend.

Old-school feminists resented the new school, unwilling to accept that the choice between family and children or a career was actually a false one. It turns out that there was a heterogeneity of things that women could want, and do, and not feel lesser. What does this mean? First, forget the work week. Forget the office. Women began to prove that productivity and efficiency didn't only exist in a company's org chart. And while the computer geeks laid the foundation for the next phase of global economics, it appears that women may be its key architects. World, hold on.

Monday, December 14, 2009

The New Geography

One summer weekend in the not too distant past, my girlfriend at the time was traveling down from Ottawa to visit me in Waterloo, Ontario, where I was living. I was really excited to see her, though much less so to be meeting her parents for the first time.

They would be flying in from a small town in Northern Alberta to meet her in Ottawa before driving down to Waterloo. A last-minute change of plans, though, would make me even more nervous. Now, only she and her father would be coming down to visit. After furiously cleaning my place that Friday evening in preparation for their arrival, I sat on my couch and stared at the ceiling.

The introduction, thankfully, turned out to be less painful than I was expecting. By the end of the evening the three of us were silently working away on a puzzle, a wise suggestion, I later found out, by my girlfriend’s mother. I would sleep on the couch and the next day we would all go to Marine Land.

Before I closed my eyes to go to sleep that evening, I thought it would be a good idea to look at a map, just to be clear on directions. The thought passed as quickly as it came, and, instead, I went to sleep.

The next morning we all loaded into the car and set off for Marine Land. My girlfriend gave me a big approving smile and a wink, signaling, I thought, that I was making a good impression.

An hour later we were going in the wrong direction.

Neither my girlfriend nor her father were familiar with Ontario highways, and it seemed obvious that I wasn’t either. Consulting and wrestling with my map from time to time, I could feel the weight of their gazes on me, the summer heat pinching at my skin. Marine Land may as well have been on the other side of the world.

A confluence of things has made what happened to me that summer day become less and less common. Not only has the ubiquity of geographic information systems like GPS increased our ability to locate and orient ourselves with much greater ease, there has been a broader shift in the way we interact with our public environments.

When Google Earth was first released in 2005, it gave users a new desktop tool for procrastination. The sheer novelty and wonder of being able to span the globe with the click of a mouse was thrilling. In one smooth glide you could fly over the Serengeti plains and find yourself looking down on Tokyo’s densely layered wards. A virtual, worldwide jaunt from the comfort of your own home should not have immediately struck anyone as groundbreaking, but with the introduction of fine-grained data, and a lot of it, that would soon change.

A new wealth of information is reshaping nearly all of our daily activities. Observing and analyzing the way we use energy has helped us use energy more efficiently. By closely tracking infections and diseases we’ve been able to identify epidemiological patterns and as a result reduce the rates of contraction and infection. Indeed, the threat of H1N1 has remained just that, a threat, much like Avian Flu and West Nile virus.

This wealth of information has also led to a variety of new types of conceptual mapping, ones that are showing us different ways of looking at information within public environments. But, more importantly, the scale of both the information and new technology has become inverted in two significant ways.

First, the type of technology used to map and process this information has become smaller. Whereas in the past mapping applications were vast, unwieldy and proprietary, more likely used for large scale industrial and commercial enterprises, today relatively inexpensive and powerful computers, advanced telephony, digital compasses, and optical camera sensors have made these applications readily available for personal use. The second inversion involves the information itself.

The simplicity and functionality of new smartphone technology has paved the way for an explosion of social input. Not only are governmental and commercial entities adding to this growing wealth of data; more fundamentally, the gathering of information has been opened up to anyone with the interest and the initiative. In the process, the rough and tumble of daily life has become far richer and more granularly elaborate.

Perhaps the best example of this is the usefulness of an application like Google Maps, where users are able to input a desired location and receive the distance, time, as well as the best possible routes for future trips. That convenience, once only the preserve of desktop computers, has migrated onto smaller, handheld technologies. Even then, getting directions is one thing. People have begun to use their cell phones and smartphones to draw from the layers of relevant and oftentimes interesting information once they reach their destination, opening up an opportunity for a whole new immersive experience.

A number of new smartphone applications are utilizing a software platform called Augmented Reality to enrich the user's experience of public environments. Augmented Reality is the idea that layers of useful information can be embedded in and added to the public spaces we encounter every day. Imagine pointing your phone's camera down a busy street in a downtown core and receiving a visual overlay of all of the available information. There could be a sale down the street at a bookstore you’ve never heard of, or a talk by the chairman of Infotronic on the third floor of that building on your right. Imagine getting a notification that part of your 401k or retirement savings was tied up in Infotronic. Suddenly, the environment has much more salience than you had ever expected. Instead of seeking out this information, a much more labor-intensive task, it may already be hanging in clouds of data all around you, ready to be discovered.
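At its core, that kind of lookup is a proximity query: compute the great-circle distance from the user to each geo-tagged entry and keep the nearby ones. Here is a minimal sketch in Python; the place names, coordinates, and radius are invented for illustration, and real AR browsers query far larger databases, but the haversine calculation at the heart of it is standard.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def nearby(user_lat, user_lon, tagged_points, radius_km=1.0):
    """Return geo-tagged entries within radius_km of the user, nearest first."""
    hits = sorted(
        (haversine_km(user_lat, user_lon, p["lat"], p["lon"]), i)
        for i, p in enumerate(tagged_points)
    )
    return [tagged_points[i] for d, i in hits if d <= radius_km]

# Hypothetical geo-tagged entries around a downtown core
points = [
    {"name": "bookstore sale", "lat": 43.6510, "lon": -79.3830},
    {"name": "tech talk",      "lat": 43.6495, "lon": -79.3870},
    {"name": "distant museum", "lat": 43.7180, "lon": -79.3650},
]

for p in nearby(43.6500, -79.3840, points):
    print(p["name"])
```

An AR browser would then attach each surviving entry to a screen position using the phone's compass heading, but the filtering step above is the part that turns raw geo-tagged data into "what is around me."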

Layar, touted as the world’s first Augmented Reality browser, is a new software application that attempts to integrate any available commercial or public information with geographic tags in your proximity. In effect, users are given an encyclopedic window onto the contours of their landscape through their mobile phones.

Currently, Layar is available on Android, Google’s mobile operating system, and the iPhone. Although Apple’s mobile operating system boasts a wide variety of applications, Android’s open-source model has been much quicker at incorporating the contributions of developers and users, expanding the functionality of mobile applications like AR. Juniper Research recently reported that by 2014 services supported by AR could generate as much as $732 million in revenue, with much of the infrastructure built from overlapping databases of crowd-sourced, geo-tagged information.

A recent New York Times article described the growing trend of harnessing crowd-sourced information in developing mapping applications. One non-profit, OpenStreetMap, organizes groups of volunteers in different cities throughout the world to help collect geographic data, slowly filling up informational gaps where they may exist. Much like Wikipedia, OpenStreetMap’s goal is to make maps freer and more accessible to the general public, without the restrictive licenses that many commercial firms apply to their own proprietary maps.

Always at the vanguard of tapping into the potential for social production, Google developed Map Maker in 2008 as a collaborative tool that allows users to construct their own regional maps. By layering this crowd-sourced information onto their existing databases, Google is scaling large amounts of information in a largely decentralized though shrewdly managed process.

Even as some express concerns about the reliability and quality of user generated maps, technology writer and blogger Tim Lee suggests that, “like any disruptive technology, the initial use of these maps probably won't be ones that directly compete with incumbents.” Mr. Lee’s writing has largely been concerned with the inflexibility of large, hierarchical organizations and their management styles. He regards Silicon Valley as a signal example for the way in which information should move within institutions, where ideas and information are openly shared and collaborative production is valued as a practice and not merely as a slogan.

Mr. Lee went on to add, “to be successful, these mapping products will need to find niches that the incumbents aren't serving.” For organizations with flexible management styles, there is an opportunity for the types of bottom-up innovation that large commercial firms are too slow-footed to capitalize on, since in many cases crowd-sourced maps are much quicker at providing real-time information.

OpenStreetMap 2008: A Year of Edits from ItoWorld on Vimeo.

Still, the notion that only commercially useful information enriches social life is a thin conception of citizenship. Another feature a number of these new mapping applications are facilitating is a renewed sense of civic spiritedness. With the ability to access a wide variety of public information, calls for transparency and accountability are starting to ring more clearly, which in turn is altering the dynamic between public representatives and the citizens they serve.

Recently, Sunlight Labs, a Washington-based non-profit aiming to digitize government data, developed an augmented-reality-enabled software application that gives interested citizens and public advocacy groups access to the location of government infrastructure disbursements. The application is currently being used to track the progress of the $787 billion stimulus bill in the US.

Earlier this month Sunlight Labs saw a few of its recommendations incorporated into the White House’s Open Government Directive, an administration initiative that will require executive departments and agencies to adhere to new principles of transparency, participation, and collaboration. Departments and agencies will now have to make a wide variety of their documents publicly available online and easy to search through. This may be one way, perhaps, of reducing the difficulty of navigating complex public information, one of the major causes of public apathy.

Yet even when more transparency forces public officials to fully account for their activities, there will inevitably be areas of civic concern that large bureaucracies won’t be able to efficiently manage. And where public or private institutions are either unable or unwilling to effectively intervene, the pervasiveness of group-forming technologies and applications is allowing citizens to intervene on their own behalf.

Clay Shirky, new media thinker and author of Here Comes Everybody: The Power of Organizing Without Organizations, has described how access to inexpensive technology has had a fundamental effect on social behavior. Shirky sees sharing and collaboration as key ingredients in resolving the perennially difficult problem of broad-based collective action, particularly when it arises outside of traditional institutions and organizational structures. In talks, Shirky has given crime wikis as an example of how group-forming civic initiatives can, if only provisionally, deal with collective action problems.

When Brazilian police were unwilling to publish complete geographic crime data in some high-crime areas, leading to the under-reporting of crimes by victims, Professor Vasco Furtado of the University of Fortaleza created WikiCrimes, a geographic database where victims can anonymously post the location and description of a crime they suffered. One of the challenges WikiCrimes faces, however, is the credibility of user reports.

In an email, Professor Furtado said WikiCrimes tries to ensure the reliability of its information by analyzing the reputation of its users through their interactions with other users. Even though a number of press organizations and governmental agencies have been recognized as respected certifiers, Furtado acknowledges that “the attribution of reputation to the users who are not qualified as certifier entities is fundamental.” He believes that this makes the system as a whole more reliable.
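A simple way to picture such a scheme, purely an illustrative sketch and not WikiCrimes' actual algorithm, is to give each user a reputation score, let confirmations from other users raise an author's score in proportion to the confirmer's own trustworthiness, and read a report's credibility off its author's score. All names and scoring rules below are invented.

```python
class ReportRegistry:
    """Toy model of reputation-weighted anonymous reporting.

    The scoring constants here are arbitrary; they only illustrate the
    idea that certified entities (press, agencies) carry more weight.
    """

    def __init__(self):
        self.reputation = {}   # user -> trust score in [0, 1]
        self.reports = []      # (author, description)

    def register_user(self, user, certified=False):
        # Certified entities start fully trusted; ordinary users start neutral.
        self.reputation[user] = 1.0 if certified else 0.5

    def submit(self, author, description):
        self.reports.append((author, description))

    def confirm(self, confirmer, author):
        # A confirmation nudges the author's reputation upward,
        # weighted by how trusted the confirmer is.
        boost = 0.1 * self.reputation[confirmer]
        self.reputation[author] = min(1.0, self.reputation[author] + boost)

    def credibility(self, report_index):
        author, _ = self.reports[report_index]
        return self.reputation[author]

registry = ReportRegistry()
registry.register_user("local_paper", certified=True)
registry.register_user("anon42")
registry.submit("anon42", "theft near central market")
registry.confirm("local_paper", "anon42")  # certified confirmation raises anon42's score
print(registry.credibility(0))
```

The design choice this sketch illustrates is the one Furtado describes: trust is not assigned by an authority alone but accumulates through interactions, so reports from uncertified users can still become credible over time.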

Today, WikiCrimes has over 6,000 users, 20 percent of whom have registered crimes. Its existence, Furtado noted, has forced some state governors to start publishing crime data online. And while law enforcement is still reluctant to work with WikiCrimes, Furtado says that the public debate over the application’s usefulness is a big cultural change. He hopes that in the future law enforcement will come to see WikiCrimes as a complementary tool in helping to fight crime.

French historian Alexis de Tocqueville once observed that “knowledge of how to combine is the mother of all other forms of knowledge; on its progress depends that of all the others.” In a culture where the integration of diverse types of information is leading to innovative software platforms and inventive social and commercial applications, the ability to combine existing forms of public knowledge is a crucial component of social and material prosperity.

The wisdom of crowds and the robustness of new collaborative social networks have allowed citizens to engage in broader civic and commercial initiatives, helping to stitch together the patchwork of rich and complex databases all around the world. What used to be asymmetries of information, where the general public had very little access to sources of public knowledge, have now become wider distributions of collective intelligence. Many hands have no doubt made the work lighter.

Many of these positive aspects, though, have unintended consequences. One pernicious effect is the yawning privacy and security gaps that now push the onus of protection onto each user. People are being forced to be more cautious with the types of information they volunteer and savvier in the ways they manage their online personas. Another troubling effect is the ability of large organizations to monetize the free work of eager contributors, a notion some are beginning to refer to as the "Hobby Economy."

The challenge for citizens, policy-makers, and private and public institutions will be to adapt their rules and laws to address these issues. But the broader implications are promising: technologies and applications that have the ability to scale the complexity of an astonishing amount of social data, and the potential for users to optimize on-the-spot decision-making and learning.

In essence, geographic data is about more than just a particular place; it is about the character and richness of the narrative that individuals invest in that place. Clusters of local knowledge are now being augmented by the idiosyncrasies of groups of citizens, adding new layers of context, from video, text, audio, and images, on top of old ones. Perhaps we’ve entered an age where search as a platform has led to discovery as a window, and by looking we can know not only where we are, but also what we want to do next.

For the next hour we drove around aimlessly. I could sense that my girlfriend was burning with embarrassment and anger. Very little was said. When we finally decided to drive back to my place, I was expecting the inevitable questions. Didn’t you have any idea where Marine Land was? Do you know how to read a map? At the time, my competence was definitely open to scrutiny. But somehow, none of these questions arose.

That afternoon we ate lunch in silence. My girlfriend’s father finally broke the tension by saying he thought Marine Land was ridiculous, and that it was better that we never made it there at all. I appreciated the gesture. That evening we worked on another puzzle together and joked about how silly the day had been, how tense and irritable we all were.

That night before I went to sleep I looked over the map. I closed my eyes and tried to think of where we could have gone wrong. I put the map down and looked up at the ceiling. It occurred to me, finally, that we could have just asked someone for directions. That would have been much easier.