For Music, the twenty-first century began in 1999. Jordan Knight, one of the spry, ingratiating kids from the by then odious New Kids on the Block, was attempting to revive his career, and in doing so, notes Tom Ewing of Pitchfork, “pointed most clearly at the future.” Ewing’s thesis is that “the worlds of pop, R&B and mainstream hip-hop were moving ever closer together, and a generation of stars, songwriters, and producers recognized an opportunity to reinvent how pop music sounded—and who listened to it.”
If pop was about to appropriate a new partner in crime and commerce, rock was about to enter its lean years. In a final moment of desperation critics fell over themselves to anoint the New York City band The Strokes the new gods of Rock, but their ironically titled 2001 debut Is This It was only the start of a long argument.
A battle pitting defenders of Rock n' Roll, the Rockists, against the children of '90s popular music, the Popists, would change the terms of critical discourse. Soon, you could listen to pop music guilt-free. Meanwhile, urban culture became mainstream culture, and white newscasters didn't blush when casually punctuating their telecasts with "bling" or "pimped-out!"
Then a plane flew into a building. The ensuing political vacuum led “the international community” to look to the West, where for much of the decade the only thing they could hear was a huge sucking sound. This lack of leadership infected global politics, a virus that most of the developed world contracted.
Parliamentary and legislative deadlock turned natural ruling parties into incoherent behemoths in Japan and Canada. While trying to join in economic and political union, European countries were fighting false wars to reclaim national identities that had already shifted. The "Muslim Problem" was, and is, primarily about a sclerotic welfare state coming to grips with diminishing returns. Economies weren't growing fast enough or equitably distributing their gains. People were growing older, and godless -- so Islam seemed very much that rough beast.
And then there were wars. A cosmic struggle over good and evil was irresistible to an unlikely alliance of idealists on both the left and the right, and religious warriors in Central Asia were all too pleased to fight it. Seen more rationally, internal disputes within the Middle East and Central Asia, framed as world-historical in nature, were merely political and geographic squabbles over power.
Further East, India grew, and even further East, China ballooned. Perhaps illusory, China's economic growth may turn out to be more of a problem for China, and the East in general, than for the world. But China as an ideological competitor, an alternate reality for managerial statecraft, will be one more contingency that haunts Francis Fukuyama. History goes on.
The climate was changing, only a lot faster. Green became an ideology, an industry, an epithet. For the poor there was a lot of cheap, fatty food, and obesity. For the well-off there was the moral superiority of eating better. Will and Grace was popular, and then gay marriage.
Men embraced extended adolescence, ultimate fighting, and poorer employment prospects; thankfully there were older women, "Cougars," ready to support their low ambitions and nurse their bruised egos. Because manhood was making less and less sense, there were a lot of commercials about what it meant to be a man. They didn't help.
People started watching themselves on television, and then started singing on television, and then started dancing on television. People also stopped watching television. They stopped reading newspapers and talking on landlines. What they started doing was, well, texting and blogging. They started looking at pornography on the Internet, perhaps one of the main reasons the Internet grew so quickly, and they started stealing. People stole music, and then they stole movies. It got so crazy that people even started stealing the Internet.
If 10,000 songs on your iPod seemed inconceivable in 2001, 100,000 applications on your iPhone, less than a decade later, seemed excessive. There will be a whole generation of children who won't know what a CD is. In the meantime, phones got smarter and people began to fear that Google was making us stupider. A whole industry sprang up around the idea that, somehow, culture was being "dumbed down," becoming more "sensationalized." We ate it up.
If you were paying attention, the social scientists assured us we were getting smarter. Yochai Benkler noticed that we were paying more attention to each other, had more opportunities to criticize and collaborate, and were ultimately sharing more and expanding our cognitive capabilities through larger networks. And then we started talking about networks. The feedback loop was getting tighter, louder.
For the closet Marxist this was an opening. Neoliberal institutions began to falter. Corporate fraud was the "new black." An inflation of information wound up devaluing both the information itself and the public figures who produced it. Radical skepticism became the default position, and social trust and cohesion began to erode.
There was a tsunami, some earthquakes, and a hurricane. Suddenly the world was a dystopian nightmare. The African continent, in a word, regressed. Former leaders of the independence movements neglected to appreciate Lord Acton's maxim that absolute power corrupts absolutely. As a counterweight to the world's industrial powers, the BRICs and the G-77 started speaking with a unified voice after Doha.
A black man became leader of the free world, while the last icon of popular culture died. Our guide through the farce and tragedy of the last decade was a caustic Jewish man with a half-hour show. From time to time he would disrupt the prevailing narrative, forcing media organizations to tacitly acknowledge their ridiculousness. One news organization wore its ridiculousness as a badge of honor and became the obnoxious mouthpiece of a misguided, though perhaps well-intentioned, president. Because we were dispossessed, power so completely a function of larger interests, entertainment was news and news was entertainment. It wasn't a stretch for TMZ to become a legitimate news organization.
Just as the decade began with the dot-com crash, our collective entrepreneurial imagination far outpacing objective reality, it ends with a spectacular financial crash. Again our imagination, through mind-numbingly confused financial innovation, far outpaced material reality. Not only could you turn debt into a financial instrument, you could turn that financial instrument into thousands of pieces of other financial instruments. You could trade stocks that you didn't own. In fact, you still can. Think about that for a moment.
So it was a little cute when former chairman of the Federal Reserve Alan Greenspan announced that he had identified "a flaw" in the system. Who knew self-interest wasn't reason enough for financial firms to regulate themselves? Who knew! You would only have had to be alive in 1998 to internalize that chestnut, which Mr. Greenspan, in fact, was.
How much risk were you willing to take on? If your model for tough-mindedness, discipline and hard work was Tiger Woods, as he was for many, then there was an unpleasant parallel that capped off the "Aughts." Choose your colorful pun, but ultimately we all took excessive risks. In short, we were all too human.
So what points to the future today? One word: Women. A subtle and under-appreciated shift in gender dynamics pervaded much of the decade. You could see it in labor market trends and educational attainment. Women started winning all of the careerist foot races and decided to start their lives before the golden seal of marriage. Ms. Bradshaw was a lagging indicator of a larger trend.
Old-school feminists resented the new school, unwilling to accept that the choice between family and children or a career was actually a false one. It turns out that there was a heterogeneity of things that women could want, and do, and not feel lesser for. What does this mean? First, forget the work week. Forget the office. Women began to prove that productivity and efficiency didn't only exist in a company's org chart. And while the computer geeks laid the foundation for the next phase of global economics, it appears that women may be its key architects. World, hold on.
Wednesday, December 30, 2009
Monday, December 14, 2009
The New Geography
One summer weekend in the not too distant past, my girlfriend at the time was traveling down from Ottawa to visit me in Waterloo, Ontario, where I was living. I was really excited to see her, though much less so to be meeting her parents for the first time.
They would be flying in from a small town in Northern Alberta to meet her in Ottawa before driving down to Waterloo. A last-minute change of plans, though, would make me even more nervous. Now, only she and her father would be coming down to visit. After furiously cleaning my place that Friday evening in preparation for their arrival, I sat on my couch and stared at the ceiling.
The introduction, thankfully, turned out to be less painful than I was expecting. By the end of the evening the three of us were silently working away on a puzzle, a wise suggestion, I later found out, by my girlfriend’s mother. I would sleep on the couch and the next day we would all go to Marine Land.
Before I closed my eyes to go to sleep that evening, I thought it would be a good idea to look at a map, just to be clear on directions. The thought passed as quickly as it came, and, instead, I went to sleep.
The next morning we all loaded into the car and set off for Marine Land. My girlfriend gave me a big approving smile and a wink, signaling, I thought, that I was making a good impression.
An hour later we were going in the wrong direction.
Neither my girlfriend nor her father was familiar with Ontario highways, and it seemed obvious that I wasn't either. Consulting and wrestling with my map from time to time, I could feel the weight of their gazes on me, the summer heat pinching at my skin. Marine Land may as well have been on the other side of the world.
A confluence of things has made what happened to me that summer day less and less common. Not only has the ubiquity of positioning technologies like GPS increased our ability to locate and orient ourselves with much greater ease, there has been a broader shift in the way we interact with our public environments.
When Google Earth was first released in 2005, it gave users a new desktop tool for procrastination. The sheer novelty and wonder of being able to span the globe with the click of a mouse was thrilling. In one smooth glide you could fly over the Serengeti plains and find yourself looking down on Tokyo's densely layered wards. A virtual, worldwide jaunt from the comfort of your own home should not have immediately struck anyone as groundbreaking, but with the introduction of fine-grained data, and a lot of it, that would soon change.
A new wealth of information is reshaping nearly all of our daily activities. Observing and analyzing the way we use energy has helped us use energy more efficiently. By closely tracking infections and diseases we’ve been able to identify epidemiological patterns and as a result reduce the rates of contraction and infection. Indeed, the threat of H1N1 has remained just that, a threat, much like Avian Flu and West Nile virus.
This wealth of information has also led to a variety of new types of conceptual mapping, ones that are showing us different ways of looking at information within public environments. But, more importantly, the scale of both the information and new technology has become inverted in two significant ways.
First, the type of technology used to map and process this information has become smaller. Whereas in the past mapping applications were vast, unwieldy and proprietary, more likely used for large scale industrial and commercial enterprises, today relatively inexpensive and powerful computers, advanced telephony, digital compasses, and optical camera sensors have made these applications readily available for personal use. The second inversion involves the information itself.
The simplicity and functionality of new smartphone technology has paved the way for an explosion of social input. Not only are governmental and commercial entities adding to this growing wealth of data, more fundamentally, the gathering of information has been opened up to anyone with the interest and the initiative. In the process, the rough and tumble of daily life has become far richer and more granularly elaborate.
Perhaps the best example of this is the usefulness of an application like Google Maps, where users are able to input a desired location and receive the distance, travel time, and best possible routes for a trip. That convenience, once only the preserve of desktop computers, has migrated onto smaller, handheld technologies. Even so, getting directions is one thing. People have begun to use their cell phones and smartphones to draw from the layers of relevant and oftentimes interesting information once they reach their destination, opening up an opportunity for a whole new immersive experience.
A number of new smartphone applications are using a technique called Augmented Reality to enrich the user's experience of public environments. Augmented Reality is the idea that layers of useful information can be embedded in and added to the public spaces we encounter every day. Imagine pointing the camera of your phone down a busy street in a downtown core and receiving a visual overlay of all of the available information. There could be a sale down the street at a bookstore you've never heard of, or a talk by the chairman of Infotronic on the third floor of that building on your right. Imagine getting a notification that part of your 401k or retirement savings was tied up in Infotronic. Suddenly, the environment has much more salience than you had ever expected. Instead of seeking out this information, a much more labor-intensive task, it may already be hanging in clouds of data all around you, ready to be discovered.
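At its core, that overlay is just a proximity query: given where the phone is, which geo-tagged items are close enough to show? Here is a minimal sketch of that step in Python; the place names, coordinates, and notes (including "Infotronic," the fictional company from the example above) are invented for illustration, not taken from any real AR application.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Poi:
    name: str
    lat: float
    lon: float
    note: str

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))  # 6,371,000 m: mean Earth radius

def nearby_layers(user_lat, user_lon, pois, radius_m=250):
    """Keep only the geo-tagged items close enough to overlay on the camera view."""
    hits = [(haversine_m(user_lat, user_lon, p.lat, p.lon), p) for p in pois]
    return sorted([h for h in hits if h[0] <= radius_m], key=lambda h: h[0])

# Invented example data: a bookstore sale and the (fictional) Infotronic talk.
pois = [
    Poi("Corner Books", 43.4516, -80.4925, "weekend sale, 20% off"),
    Poi("Infotronic HQ", 43.4530, -80.4900, "chairman's talk, 3rd floor, 2 pm"),
]
for dist, poi in nearby_layers(43.4520, -80.4920, pois):
    print(f"{poi.name} ({dist:.0f} m away): {poi.note}")
```

A real AR browser would add a compass bearing and the camera's field of view to decide where on the screen each item sits, but the distance filter is the part that turns a city's worth of data into the handful of things worth showing.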
Layar, touted as the world’s first Augmented Reality browser, is a new software application that attempts to integrate any available commercial or public information with geographic tags in your proximity. In effect, users are given an encyclopedic window onto the contours of their landscape through their mobile phones.
Currently, Layar is available on Android, Google's mobile operating system, and on the iPhone. Although Apple's mobile operating system boasts a wide variety of applications, Android's open-source model has been much quicker at incorporating the contributions of developers and users to expand the functionality of mobile applications like AR browsers. Juniper Research recently reported that by 2014 services supported by AR could generate as much as $732 million in revenue, building much of their infrastructure from overlapping databases of crowd-sourced and geo-tagged information.
A recent New York Times article described the growing trend of harnessing crowd-sourced information in developing mapping applications. One non-profit, OpenStreetMap, organizes groups of volunteers in different cities throughout the world to help collect geographic data, slowly filling up informational gaps where they may exist. Much like Wikipedia, OpenStreetMap’s goal is to make maps freer and more accessible to the general public, without the restrictive licenses that many commercial firms apply to their own proprietary maps.
Always at the vanguard of tapping into the potential for social production, Google developed Map Maker in 2008 as a collaborative tool that allows users to construct their own regional maps. By layering this crowd-sourced information onto their existing databases, Google is scaling large amounts of information in a largely decentralized though shrewdly managed process.
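The layering itself raises an obvious question: how do you fold contributed points into a base map without duplicating what is already there? The sketch below is not Google's actual pipeline, only a toy illustration of one way to reconcile the two layers, using a rough name match plus a distance check; all data and thresholds are invented.

```python
from difflib import SequenceMatcher
from math import radians, sin, cos, asin, sqrt

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def same_place(a, b, max_gap_m=30, name_threshold=0.8):
    """Treat two entries as duplicates if their names are similar and they sit close together."""
    name_match = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio() >= name_threshold
    return name_match and distance_m(a["lat"], a["lon"], b["lat"], b["lon"]) <= max_gap_m

def merge_layers(base, contributed):
    """Fold user-contributed places into the base layer, skipping likely duplicates."""
    merged = list(base)
    for entry in contributed:
        if not any(same_place(entry, existing) for existing in merged):
            merged.append(entry)
    return merged

# Invented example: one contribution duplicates an existing entry, one is genuinely new.
base = [{"name": "City Hall", "lat": 43.4643, "lon": -80.5204}]
contributed = [
    {"name": "city hall", "lat": 43.4644, "lon": -80.5203},    # duplicate, dropped
    {"name": "Victoria Park", "lat": 43.4480, "lon": -80.4920}, # new, kept
]
print([p["name"] for p in merge_layers(base, contributed)])
```

The "shrewdly managed" part of the real process is everything this sketch leaves out: review queues, trust signals, and conflict resolution when contributors disagree.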
Even as some express concerns about the reliability and quality of user generated maps, technology writer and blogger Tim Lee suggests that, “like any disruptive technology, the initial use of these maps probably won't be ones that directly compete with incumbents.” Mr. Lee’s writing has largely been concerned with the inflexibility of large, hierarchical organizations and their management styles. He regards Silicon Valley as a signal example for the way in which information should move within institutions, where ideas and information are openly shared and collaborative production is valued as a practice and not merely as a slogan.
Mr. Lee went on to add, "to be successful, these mapping products will need to find niches that the incumbents aren't serving." For organizations with flexible management styles, there is an opportunity for the types of bottom-up innovation that large commercial firms are too slow-footed to capitalize on, since in many cases crowd-sourced maps are much quicker at providing real-time information.
OpenStreetMap 2008: A Year of Edits from ItoWorld on Vimeo.
Still, the notion that only commercially useful information enriches social life is a thin conception of citizenship. Another emerging feature of a number of these new mapping applications is a renewed sense of civic spiritedness. With the ability to access a wide variety of public information, calls for transparency and accountability are starting to ring more clearly, which in turn is altering the dynamic between public representatives and the citizens they serve.
Recently, Sunlight Labs, a Washington-based non-profit aiming to digitize government data, developed an augmented-reality-enabled software application that gives interested citizens and public advocacy groups access to the location of government infrastructure disbursements. The application is currently being used to track the progress of the $787 billion stimulus bill in the US.
Earlier this month Sunlight Labs saw a few of its recommendations included in the White House's Open Government Directive, an administration initiative that will require executive departments and agencies to adhere to new principles of transparency, participation, and collaboration. Departments and agencies will now have to make a wide variety of their documents publicly available online and easy to search through. This may be one way, perhaps, of reducing the difficulty of navigating complex public information, one of the major causes of public apathy.
Yet even when more transparency forces public officials to fully account for their activities, there will inevitably be areas of civic concern that large bureaucracies won't be able to efficiently manage. And where public or private institutions are either unable or unwilling to effectively intervene, the pervasiveness of group-forming technologies and applications is allowing citizens to intervene on their own behalf.
Clay Shirky, new media thinker and author of Here Comes Everybody: The Power of Organizing Without Organizations, has described how access to inexpensive technology has had a fundamental effect on social behavior. Shirky sees sharing and collaboration as key ingredients in resolving the perennially difficult problem of broad-based collective action, particularly when it arises outside of traditional institutions and organizational structures. In talks, Shirky has given crime wikis as an example of how group-forming civic initiatives can, if only provisionally, deal with collective action problems.
When Brazilian police were unwilling to publish complete geographic crime data in some high-crime areas, leading to the under-reporting of crimes by victims, Professor Vasco Furtado of the University of Fortaleza created WikiCrimes, a geographic database where victims can anonymously post the location and description of a crime they suffered. One of the challenges WikiCrimes faces, however, is the credibility of user reports.
In an email, Professor Furtado said WikiCrimes tries to ensure the reliability of its information by analyzing the reputation of its users through their interactions with other users. Even though a number of press organizations and governmental agencies have been recognized as respected certifiers, Furtado acknowledges that "the attribution of reputation to the users who are not qualified as certifier entities is fundamental." He believes that this makes the system as a whole more reliable.
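The details of WikiCrimes' reputation model aren't spelled out here, but the idea Furtado describes can be illustrated with a toy score: weight each report by the standing of the users who vouch for it, counting recognized certifier entities more heavily. The field names, weights, and users below are all invented for illustration and are not WikiCrimes' actual algorithm.

```python
def report_credibility(report, reputations, certifiers, base=0.3, cap=0.99):
    """Toy credibility score for a crowd-sourced report.

    `report` names a reporter and the users who confirmed it; `reputations`
    maps user id -> a 0..1 score built up from past interactions; `certifiers`
    is the set of recognized institutions whose confirmations count for more.
    """
    score = base + 0.5 * reputations.get(report["reporter"], 0.0)
    for user in report.get("confirmed_by", []):
        weight = 0.3 if user in certifiers else 0.1  # certifier entities weigh more
        score += weight * reputations.get(user, 0.5)
    return min(score, cap)

# Invented example: an anonymous report confirmed by a trusted user and a press outlet.
reputations = {"ana": 0.9, "local_press": 1.0, "anon42": 0.2}
certifiers = {"local_press"}
report = {"reporter": "anon42", "confirmed_by": ["ana", "local_press"]}
print(round(report_credibility(report, reputations, certifiers), 2))  # 0.79
```

The point of any scheme like this is the one Furtado makes: a report from an unknown, anonymous user can still become credible once people and institutions with established reputations stand behind it.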
Today, WikiCrimes has over 6,000 users, 20 percent of whom have registered crimes. Its existence, Furtado noted, has forced some state governors to start publishing crime data online. And while law enforcement is still reluctant to work with WikiCrimes, Furtado says that the public debate over the application's usefulness is a big cultural change. He hopes that in the future law enforcement will be able to see WikiCrimes as a complementary tool in helping to fight crime.
French historian Alexis de Tocqueville once observed that "knowledge of how to combine is the mother of all other forms of knowledge; on its progress depends that of all the others." In a culture where the integration of diverse types of information is leading to innovative software platforms and inventive social and commercial applications, the ability to combine existing forms of public knowledge is a crucial component of social and material prosperity.
The wisdom of crowds and the robustness of new collaborative social networks have allowed citizens to engage in broader civic and commercial initiatives, helping to stitch together the patchwork of rich and complex databases all around the world. What used to be asymmetries of information, where the general public had very little access to sources of public knowledge, have now become wider distributions of collective intelligence. Many hands have no doubt made the work lighter.
Many of these positive aspects, though, have unintended consequences. One pernicious effect is the yawning privacy and security gap that now pushes the onus of protection onto each user. People are being forced to be more cautious with the types of information they volunteer and savvier in the ways they manage their online personas. Another troubling effect is the ability of large organizations to monetize the free work of eager contributors, a notion some are beginning to refer to as the "Hobby Economy."
The challenge for citizens, policy-makers, and private and public institutions will be to adapt their rules and laws to address these issues. But the broader implications are promising: technologies and applications that have the ability to scale the complexity of an astonishing amount of social data, and the potential for users to optimize on-the-spot decision-making and learning.
In essence, geographic data is about more than just a particular place; it is about the character and richness of the narrative that individuals invest in that place. Clusters of local knowledge are now being augmented by the idiosyncrasies of groups of citizens, adding new layers of context, from video, text, audio, and images, on top of old ones. Perhaps we've entered an age where search as a platform has led to discovery as a window, and by looking we can know not only where we are, but also what we want to do next.
For the next hour we drove around aimlessly. I could sense that my girlfriend was burning with embarrassment and anger. Very little was said. When we finally decided to drive back to my place, I was expecting the inevitable questions. Didn’t you have any idea where Marine Land was? Do you know how to read a map? At the time, my competence was definitely open to scrutiny. But somehow, none of these questions arose.
That afternoon we ate lunch in silence. My girlfriend’s father finally broke the tension by saying he thought Marine Land was ridiculous, and that it was better that we never made it there at all. I appreciated the gesture. That evening we worked on another puzzle together and joked about how silly the day had been, how tense and irritable we all were.
That night before I went to sleep I looked over the map. I closed my eyes and tried to think of where we could have gone wrong. I put the map down and looked up at the ceiling. It occurred to me, finally, that we could have just asked someone for directions. That would have been much easier.
Monday, November 09, 2009
The Fix
Embedded in last night's season finale of Mad Men are two parallel solutions, one commercial, the other artistic.
Much has been made about the price tag of each meticulously crafted episode. At over $2 million a show Mad Men cost a staggering amount to produce, and naturally, the money men at AMC have let slip that a fourth season may be too expensive for the network to green-light. Matt Weiner, the show’s creator, hinted as much in the press pool backstage at this year's Emmys shortly after accepting the award for best drama. So the question is, will Mad Men survive?
The answer might be yes. It turns out that the staff at Sterling Cooper, the ad agency the show revolves around, is getting smaller. Last night, through a series of cloak and dagger machinations, Roger Sterling, Don Draper and Bert Cooper jumped ship to start their own agency. The twists were comic and quick moving. Peggy Olson and Pete Campbell, initially skeptical, were briskly wooed along, shortly followed by Harry Crane and Joan Harris. Setting up shop in a hotel, the newer, leaner agency provides the commercial solution that might save Mad Men.
Against the backdrop of this corporate tumult, Matt Weiner was somehow able to exceed a few expectations. Last week's blockbuster episode was perhaps the saddest Mad Men ever. Not only did it take in the assassination of JFK, the Draper marriage also suffered a mortal blow. The obvious metaphor of the end of Camelot is invoked when we learn in the same episode that Betty no longer loves Don. After an episode so bleak, a letdown was in order.
Last night's episode, however, looked forward. Things were laid bare, and the bitterness and distancing between Don and Betty were given a more immediate and physical aspect. But the feeling was one of motion. While much of this season has been a slow walk through mannered domesticity and small, interior lapses, the season finale was a fast jog toward a transition, evoking the melancholy of an ending and the nervous rush of a new beginning. This is the artistic solution that gives season four of Mad Men newer possibilities. It is certainly an achievement to get two parallel lines to meet, and Matt Weiner may have done that last night.
At the end of the episode we see Don Draper pulling his suitcases out of the trunk of a cab and making his way into what is likely his new home. The music that plays as the camera pulls away and the screen fades to black captures that thrilling uncertainty and wonder of a new start.
Tuesday, August 25, 2009
In Other Magazines
Slate has broken my heart. They've decided to discontinue the "In Other Magazines" column. When the week came to a close, "In Other Magazines" cobbled together a list of the most interesting feature stories "in other magazines", directing readers to must-reads and steering them clear of must-misses. Insofar as it was a taste-making and agenda-setting enterprise, it was a valuable service.
Over the weekend I would settle on two or three long-form articles to get a broader perspective on a particular social debate. Perhaps the most illuminating piece of writing I came across this summer was Atul Gawande's New Yorker article on the regional cost disparities of medical procedures in the state of Texas.
Comparing two demographically similar towns only 800 miles apart, Gawande found that the per-capita cost of Medicare in McAllen, at roughly $15,000, was close to double that of El Paso, at $7,504. The conclusion Gawande draws is that culture matters. Medical professionals in McAllen had gotten into the habit of looking at the health care system as a service industry, collecting fees for more and more procedures. Patient care was an afterthought.
For a time during the summer Gawande's piece became the go-to primer on the dysfunction of the United States health care system. Even President Obama, just as the House and Senate were preparing to draft various pieces of health care legislation, implored members of Congress to read the article. It's likely they didn't. But if they wanted to they could have happened upon it "In Other Magazines". Alas.
But with its demise Slate has inaugurated "The Slatest", the news aggregator to rival all other news aggregators. Collecting the top stories from the top newspapers, The Slatest will at the very least offer breadth. Most interesting, though, is what Slate editor David Plotz has to say about the news cycle as it exists today:
Overnight, newspapers launch the news. They publish stories clarifying the events of yesterday; they break their own investigative stories; they print zeitgeist-defining feature articles and op-eds. The morning brings Phase 2, when Web media reacts to the news. Bloggers and other sites respond to the news that broke overnight, and newsmakers push back against or try to exploit these stories. Phase 3, the buildup, comes in the afternoon, as the events of the day unfold—congressional action, a presidential gaffe, turmoil in Asia. The media break this news, and analyze how it fits together with yesterday's top stories. Opinion makers try to shape how the day's events will play on the night's cable shows and in tomorrow's newspapers. The next morning, it all starts over again.
This is crucial. The Slatest will attempt to capture this process and document what the news cycle has become, a struggle by public interest groups to frame the social, political, and economic narratives of the day -- journalists and media organizations, in turn, synthesizing and refereeing these struggles. I hope amongst all of the news and events Slate doesn't forget what's going on in other magazines. They have important things to say too.
Labels: Media
Wednesday, August 19, 2009
Notes on "Free Markets"
I
The "Free Market" isn't really real. It never was. Even venerable classically liberal economists like David Ricardo and Adam Smith understood this proposition. Markets exist as arenas of exchange that facilitate the trade of goods and services between financiers, producers and consumers. Each one of these agents is a real live human being, or a legal fiction like the corporation, embedded in a number of different social institutions and, if we pay any attention to simple social organization, compelled to act with a minimum level of cooperation and to follow a rudimentary set of norms and rules that both condition behavior and require specific levels of good-faith performance. Courts of law are perhaps the best institutional example.
The state guarantees the enforcement of contracts and the protection of property rights. The fancifully abstracted notion that governments merely get in the way of contract or property disputes is to a certain extent misguided, often overlooking the alternative scenario of low-level social disruption or outright public disorder. Markets require the trust of civic legal institutions, tacitly abide by their rules, and prosper to the extent that they are seen as socially legitimate. Which, with respect to certain activities on the margins, for example the sale of marijuana, is always an ongoing and perennially contested socio-political debate. If you live in the real world, social institutions that regulate market activities are an inescapable reality.
It is also widely recognized that while markets operate within the context of these social institutions, they are also continually shaping their development. When we look at the growth of the post-war automobile industry we see that urban and suburban planning mirrored the growing need for automobile use, and that the proliferation of single-family homes created a new market for a variety of household consumer goods, from refrigerators to televisions to bed frames to coffee makers to lamps. The "Free Market" was responding to trends in social development even as it was also determining them, and with the advent of the age of advertising the market began to shrewdly cater to the effervescence of a particular new form of mass, modern culture.
But inside the complicated machine of modern society, specialized commercial industries, particularly in aerospace, the medical and pharmaceutical sciences, and agriculture, among others, worked within the ambit of government regulatory agencies, agencies that set the guidelines of market interaction and exchange. Quite naturally, though, these industries began to capture those regulatory agencies, given their own proprietary and specialized expertise. Oversight became a form of volunteerism, where professional and regulatory bodies crafted, most often self-interestedly, the rules of the game.
Many of today's market inefficiencies exist because markets aren't free. Large professional bodies like the Canadian Medical Association enjoy positions of informational asymmetry; civil bureaucracies over-rely on their expertise and positional status and come to accept their recommendations as sacrosanct. An alternative form of health service like naturopathy has been slowly but steadily expanding into the health care market as it has continued to gain social respectability. This, no doubt, has taken a considerable amount of social and political advocacy, along with a substantial amount of capital. "Free Markets" have all types of access costs, and we'd be fooling ourselves if we thought otherwise.
II
Friedrich Hayek's insight that collectively planned economies lack the informational capacity to anticipate prices and consumer trends across an incredibly unwieldy patchwork of decentralized markets is likely the strongest argument against most forms of command socialism. Distributing and allocating resources is something market capitalism has shown tremendous prowess in undertaking. But we're not shadowboxing with the ghost of Lenin. If you live in a western industrialized country, you live with a system of rules that acts as a bulwark for your personal wellbeing.
Hayek understood the necessity of the rule of law, and of sets of social and civic institutions that establish the normative precepts for social behavior and trust. At a very crude level the state is a collection of the civil institutions, families, religious, scientific, and cultural associations, markets, and municipal bodies that normalize and sanction contemporary conduct. These are all manifestations of "A Culture" and express themselves in all our social and linguistic interactions and exchanges. Markets do not exist in a vacuum outside of them.
III
If we accept that the market is a carefully staged competition that operates under rules its participants, to varying degrees, attempt to follow, then it's easy to see why some misunderstand its essential nature. For some on the Left, competition is the inherent problem. Even while some aspire to equalize opportunity, the objective goal is to equalize outcomes. This would be incredibly self-defeating, given the diversity not only of human capacities but also of human aspirations. There isn't only one market, there are many; and there isn't only one status game, there are an innumerable number of them.
For some on the Right, however, there is an almost ahistorical myopia about the reality of first-order injustices, those that have persisted over generations. The "Free Market" favors the most socially and physically capable, the thinking goes. Because racism and sexism supposedly no longer exist, their long shadows aren't being cast onto contemporary debates about race and gender. Equality of opportunity, it follows, equals not only equality of access -- and some might even suggest over-access -- it should mean equality of outcomes. And since empirically this isn't always the case, a far more pernicious claim about genetic inability is tacitly entertained. It's as though the average male height of 5'6" in 1900 had increased to 5'10" in 2003 in a social and historical vacuum.
IV
Ultimately, the competitions that the market stages are neither free nor entirely efficient. They pit against one another the monopolies, the statutorily regulated agencies, the public and private interest groups, me and you. They represent the largest competition of all: the contest to democratically seize the state through argument, debate, enticement, persuasion, and now, more unsavorily, astounding sums of capital. They represent, quite simply, the essence of politics.
Tuesday, June 02, 2009
The Problem with Complexity
Thomas Homer-Dixon's 2006 book The Upside of Down: Catastrophe, Creativity and the Renewal of Civilization was a sobering diagnosis of the incredible challenges that modern societies face. His general thesis was that complex societies ultimately have trouble dealing with ever greater degrees of complexity, and that these stresses tend to increase social, political, and economic disruption. Homer-Dixon identifies four different stresses:
- population, both in growth rates among the rich and the poor and the incredible growth of megacities in poor countries
- energy, especially the growing scarcity of conventional oil
- environmental, particularly climate change and global warming
- economic, disparities in income between developed and developing countries as well as income inequality within countries
Indeed, none of these stresses are mutually exclusive, which is the major concern among a great many politicians, policymakers and citizens. But perhaps the most important part of the equation is energy. To lay out his argument Homer-Dixon uses a powerful historical example. The Roman Empire required an incredible amount of "high-quality" energy inputs, primarily in the form of grain, as the basis for its administrative and industrial integrity. As the complexity required to organize its social, political and economic systems rose, so too did the "high-quality" energy inputs needed to run those systems. Once the quality of energy inputs decreased, the complexity of Roman institutional systems began an irreversible decline.
What interests me most about Homer-Dixon's thesis is his notion of complexity. When confronted with multiple crises, as we face today, societies have generally tended to create newer layers of complexity. In the political and policy responses to the Great Depression, the New Deal essentially offered a series of technical remedies to deal with the complexities of a phenomenally powerful yet socially pernicious market capitalism. The Civil Rights Act is another example, although it was obviously responding to other types of societal and political stresses. Still, new levels of bureaucracy seem to be the most natural response to existential crises, and legislatively enacted distributions, whether of physical goods or procedural recognitions of status claims, become the ultimate remedy.
An incredible market failure has socialized risk, leaving governments around the world the thankless task of reorganizing a whole array of institutions. How they manage this reorganization is an issue riddled with political, social, economic and philosophical dimensions. Dimensions on top of dimensions; dimensions embedded in dimensions. If complexity is the problem, how is it possible to deal with our present challenges? I hope to explore this question further.
Tuesday, August 12, 2008
No Hope
IT'S DIFFICULT to quantify exactly how devastating a loss for the Democrats in the November presidential elections would be for me. As a Canadian ex-pat living and working in Japan I'm clearly not in a position to be directly impacted by their electoral misfortunes. As a "citizen of the world", the consequences are somewhat obvious: a continuing decline in American prestige and influence; the rise of China and Russia as contenders for world hegemony. But for my physical health, a loss for the Democrats in November won't be good.
On the night of the 2004 presidential election, I arrived home from work eager to follow the results as they came in. I knew from the experience of the “2000 debacle” that it could end up being a long night. But as I sat down to follow the news a feeling that had been foreshadowing itself all day began to overwhelm me.
A pall of cloudiness moved into my head like a cold front. I could feel building waves of nausea rocking in my stomach, a physical weakness entering my joints. It was all I could do to get to bed. I remember not being able to think straight, but wanting to; wanting to for the sake of the election. I remember not caring that John Kerry wasn't even remotely presidential, as I had all through the primaries and during the presidential debates. Oh well. I closed my eyes and saw Bush's face, a sly smile, then lost consciousness.
I’ve attempted to reconstruct exactly what happened that night to friends and family, yet somehow it always ends up sounding fanciful. That I could fall into a delirious fever dream the night of the 2004 elections, a dream in which I entered the future lives of my friends, coasting from one continent to another, seemed a little absurd. But that’s exactly what happened.
IN A VIVID SCENE I was following a close friend down a pristine sidewalk, a lovely girl on his arm. It occurred to me that this was his wife, and that he looked a little older and more filled out. I recall troubling over a cognitive dilemma: I felt neither fully asleep nor fully awake. Perhaps this is what purgatory felt like. As if in a waking dream, I started reaching out and calling for his attention. This didn't seem to be working, and suddenly I was mesmerized by a row of potted plants in front of a café with a black tarp awning. The next moment I fell through the sidewalk into empty, black space, unintelligible physical forces working against my body. I stood somewhere else now.
Another friend of mine was behind a large office desk, and behind him a beautiful city skyline twinkled at dusk. His eyes were red from what appeared to be overwork. I recall standing in front of him thinking he was surprised to see me there. A brusque, powerful woman walked into the room and began speaking, but I couldn't understand what she was saying. It sounded like English, only harsher and more aggressive. German, maybe. She didn't seem upset; she seemed controlled and strangely fluid. And it struck me as odd that she had no idea I was in the room, standing right beside her. I could see the skyline starting to disappear. Empty black space filled the room, and gradually the only thing I could feel were beads of sweat sliding down my face. I felt ill from the motion. And for the rest of the night I felt like I had traveled many places, always with the feeling that it was a possible future.
When I woke up in the morning I was too sick to go to work, so I went back to sleep. The next time I woke up I realized that something may be hanging in the balance, so I turned on my computer and checked the news and George W. Bush had won. I went back to bed again, though this time it felt like I didn’t wake up, and I was sick for a week.
Looking backward, contextualizing that sickness always felt opportunistic and convenient in light of the last four years of what most would rightly characterize as a global sickness. You cannot be hyperbolic in saying that the United States has been driven into the ground by an incredible failure of leadership. And not only that: Conservatism as a governing ideology has exhausted, at least for now, its usefulness. A political philosophy that is hostile to government in almost every respect will destroy government, from without and within.
AMERICAN EXCEPTIONALISM TELLS US that America is essentially a force for good in the world, that a unique social and political experiment created a prosperous and democratic republic. The recent historical evidence doesn't bear this claim out, and America's legacy of slavery shatters it entirely. But perfection isn't in the standing still, it is in the striving forward. Successive generations of social and political movements have pushed America closer to its ideals, that every man is created equal and endowed by their creator with the rights to life, liberty, and the pursuit of happiness. The problem with American exceptionalism, though, and with all credos of nationalism, is their insularity. In difficult times they will always draw inward and justify their relevance.
International opprobrium and criticism leveled against the United States in the last four years has grown so loud and persistent as to be meaningless. Yet the more grounded criticisms stick: that America has turned away from multilateralism and bullied allies into submission; that they prop up an autocrat in one instance and topple him in another; that they double deal and operate on the fringes of internationally recognized law; that they are, even worse, arrogant. This is not a new development in American statecraft. It has always been a feature in the landscape. What has made it appear more acute in recent times is the outright bellicosity of the Bush administration, provoked rightly or wrongly by the terrorist attacks of 9/11. It is probably fitting that George W. Bush will be remembered as both the apex and the nadir of American exceptionalism.
THERE IS ONE SCHOOL of thought in American foreign policy that believes in national greatness and an unending battle against evil around the world. The soft power of diplomacy and the strategy of containment won't satiate this belief. Militarism and the continual threat of hard power animate this worldview. In a 1997 article for Reason magazine looking at the disconsolate Conservative movement, Virginia Postrel and James K. Glassman mocked Conservatives' complaints about the state of the country. The economy was booming, consumer confidence was high, Republicans controlled both houses of Congress, and despite their vehement disdain for Bill Clinton, Conservatives had already secured a number of legislative concessions from him, most notably welfare reform. Thus, with the Cold War over, some Conservatives felt idle and restless. They wanted to "offer their own governing doctrine, 'the appeal to American greatness' -- a kind of wistful nationalism in search of a big project", Postrel and Glassman wrote. Tellingly, Postrel and Glassman mused that this big project might entail "looking for the next war" or "hope for another great Depression", fanciful ambitions at the time, but in hindsight ambitions with dangerous implications once Republicans came to hold the presidency.
And now this school of thought pursues national greatness in Iraq, and its hawkishness colors particular foreign policy positions. They will not negotiate with Iran (but are beginning to), they will not negotiate with North Korea (and yet they have), and they would clamor for aggressive action against China and Russia (but are in no position to). There is absolutely no sense of priorities, or of grand strategy. This is the height of magical thinking. Neoconservatives, along with the compliant Republicans and Democrats who pushed America into Iraq, represent this worldview. John McCain, the Republican presidential nominee, represents this worldview. For him, there is nothing presently wrong with American exceptionalism. The American electorate, more broadly, has affirmed this worldview for the last 40 years.
There is another school of thought in American foreign policy. It is more pragmatic and realist. It believes that the soft power of persuasion through economic and political concession will encourage countries to work to their own self-interest. It believes that international order and incremental progress will move the world to greater prosperity for all. Its flaw is that it doesn’t deal with belligerents as thoroughly as the Neoconservative worldview would have it, and thus condones territorial unrest. But there are too many fires in the world to put out. This worldview, judicious and respectful of American exceptionalism’s salutary influence, believes that multilateral institutions should act as a reasonable, though not ultimate, check on a country’s international activities, and that the military option is the final option, not the starting premise. People who adhere to this school of thought understand that America has to begin regaining the international capital it so poorly squandered in the last eight years. It must prove once again that it is humble and deserving of the world’s respect; that it is prepared to lead by example. Democratic presidential nominee Barack Obama represents this worldview.
IT IS SOMETHING OF a cosmic irony that the opportunity to change course is such a risky one for Americans. With nearly 80% of Americans believing the country is going in the wrong direction and polls showing Democrats overwhelmingly ahead of Republicans on most issues, John McCain and Barack Obama are essentially in a dead heat. Even one in four registered Democrats consider Obama a riskier choice compared to McCain, while 14% of Americans, evidence to the contrary notwithstanding, believe he's a practicing Muslim.
In an interesting bit of reportage for Newsweek, Christopher Dickey journeys through Tennessee, Georgia and the Carolinas, where the deepest wounds of the Civil War still exist, tracing his ancestral lineage. Reporting on Southern prejudices, Dickey finds just how strong the "Secret Muslim" suspicion toward Obama is:
Yet even a third cousin of mine in the mountains of North Carolina, an independent-minded Democrat who voted for Gore in 2000 and Bush in 2004, said he can't bring himself to vote for Obama, either. Why? "Because I believe he is a Muslim," said my cousin. Not so, I said. He was raised a Christian and is a practicing Christian. My cousin shook his head. "I just don't believe him," he said.
But here is the irony on its face: He’s Black. His first name rhymes with Iraq. His middle name is Hussein. By changing one letter in his last name you can spell Osama. That is cosmic irony. Obama’s candidacy re-imagines the last eight years as a big, practical joke, and faced with the decision to change course, Americans are being tempted to give in to all of their petty and narrow prejudices. Actual political and policy differences aside, Obama should be up in the polls by at least 15%. Any generic Democrat, say, a John Edwards, would be already drafting his or her inaugural speech. But John Edwards is no longer a generic Democrat, and if the recent revelations of his extramarital affair are any indication, Democrats should be relieved that he isn’t their nominee.
Yet the knock against Barack Hussein Obama is that he looks too presidential; that he only gives great speeches and nothing else; that he draws incredible crowds as a result of his novelty, charisma and charm; that he generates hope and enthusiasm in those without it and fear and anxiety in those who need it. People also say he's too skinny. Rationally, none of this makes any sense. But to the extent that non-whiteness, or non-maleness, represents an electoral anomaly, and that stated and hidden preferences statistically vary, Barack Hussein Obama may lose the election by a large margin.
THE STANDARD for leadership has been lowered drastically when verbal eloquence is looked upon suspiciously; when international adulation is greeted as something sinister; when the ability to inspire hope is laughed at derisively. It's apparent that a large, and growing, minority of Americans don't want Obama as their president. Never missing an opportunity to miss an opportunity, perhaps Americans don't deserve a president that potentially promising. The argument that the presidency is only reserved for old, white males is made even stronger by the candidacy of John McCain, who may become the oldest, whitest president of the United States of America.