things I am appreciating today

From David Weinberger’s “Copyright’s Creative Disincentive”:

It takes culture. It takes culture to build culture.

Whether it’s Walt Disney recycling the Brothers Grimm, Stephen King doing variations on a theme of Bram Stoker, or James Joyce mashing Homer up with, well, everything, there’s no innovation that isn’t a reworking of what’s already there. An innovative work without cultural roots would be literally unintelligible. So, incentives that require overly-strict restrictions on our use of cultural works directly diminish the innovativeness of that culture.

The facts are in front of us, in overwhelming abundance. The signature works of our new age are direct slaps in the face of our old assumptions about incentives. Wikipedia was created by unpaid volunteers, some of whom put in so much time that their marriages suffer. Flickr has more beautiful photos than you could look at in a lifetime. Every sixty seconds, people upload twenty hours (72,000 seconds) of video to YouTube — the equivalent of 86,000 full-length Hollywood movies being released every week. For free. The entire Bible has been translated into LOLcat (“Oh hai. In teh beginnin Ceiling Cat maded teh skiez An da Urfs, but he did not eated dem.”) by anonymous, unpaid contributors, and while that might not be your cup of tea — it is mine — it is without dispute a remarkably creative undertaking.

From Amanda French’s “Imagine a National Digital Library: I Wonder If We Can”:

…the Korean dibrary [digital library] is not just about fancy physical spaces or symbolic cartoon characters: it’s very much about providing a whole set of national library services for Korea. In September 2009, just a few months after the dibrary first opened, Korean law was altered in order to give Korean dibrarians the authority to collect and indeed responsibility for collecting Korean data from the open web. Certain kinds of data were legally required to be deposited in the national digital library so as to enable not only preservation but also “the production and distribution of alternative materials for the disabled.” Now centrally coordinated by the National Digital Library of Korea are all kinds of digital services, from training programs to inter-library loan. The dibrary is even charged with creating a “one card system that gives access to 699 public libraries nationwide,” a system scheduled to go live in 2012. And once Korea has fully nationalized as many library materials and services as it can, it’s apparently not going to stop there: last summer a meeting was held to plan a China-Japan-Korea Digital Library, an Asian digital library or portal modeled after The European Library project. To me it sounds like the second step toward the single digital library filed contentedly away in the humming systems of the starship Enterprise, waiting to be addressed with a question: “Computer . . .”

From Joachim Buwembo’s editorial in The East African, “Uganda’s runaway vote price inflation has economists baffled”:

In 2001, instead of paying heavily for votes, you could reduce the votes of your candidate’s opponent by killing off some of his voters.

The more subtle methods used could include driving an army truck through a crowd of his supporters.

In 2006, things could get a bit more direct and you could fire a sub-machinegun into a crowd of the supporters of your candidate’s rival in broad daylight in the capital city.

But come 2011, things have become more humane and it is market forces that are determining the direction of flow of votes.

Lunch at Berkman: DDoS Attacks Against Independent Media and Human Rights Sites

Liveblogging Hal Roberts, Ethan Zuckerman and Jillian York’s presentation on Distributed Denial of Service Attacks Against Independent Media and Human Rights Sites at the Berkman Center. Please excuse misrepresentation, misinterpretation, typos and general stupidity.

*****

Hal begins by outlining the history of denial of service attacks, which “have been around as long as the Internet.” The rise of botnets allowed for distributed denial of service (DDoS) attacks, in which the attacks are coming from multiple places at the same time. Early botnets were controlled by IRC; these days, many are operated through Twitter accounts.

Ethan points out that we’re seeing a rise in botnets being used to attack each other. One of the largest Internet outages of all time — 9 hours long, in China — was caused by a botnet-fueled “turf war” between two online gaming providers.

(Interesting factoid: early DDoS defense systems grew from the needs of online gambling sites that were being attacked, which operate in a legal gray area and may not want to ask the authorities for help defending against attacks.)

Arbor’s ATLAS, which tracks DDoS attacks worldwide, estimates that 500-1500 attacks happen per day. Hal & Ethan believe that ATLAS “only sees the big ones,” meaning the 500-1500 number is a gross underestimate.

DDoS attacks comprise a wide variety of approaches: slowloris attacks tie up a server by opening many connections and sending requests so slowly that its connection slots are never freed, while random incessant searches force a server to execute database query after database query, using up all available resources. These two examples are application attacks that essentially “crash the box” (exhaust the resources of a single server). Network attacks that involve volunteers, bots, and/or amplifiers instead work by “clogging the pipe,” or choking the flow of traffic, for example by requesting huge amounts of data that flood a server.

People who face DDoS attacks have several options. One is to obtain a better machine with a higher capacity to handle requests. Another is to rent servers online in order to add resources only when they’re needed. Packet filtering can block malicious traffic (assuming it can be identified); scrubbing involves having a data center filter packets for you. Source mitigation and dynamic rerouting are used when the network itself is flooded and packet filtering and scrubbing become impractical. Both tactics involve preventing the flood of traffic from arriving, whether by stopping it in its tracks or by sending it somewhere else.
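To make the packet-filtering idea concrete, here is a minimal sketch (in Python, purely for illustration, with invented thresholds and a made-up address) of the logic a filter applies: track how often each source address has hit the server recently and drop requests from addresses that exceed a threshold. Real scrubbing centers do this at the network layer, on dedicated hardware, and at vastly larger scale.

```python
# A toy per-IP request filter, the simplest form of the "packet filtering"
# idea described above. Thresholds and the sample IP are invented; real
# DDoS filtering happens at the network layer on specialized hardware.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10    # look at the last 10 seconds of traffic per IP
MAX_REQUESTS = 100     # allow at most this many requests per IP in the window

_recent = defaultdict(deque)  # ip -> timestamps of that IP's recent requests

def allow_request(ip, now=None):
    """Return True if the request should be served, False if it should be dropped."""
    now = time.time() if now is None else now
    window = _recent[ip]
    # Forget timestamps that have fallen out of the window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_REQUESTS:
        return False  # this address is flooding us; drop the request
    window.append(now)
    return True

if __name__ == "__main__":
    # A single address hammering the server gets cut off after MAX_REQUESTS.
    dropped = sum(not allow_request("203.0.113.7", now=0.0) for _ in range(150))
    print(f"dropped {dropped} of 150 requests")  # -> dropped 50 of 150 requests
```

The weakness of this approach is exactly what makes distributed attacks effective: when the flood comes from thousands of compromised machines, no single address exceeds the threshold, and the filter has to fall back on harder questions about what the traffic looks like.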

All of these tactics are problematic in some way: they’re expensive (scrubbing can cost $40,000-50,000 per month), they require considerable advance planning or high-level connections, or they’re tricky to execute (the “dark arts” of DDoS defense).

“All of this is background,” Hal says. Their specific research question involves independent media and human rights sites — what kinds of DDoS attacks are used against them, and how often? How can they defend themselves?

Hal describes a “paradox” of DDoS attacks: overall, the defenses are working pretty well. Huge sites — Google, the New York Times, Facebook — are attacked often, but they manage to stay online. This is because these sites are located close to the core of the network, where around 75% of ISPs are able to respond to DDoS attacks in less than an hour, making DDoS attacks a “manageable problem.” The sites at the edge of the network are much more vulnerable, and they’re also much more likely to be attacked.

Ethan describes the case of Viet Tan, which is under DDoS attacks almost constantly — to the extent that when they put up a new web service, it is attacked within hours. As a result, Viet Tan has shifted many of their new campaigns to Blogger (blogspot.com) blogs.

Viet Tan is struggling in particular because they’re not only experiencing DDoS attacks. They also face filtering at the national level, from a government that wants to prevent people in Vietnam from accessing their site. Ethan says that 81% of sites in the study that had experienced a DDoS attack have also experienced intrusion, filtering, or another form of attack. In the case of Viet Tan, the site was unknowingly being attacked by its own target audience, many of whom were using a corrupted Vietnamese keyboard driver that allowed their computers to be used as part of a botnet attack.

One of the big problems for sites that are DDoS-ed is that their ISPs may jettison them in order to protect other sites on the same server. Of the sites in the study that were attacked, 55% were shut down by their ISP, while only 36% were successfully defended by their ISP.

An attack against Irrawaddy, a Burmese activist site hosted in Thailand, essentially knocked all of Thailand offline. In response, Irrawaddy’s ISP asked the site to move elsewhere. This year, Irrawaddy was hit by an even larger attack. They were on a stronger ISP that might have been able to protect them, but they hadn’t paid for the necessary level of protection and were again shut down.

Hal and Ethan suggest that a system of social insurance is emerging online, at least among larger sites — everything is starting to cost a little bit more, with the extra cost subsidizing the sites that are attacked. The problem is that small sites aren’t protected, because they’re not in the core.

Hal and Ethan wonder whether someone should build dedicated human rights hosting to protect these sites from attacks. The problem with this is that it collects all these sites into a single location, meaning any company that hosted a group of these sites would be a major target for DDoS attacks. Devising a fair pricing system in this case is tricky.

Ethan raises the issue of intermediary censorship — the constant threat that your hosting company may shut your site down for any reason (e.g., when Amazon shut down Wikileaks). This is a problem of Internet architecture, he says, and there are two solutions: building an alternative, peer-based architecture, or creating a consumer movement that puts sufficient pressure on hosting companies not to take sites down.

What Hal and Ethan ended up recommending to these sites is to have a back-up plan; to minimize dynamic pages; to have robust mirroring, monitoring and failover; to consider hosting on Blogger or a similar large site; and to avoid using the cheapest hosting provider.
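As an illustration of the “monitoring and failover” recommendation, here is a hedged sketch of a bare-bones monitor: poll the primary site and, after a few consecutive failures, point visitors to a static mirror. The URLs and thresholds are invented; a production setup would update DNS or a load balancer rather than just printing a message.

```python
# A bare-bones monitoring/failover loop in the spirit of the recommendations
# above. The URLs and thresholds are invented for illustration; a real setup
# would update DNS or a load balancer instead of printing a message.
import time
import urllib.error
import urllib.request

PRIMARY = "https://example.org/"        # hypothetical primary site
MIRROR = "https://mirror.example.net/"  # hypothetical static mirror
TIMEOUT = 5          # seconds before a single check counts as a failure
MAX_FAILURES = 3     # consecutive failures before failing over
CHECK_INTERVAL = 60  # seconds between checks

def site_is_up(url):
    """Return True if the URL answers with a non-error status within TIMEOUT."""
    try:
        with urllib.request.urlopen(url, timeout=TIMEOUT) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, OSError):
        return False

def monitor():
    failures = 0
    while True:
        if site_is_up(PRIMARY):
            failures = 0
        else:
            failures += 1
            if failures >= MAX_FAILURES:
                print(f"primary down, send visitors to the mirror: {MIRROR}")
        time.sleep(CHECK_INTERVAL)

if __name__ == "__main__":
    monitor()
```

The point of the sketch is the recommendation itself: the failover path (the mirror) has to exist, and be kept current, before the attack arrives.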

Within some communities, Ethan says, a person or group emerges that is the technical contact. This person or group advocates for sites that are under attack. These “tech leaders” are connected to one another and to companies in the core that want to help. The problem is that this isn’t a particularly scalable model — a better chain needs to be established, so that problems can escalate through a team of local experts up to larger entities. In the meantime, it’s essential to increase organized public pressure on private companies not to act as intermediary censors, but rather to support these sites.

Tools for Transparency: Google Refine

Originally posted as a guest post on the Sunlight Foundation blog.

For the past six months, I’ve served as the co-director of the Technology for Transparency Network, an organization that documents the use of online and mobile technology to promote transparency and accountability around the world. One of the most common challenges the project leaders we’ve interviewed face is making sense of large amounts of data.

In countries where governments keep detailed digital records of lobbying data and education expenditures, data wrangling is a time-consuming, labor-intensive task. In countries where these records are poorly maintained, this task becomes even harder — everything from inconsistent data entry practices to simple typos can derail data analysis.

Google Refine (formerly Freebase Gridworks) is a free, open-source tool for cleaning up, combining, and connecting messy data sets. Rather than acting like a traditional spreadsheet program, Google Refine exists “for applying transformations over many existing cells in bulk, for the purpose of cleaning up the data, extending it with more data from other sources, and getting it to some form that other tools can consume.”

At its most basic level, Google Refine helps users summarize, filter, and edit data sets, letting them see patterns and spot and correct errors quickly. More advanced features include reconciling data sets against the Freebase data repository (i.e., matching text in the set with existing database IDs), geocoding, and fetching additional information from the Web based on existing data.
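To give a sense of what “applying transformations over many existing cells in bulk” looks like in practice, here is a small plain-Python illustration of the kind of first-pass cleanup Refine automates: trimming, collapsing whitespace, and lowercasing values so near-duplicate spellings can be grouped and merged. This is not Refine’s own syntax, and the sample data is invented.

```python
# Not Google Refine itself, just a plain-Python illustration of the kind of
# bulk cleanup it automates: normalizing messy text values so near-duplicate
# spellings can be grouped and merged. The sample data is invented.
import re
from collections import defaultdict

raw_entries = [
    "Ministry of Education ",
    "ministry  of education",
    "MINISTRY OF EDUCATION",
    "Ministry of Health",
    "ministry of health ",
]

def normalize(value):
    """Trim, collapse internal whitespace, and lowercase: typical first-pass cleanup."""
    return re.sub(r"\s+", " ", value).strip().lower()

# Group original values by their cleaned-up form to see which entries
# are really the same thing spelled differently.
clusters = defaultdict(list)
for entry in raw_entries:
    clusters[normalize(entry)].append(entry)

for key, variants in clusters.items():
    print(f"{key!r}: {len(variants)} variant(s) {variants}")
```

In Refine itself, steps like these are a few menu clicks or a one-line expression applied to an entire column, with a clustering feature to catch variants that simple normalization misses.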

Though it runs through an Internet browser, Google Refine operates offline, making it attractive for those with limited bandwidth or privacy concerns — a group that includes many of the projects listed on the Technology for Transparency Network.

Google Refine isn’t going to solve the problem of poor data availability, but for those who manage to gain access to existing records, it can be a powerful tool for transparency.


SIPA Shushing Students over CableGate. Seriously?

Yesterday a friend forwarded me a link to a blog post about Wikileaks. Not surprising, given the number of Wikileaks-related blog posts that are floating around the Internet in the wake of the organization’s release of a quarter of a million U.S. Embassy cables. But this blog post was different: this blog post referenced the Columbia University School of International and Public Affairs (SIPA), from which I graduated six months ago.

The author reposts an e-mail sent from SIPA’s Office of Career Services to all current students. It reads:

From: “Office of Career Services”

Date: November 30, 2010 15:26:53 EST

To:

Hi students,

We received a call today from a SIPA alumnus who is working at the State Department. He asked us to pass along the following information to anyone who will be applying for jobs in the federal government, since all would require a background investigation and in some instances a security clearance.

The documents released during the past few months through Wikileaks are still considered classified documents. He recommends that you DO NOT post links to these documents nor make comments on social media sites such as Facebook or through Twitter. Engaging in these activities would call into question your ability to deal with confidential information, which is part of most positions with the federal government.

Regards,
Office of Career Services

I’m currently happily employed at the Berkman Center for Internet & Society, but while I was at SIPA I seriously considered a career in the Foreign Service. I applied for (and was offered) a summer internship at the State Department, and I coordinated a conference on Policy Making in the Digital Age, at which the State Department’s Director of the Office of eDiplomacy and a representative of the Office of Innovative Engagement spoke.

I guess I can kiss that possible alternate career path goodbye, given that I tweeted a link yesterday to an article about CableGate. Seriously, State Department? This is all over the news. What’s more, it’s become a focal point for discussions on how digital technology is changing our expectations for government transparency (for those who’ve forgotten: the State Department is big on using tech to promote transparency in other countries. Just not here in the US?).

Seriously, SIPA? As fellow SIPA alum Ben Colmery pointed out in a comment on my Facebook wall, since when does having an opinion about a site leaking documents equate to actually leaking documents oneself? You claim to provide committed students with the necessary skills and perspectives to become responsible leaders. Apparently that means curtailing their academic freedom and teaching them how to bury their heads in the sand.

Crossposted on The Morningside Post

Update, December 6: The State Department is denying that it provided “advice to anyone beyond the State Department” regarding Wikileaks and claiming the information in the OCS email “does not represent a formal policy position.”

Tech for Transparency: New Interviews Posted

Avid readers of my blog (here’s looking at you, Rev) may remember that several months ago I announced that research was beginning for the second phase of the Technology for Transparency Network. The first phase consisted of interviews with over 30 projects around the world that are using technology to promote transparency and accountability in the government and/or private sector. Our goal in the second phase was twofold: to double the number of case studies on the site and to expand the geographic regions we covered.

Since then, I’ve been largely silent about the project — we’ve been working so hard to complete and edit the interviews that I haven’t had much time to breathe. But today I’m thrilled to announce that we have eight new case studies online, with lots more to come over the next few weeks. The case studies that have been posted so far are:

Accountability Initiative
Accountability Initiative researches and creates innovative tools to promote transparency and accountability in India’s public services.

Amatora mu Mahoro
Amatora mu Mahoro (“Peaceful Elections”) is an Ushahidi-based project created to monitor Burundi’s 2010 elections.

Association for Democratic Reforms
ADR India works to monitor national elections through country-wide SMS and helpline campaigns and an informational website.

Democrator.ru
Democrator.ru seeks to empower citizens by helping them collectively send petitions and inquiries to government bodies.

Excelências
Excelências fights corruption in the Brazilian government by publishing data about politicians and government activities online.

Golos
Golos (Voice) has introduced several online tools for better election monitoring in Russia.

Mam Prawo Wiedzieć
Mam Prawo Wiedzieć helps Polish citizens access information about their elected representatives in an easy, user-friendly way.

Pera Natin ‘to!
Pera Natin ‘to! (It’s Our Money!) encourages Filipino citizens to report times when they are asked for bribes.

In addition to continuing to post new case studies (you can subscribe to our case study feed via RSS), we’ll also be publishing our final report on both phases of the project by the end of the month. In the meantime, check out @techtransparent and our Facebook page for daily updates and our podcast for interviews with the project leaders!