Lunch at Berkman: DDoS Attacks Against Independent Media and Human Rights Sites

Liveblogging Hal Roberts, Ethan Zuckerman and Jillian York’s presentation on Distributed Denial of Service Attacks Against Independent Media and Human Rights Sites at the Berkman Center. Please excuse misrepresentation, misinterpretation, typos and general stupidity.

*****

Hal begins by outlining the history of denial of service attacks, which “have been around as long as the Internet.” The rise of botnets allowed for distributed denial of service (DDoS) attacks, in which the traffic comes from many machines at the same time. Early botnets were controlled via IRC; these days, many are operated through Twitter accounts.

Ethan points out that we’re seeing a rise in botnets being used to attack each other. One of the largest Internet outages of all time — 9 hours long, in China — was caused by a botnet-fueled “turf war” between two online gaming providers.

(Interesting factoid: early DDoS defense systems grew from the needs of online gambling sites that were being attacked, who operate in a gray area and may not want to ask authorities for help defending against attacks.)

Arbor’s ATLAS, which tracks DDoS attacks worldwide, estimates that 500-1500 attacks happen per day. Hal & Ethan believe that ATLAS “only sees the big ones,” meaning the 500-1500 number is a gross underestimate.

DDoS attacks comprise a wide variety of approaches: slowloris attacks tie up a server by opening many connections and sending requests so slowly that connection slots never free up, while random incessant searches force a server to execute database queries over and over, using up all available resources. These two examples are application attacks that essentially “crash the box” (affect a single server). Network attacks that involve volunteers, bots, and/or amplifiers work instead by “clogging the pipe,” or saturating the flow of traffic, for example by directing huge amounts of data at a server.

People who face DDoS attacks have several options. One is to obtain a better machine with a higher capacity to handle requests. Another is to rent servers online so that resources can be added only when they’re needed. Packet filtering can block malicious traffic (assuming it can be identified); scrubbing involves having a data center filter packets for you. When the network itself is flooded, packet filtering and scrubbing become impractical, and source mitigation and dynamic rerouting are used instead. Both tactics involve preventing the flood of traffic from arriving, whether by stopping it at its source or by sending it somewhere else.
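To make the packet-filtering option concrete, here is a minimal sketch in Python of per-source rate limiting, the simplest form of the idea. The window size, threshold, and IP address are illustrative assumptions; real defenses run on dedicated network gear or at an upstream scrubbing center rather than in application code like this.

    import time
    from collections import defaultdict, deque

    # Hypothetical per-IP rate filter, illustrating the coarse request
    # filtering described above; all numbers are assumed for the example.
    WINDOW_SECONDS = 10   # assumed sliding window
    MAX_REQUESTS = 100    # assumed per-IP threshold within the window

    recent = defaultdict(deque)  # source IP -> timestamps of recent requests

    def allow_request(source_ip, now=None):
        """Return True if a request should be served, False if it looks
        like part of a flood and should be dropped."""
        now = time.time() if now is None else now
        window = recent[source_ip]
        # Discard timestamps that have fallen out of the window.
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) >= MAX_REQUESTS:
            return False  # over the threshold: drop the request
        window.append(now)
        return True

    if __name__ == "__main__":
        # A single source hammering the server is cut off after MAX_REQUESTS.
        decisions = [allow_request("203.0.113.5", now=0.0) for _ in range(105)]
        print(decisions.count(True), "served,", decisions.count(False), "dropped")

A scrubbing service applies the same basic idea, at far larger scale and with smarter heuristics, before the traffic ever reaches the target’s pipe.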

All of these tactics are problematic in some way: they’re expensive (scrubbing can cost $40,000-50,000 per month), they require considerable advance planning or high-level connections, or they’re tricky to execute (the “dark arts” of DDoS defense).

“All of this is background,” Hal says. Their specific research question involves independent media and human rights sites — what kinds of DDoS attacks are used against them, and how often? How can they defend themselves?

Hal describes a “paradox” of DDoS attacks: overall, the defenses are working pretty well. Huge sites — Google, the New York Times, Facebook — are attacked often, but they manage to stay online. This is because these sites are located close to the core of the network, where around 75% of ISPs are able to respond to DDoS attacks in less than an hour, making DDoS attacks a “manageable problem.” The sites at the edge of the network are much more vulnerable, and they’re also much more likely to be attacked.

Ethan describes the case of Viet Tan, which is under DDoS attacks almost constantly — to the extent that when they put up a new web service, it is attacked within hours. As a result, Viet Tan has shifted many of their new campaigns to Blogger (blogspot.com) blogs.

Viet Tan is struggling in particular because they’re not only experiencing DDoS attacks. They also face filtering at the national level, from a government that wants to prevent people in Vietnam from accessing their site. Ethan says that 81% of sites in the study that had experienced a DDoS attack have also experienced intrusion, filtering, or another form of attack. In the case of Viet Tan, the site was being unwittingly attacked by its own target audience, many of whom were using a corrupted Vietnamese keyboard driver that allowed their computers to be used as part of a botnet attack.

One of the big problems for sites that are DDoS-ed is that their ISPs may jettison them in order to protect other sites on the same server. Of the attacked sites in the study, 55% were shut down by their ISP, while only 36% were successfully defended by their ISP.

An attack against Irrawaddy, a Burmese activist site hosted in Thailand, essentially caused all of Thailand to go offline. In response, Irrawaddy’s ISP asked it to move elsewhere. This year, they were hit by a larger attack. They were on a stronger ISP that may have been able to protect them, but they hadn’t paid for the necessary level of protection and were again shut down.

Hal and Ethan suggest that a system of social insurance is emerging online, at least for larger sites — everything is starting to cost a little bit more, with the extra cost subsidizing the sites that are attacked. The problem with this is that small Internet sites aren’t protected because they’re not in the core.

Hal and Ethan wonder whether someone should build dedicated human rights hosting to protect these sites from attacks. The problem with this is that it collects all these sites into a single location, meaning any company that hosted a group of these sites would be a major target for DDoS attacks. Devising a fair pricing system in this case is tricky.

Ethan raises the issue of intermediary censorship — the constant threat that your hosting company may shut your site down for any reason (e.g., when Amazon shut down Wikileaks). This is a problem of Internet architecture, he says, and there are two solutions: building an alternative, peer-based architecture, or creating a consumer movement that puts sufficient pressure on hosting companies not to take sites down.

In the end, Hal and Ethan recommend that these sites have a back-up plan; minimize dynamic pages; set up robust mirroring, monitoring, and failover; consider hosting on Blogger or a similar large platform; and avoid the cheapest hosting providers.
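As a rough illustration of the mirroring-and-failover advice, here is a minimal monitoring sketch in Python. The URLs are placeholders, and a real setup would update DNS or a load balancer rather than just print a result.

    import urllib.error
    import urllib.request

    # Placeholder addresses for a primary site and its mirrors.
    PRIMARY = "https://example.org/"
    MIRRORS = ["https://mirror1.example.net/", "https://example.blogspot.com/"]

    def is_up(url, timeout=5):
        """Return True if the URL answers with a successful HTTP status."""
        try:
            with urllib.request.urlopen(url, timeout=timeout):
                return True
        except (urllib.error.URLError, OSError):
            return False

    def choose_active_site():
        """Prefer the primary site; fall back to the first reachable mirror."""
        if is_up(PRIMARY):
            return PRIMARY
        for mirror in MIRRORS:
            if is_up(mirror):
                return mirror
        return None  # everything is down: time for the back-up plan

    if __name__ == "__main__":
        active = choose_active_site()
        print("Serve traffic from:", active or "no site reachable")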

Within some communities, Ethan says, a person or group emerges as the technical contact. This person or group advocates for sites that are under attack. These “tech leaders” are connected to one another and to companies in the core that want to help. The problem is that this isn’t a particularly scalable model — a better chain needs to be established, so that problems can escalate through a team of local experts up to larger entities. In the meantime, it’s essential to increase organized public pressure on private companies not to act as intermediary censors, but rather to support these sites.

Tools for Transparency: Google Refine

Originally posted as a guest post on the Sunlight Foundation blog.

For the past six months, I’ve served as the co-director of the Technology for Transparency Network, an organization that documents the use of online and mobile technology to promote transparency and accountability around the world. One of the most common challenges the project leaders we’ve interviewed face is making sense of large amounts of data.

In countries where governments keep detailed digital records of lobbying data and education expenditures, data wrangling is a time-consuming, labor-intensive task. In countries where these records are poorly maintained, this task becomes even harder — everything from inconsistent data entry practices to simple typos can derail data analysis.

Google Refine (formerly Freebase Gridworks) is a free, open-source tool for cleaning up, combining, and connecting messy data sets. Rather than acting like a traditional spreadsheet program, Google Refine exists “for applying transformations over many existing cells in bulk, for the purpose of cleaning up the data, extending it with more data from other sources, and getting it to some form that other tools can consume.”

At its most basic level, Google Refine helps users quickly summarize, filter and edit data sets by allowing them to view patterns and to spot and correct errors quickly. More advanced features include reconciling data sets (i.e., matching text in the set with existing database IDs) with data repository Freebase, geocoding, and fetching additional information from the Web based on existing data.
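Google Refine does all of this through a point-and-click interface, but a small Python sketch may help show the kind of bulk transformation it automates. This is not Refine’s own API, and the file name and column name below are hypothetical.

    import csv
    from collections import Counter

    def normalize(value):
        """Trim whitespace, collapse internal spaces, and standardize case:
        the sort of cell-by-cell cleanup Refine applies in bulk."""
        return " ".join(value.split()).title()

    def clean_column(rows, column):
        """Apply the same transformation to every cell in a column, then
        report how many rows share each cleaned value."""
        for row in rows:
            row[column] = normalize(row[column])
        return Counter(row[column] for row in rows)

    if __name__ == "__main__":
        # Hypothetical spreadsheet of expenditures with a messy "agency" column.
        with open("expenditures.csv", newline="", encoding="utf-8") as f:
            rows = list(csv.DictReader(f))
        for value, count in clean_column(rows, "agency").most_common():
            print(f"{count:5d}  {value}")

Refine goes much further than this, with faceting, clustering of near-duplicate values, and an undo history, but the underlying idea is the same: one rule applied consistently across thousands of cells at once.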

Though it runs through an Internet browser, Google Refine operates offline, making it attractive for those with limited bandwidth or privacy concerns — a group that includes many of the projects listed on the Technology for Transparency Network.

Google Refine isn’t going to solve the problem of poor data availability, but for those who manage to gain access to existing records, it can be a powerful tool for transparency.


Tech for Transparency: New Interviews Posted

Avid readers of my blog (here’s looking at you, Rev) may remember that several months ago I announced that research was beginning for the second phase of the Technology for Transparency Network. The first phase consisted of interviews with over 30 projects around the world that are using technology to promote transparency and accountability in government and/or the private sector. Our goal in the second phase was twofold: to double the number of case studies on the site and to expand the geographic regions we covered.

Since then, I’ve been largely silent about the project — we’ve been working so hard to complete and edit the interviews that I haven’t had much time to breathe. But today I’m thrilled to announce that we have eight new case studies online, with lots more to come over the next few weeks. The case studies that have been posted so far are:

Accountability Initiative
Accountability Initiative researches and creates innovative tools to promote transparency and accountability in India’s public services.

Amatora mu Mahoro
Amatora mu Mahoro (“Peaceful Elections”) is an Ushahidi-based project created to monitor Burundi’s 2010 elections.

Association for Democratic Reforms
ADR India works to monitor national elections through country-wide SMS and helpline campaigns and an informational website.

Democrator.ru
Democrator.ru seeks to empower citizens by helping them collectively send petitions and inquiries to government bodies.

Excelências
Excelências fights corruption in the Brazilian government by publishing data about politicians and government activities online.

Golos
Golos (Voice) has introduced several online tools for better election monitoring in Russia.

Mam Prawo Wiedzieć
Mam Prawo Wiedzieć helps Polish citizens access information about their elected representatives in an easy, user-friendly way.

Pera Natin ‘to!
Pera Natin ‘to! (It’s Our Money!) encourages Filipino citizens to report times when they are asked for bribes.

In addition to continuing to post new case studies (you can subscribe to our case study feed via RSS), we’ll also be publishing our final report on both phases of the project by the end of the month. In the meantime, check out @techtransparent and our Facebook page for daily updates and our podcast for interviews with the project leaders!

Tech for Transparency, v2

Today we officially launched the second phase of the Technology for Transparency Network, a Rising Voices project that documents and maps projects around the world that use online technology to promote transparency and accountability.

Technology for Transparency Network

During the first phase, which ran from January to May of this year, we mapped 37 case studies from Central & Eastern Europe, China, Latin America, South Asia, Southeast Asia and anglophone Sub-Saharan Africa. Between now and September, we’ll be nearly doubling that number and expanding our focus to include projects from the Middle East & North Africa, the former Soviet Union and francophone Africa.

Researchers from the Technology for Transparency Network present at the 2010 Global Voices Summit in Santiago, Chile. Photo courtesy of FabsY_ on Flickr.

I am psyched to be co-heading the project along with the formidable and talented Renata Avila. We’re thrilled to be working with an amazing team of researchers and advisors, including our new editorial advisor Hazel Feigenblatt. Hazel is the Media Projects Director at Global Integrity and will be working with us to make sure we interview the most innovative and exciting projects in this space.

If you have an idea for a case study, let us know! We’re currently taking suggestions in English, Spanish and Portuguese. You can also subscribe to our RSS feed to get updates when we publish new case studies, follow us on Twitter (@techtransparent) and become a fan on Facebook.

Mobile Money: A Recap

I’m spending today at the Macroeconomics of Mobile Money conference at the Columbia Institute for Tele-Information (CITI).

Liveblogging. Please excuse misrepresentation, misinterpretation, typos and general stupidity.

James Alleman is giving closing comments. His takeaways:

There is a large class of people underserved by formal banking, and a lot of people in the developing world are ready to use mobile banking services.

The success of existing services like M-Pesa and Menekse Gencer is impressive and lays a good foundation for future efforts. It appears as though mobile banking efforts will need a formal banking partner to be truly successful.

We still don’t have a good idea of what kinds of regulatory systems are going to be required.

Anonymity is a major question: balancing privacy with criminal threats.

Right now, security is an afterthought. This is not good.

There are too many “standards” right now.

User interface is key.

High demand for VoIP; the people who are demanding this (migrant workers, those who want to send remittances back to their families) are also good candidates for mobile banking services.

Walled gardens are a huge problem: everyone wants a piece of the action, but no one wants to cooperate.

Overall: this is a huge, growing, untapped market with many issues left to be resolved.