This project will be a computational analysis, done primarily in R, of political discourse on Twitter during the US Presidential Elections. A combination of network analysis and discourse analysis, using Bayesian methodologies and sentiment analysis, is currently being explored. See the exploratory code on GitHub here.
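To give a flavour of the kind of exploratory analysis involved, here is a minimal, hypothetical sketch of lexicon-based sentiment scoring over tweets. It is written in Python rather than the project's R, and the word lists and sample tweets are placeholders I invented for illustration, not part of the actual project code.

```python
# Toy lexicon-based sentiment scorer for tweets.
# The lexicons and sample tweets below are illustrative placeholders.

POSITIVE = {"great", "hope", "win", "support", "strong"}
NEGATIVE = {"bad", "fail", "lies", "weak", "corrupt"}

def sentiment_score(tweet: str) -> int:
    """Return (# positive words) - (# negative words) in the tweet."""
    words = tweet.lower().split()
    return (sum(w in POSITIVE for w in words)
            - sum(w in NEGATIVE for w in words))

tweets = [
    "Great debate tonight and a strong showing",
    "Another round of lies from a weak campaign",
]
for t in tweets:
    print(sentiment_score(t), t)
```

Real sentiment analysis would of course use a validated lexicon or a trained classifier, and would handle punctuation, negation and sarcasm; this sketch only shows the basic counting idea.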
I was fortunate enough to write a guest blog on my research at the University of Toronto for Samara Canada, an organization dedicated to enhancing democratic participation in Canada.
Over the last few decades, polls have shown citizens’ decreasing levels of trust and engagement in the Canadian political process. This trend comes at a time when the improvement of online technologies such as forums, social media and chats is allowing a new kind of dialogue between political parties and citizens. Is the online public sphere a viable avenue for reviving citizen engagement and influence in Canadian politics?
Read the full article here.
A revised version of this article will appear in the United Nations’ Internal Voices.
Over the past few years, there has been a steady increase in the number of government open data websites. These sites not only publish large volumes of government data and make it accessible for public download, but they also facilitate access through interactive search and browsing capabilities, simple navigation systems and outreach strategies. While some have expressed optimism as to the potential of government open data, others have been more hesitant, calling for the development of appropriate evaluation tools.
The United States, led by former Chief Information Officer Vivek Kundra, has been a leader in the open data movement, particularly through its index for government data stored on hundreds of other US government websites – Data.gov.
The initiative followed United States President Barack Obama’s publicly available Memorandum on Transparency and Open Government, in which he outlined the three objectives of his administration’s Open Government Directive – transparency, participation and collaboration. A frontrunner in technologies facilitating open government, Data.gov allows citizens to download the data and create novel data mashups. The website has the potential not only to facilitate government-to-government and government-to-citizen communications, but also to foster citizen-led technological innovation.
The United Nations is also a leader in open data practices. The United Nations Development Programme, for example, has for many years provided data on the Human Development Index. Various UN agencies, such as the World Health Organization and the International Telecommunication Union, provide country-level and aggregate data on the themes that they are tracking, such as child mortality and Internet penetration rates.
There have also been many efforts to provide data on progress toward, and high-level commitments to, the Millennium Development Goals. But why are these initiatives important?
Secretary-General Ban Ki-moon recently talked about the importance of holding UN member countries accountable for their commitments to the Millennium Development Goals, saying: “We cannot afford to leave the poor even further behind.” His comments are reminiscent of those of United Nations Declaration of Human Rights principal drafter John Humphrey, who wrote in 1974: “It seems to me that in so far as human rights is concerned, a solution has been found in what may be called the organization of shame. Most governments, including international governments, are sensitive to world public opinion.”
At stake, therefore, might be important UN objectives: peace-building, social progress and the enhancement of living standards and human rights worldwide.
Three individuals who have greatly contributed to the open government policy debates in the United States are Ellen Miller of the Sunlight Foundation, Beth Noveck and Vivek Kundra. In its Sunlight Agenda 2010, the Sunlight Foundation notes that transparency fosters civic participation and that a key feature of this transparency is access to and accessibility of information. Miller states that: “core to the President’s campaign for government transparency is the use of technology in ways that redefine what ‘public information’ means – that is online information, information that is as easily searchable as it is easily accessible”. She adds that a key feature of the Data.gov website is the capacity it gives citizens to create aggregate data that affects their daily lives and increases their civic participation.
In a similar vein, Noveck has written extensively on the benefits of open data technology for democracy. Her main claim is that online technologies have opened the way to collaborative governance and innovation, a model in which citizens participate through distributed, open-source channels.
Finally, Kundra is widely considered to be the initiator of the American open data system at a national level. In public appearances and government reports, he makes it clear that his priority is “to create a runway, a platform for innovation”. Like Noveck, he values giving citizens the tools for innovation in technology and participation in policy. In doing so, he places emphasis on the quality of the platform and the data it hosts.
However, the impact of open data initiatives still remains to be determined. How does one evaluate the success of open data websites in reaching national and international objectives?
The academic approach often leads to ethnographic case studies that are time- and resource-intensive and require a high level of access to the development team and the website users. Practitioners, on the other hand, tend to use website analytics, which are useful for tracking visitors and producing reports but cannot account for the broader social objectives of the technologies. This divide between academics and practitioners points to the need for greater collaboration between them, and for a middle ground between qualitative and quantitative evaluation methods.
The United Nations, with its global reach and access to extensive amounts of qualitative and quantitative data, can take a lead in evaluating the effectiveness of open data website development. It can determine new guidelines for the type of data that should be published, the timing for its publication and the appropriateness of different user interfaces. An in-depth examination of the impact of open data websites on international objectives has the potential to provide insights that could have a lasting effect on accountability, efficiency and public engagement worldwide.
The following paper was presented at the Oxford Internet Institute’s Symposium on the Dynamics of the Internet and Society in September 2011.
Over the past few years, the steady increase in the number of government open data websites has led to a call for appropriate evaluation tools. While some (Noveck, 2009) have expressed optimism as to the potential of government open data, others (Coglianese, 2009; Hindman, 2009) have been more hesitant. This paper therefore aims to answer the following question: how does one evaluate the success of open data websites in reaching democratic objectives? In doing so, it explores past academic studies and examines the researcher’s experience with interpretive inquiry. Using Data.gov as an example, it argues that survey-based research, a common tool in information systems analysis, may not be suited to open data websites. Instead, it suggests a content analysis methodology, in the hope of informing future research on the subject.
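To make the proposed direction concrete, here is a minimal, hypothetical sketch of dictionary-based content analysis in Python. The coding scheme, its indicator terms and the sample text are all invented for illustration; they are not drawn from the paper itself, which develops its own methodology.

```python
from collections import Counter
import re

# Hypothetical coding scheme mapping democratic objectives to indicator terms.
# Both the categories and the sample page text are illustrative only.
CODING_SCHEME = {
    "transparency": {"transparent", "transparency", "disclosure", "open"},
    "participation": {"participate", "participation", "feedback", "comment"},
    "collaboration": {"collaborate", "collaboration", "partnership", "mashup"},
}

def code_text(text: str) -> Counter:
    """Count occurrences of each category's indicator terms in the text."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter()
    for category, terms in CODING_SCHEME.items():
        counts[category] = sum(tok in terms for tok in tokens)
    return counts

page = ("Data.gov promotes open, transparent government and invites "
        "citizens to comment and collaborate.")
print(code_text(page))
```

A real content analysis would validate the coding scheme against human coders and report inter-rater reliability; the sketch only shows the mechanical counting step.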
Drupal was created in 2000 by the then-University of Antwerp student Dries Buytaert. Over the last 11 years, it has grown into a huge community of open-source web developers, providing a stable, flexible content management system for a multitude of websites developed by the White House, the World Food Programme, Pearl Jam, and many more.
There have, however, often been debates about the usability of its back-end compared to blogging CMSs such as WordPress, particularly for the uninitiated user. Since most groups and organizations need a website but few have the resources to hire an expert developer, there are many uninitiated users in the workplace.
Having personally migrated two research sites from Drupal to WordPress in the last year, I am currently working on a Drupal project and am enjoying the enormous breadth and flexibility that it offers.
I will echo many developers in saying this: whether Drupal or WordPress is more appropriate depends entirely on the project requirements and available resources. Drupal’s modular build offers the developer nearly unlimited options and opportunities: social media integration, multilingual support, custom themes and subthemes, user registration, permission sets, forums… the list goes on. However, a Drupal website needs someone to manage it. I’ve seen many untended Drupal websites fall into disrepair, and it’s not a pretty sight.
In the past, I recommended a Drupal to WordPress migration because WordPress is much simpler and more lightweight. For university research project websites, managed by a handful of students each year who have at most 10 hours per week to dedicate to the project, Drupal is just too complicated. With research assistants rotating quickly, knowledge is lost, security updates are missed, the public forum gets spammed, and suddenly the site is obsolete and no longer supported by the university. With a simple framework like WordPress, this is less likely to happen.
On the other hand, any company or organization with the resources to invest in a more complex site should consider Drupal. In a team meeting, a Drupal developer has the luxury of almost always being able to say “yes” when asked: “is this possible?”. After only a few days of working with Drupal at scale, I am tempted to buy a t-shirt and spend my weekends buried in manuals. Or perhaps watching the excellent tutorials created by what seems to be a very young Drupal whiz with a knack for simplifying complicated concepts: tomrogers123.
My recommendation? If you have the resources, don’t drop Drupal.
This post is inspired by a professor at the University of Toronto who pointed me to the 19th century saying that there are three kinds of lies: lies, damned lies and statistics. While it’s true that statistics can be used for deceptive purposes, so can data visualizations.
In a July 6, 2011 article for the Torontoist entitled Our Toronto’s Graphics Skew City Budget Information, journalist Stephen Michalowicz argues that poorly designed pie charts and bar graphs make it difficult for citizens to understand the 2011 municipal budget. He writes: “[…] all of the charts and graphs depicted in the newsletter are technically correct – they just don’t provide a full, balanced picture of the City’s finances.”
Having been consulted for the article, I found that examining the City’s charts was a fascinating, if unusual, way to spend a Friday night. The charts were confusing at best, and, as Michalowicz points out, misleading.
In today’s multimedia information environment, there is more to communicating statistics than showing mere numbers, or even ratios and percentages. Artist Chris Jordan, one of many artists and graphic designers who use data visualization for storytelling, has created several fascinating art pieces, one of which depicts 48,000 plastic spoons, or the number of gallons of oil consumed around the world every second.
If bad data visualizations can obscure data and mislead the public, good ones make abstract numbers much more tangible and concrete. That’s the point of graphs, charts, and what are often called ‘visual aids’. When looking at the innumerable examples of data visualization around me, I wonder: do they reveal or obscure data? What am I not seeing in this image, and why am I not seeing it?
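A small worked example shows how one classic trick, the truncated axis, can mislead even when the chart is “technically correct”. The numbers below are invented for illustration, but the arithmetic is general.

```python
def apparent_ratio(a: float, b: float, baseline: float) -> float:
    """Ratio of two bar heights when the axis starts at `baseline` instead of 0."""
    return (a - baseline) / (b - baseline)

# Two budget figures that differ by only 4%...
a, b = 104.0, 100.0
print(round(apparent_ratio(a, b, 0.0), 2))   # -> 1.04 (honest axis: bars look nearly equal)
print(round(apparent_ratio(a, b, 95.0), 2))  # -> 1.8  (axis starting at 95: one bar looks 80% taller)
```

The data haven’t changed; only the baseline has, yet the visual impression of the difference is roughly twenty times larger.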
I worked on the Liberal Campaign during the May 2, 2011 General Elections. As a member of the digital media team, I learned a lot about using social media to enhance public engagement in political processes. Tools such as Twitter and Facebook allow campaigning politicians to converse directly with voters across the country. While I appreciate the value of town halls and face-to-face interactions, in a country as big as Canada, it can be difficult for federal candidates to reach out to voters, and social media allows for a personal, informal touch that doesn’t come through in broadcasting and print media.
Here are a few of my favourite digital media moments of the Liberal Party of Canada’s Campaign.
- The Liberal Platform Launch: using an online town hall format, we launched the platform live to an in-person and online audience. During the hour-and-a-half launch, several thousand citizens logged into the Cover It Live chat and asked questions, which we did our best to field and answer. Several questions were answered directly by Michael Ignatieff, just as if they had been asked by members of the live audience.
- The Rise-Up Video: this speech was picked up by both traditional and social media outlets as soon as it was given, at a Town Hall in Sudbury. My role was to add the French captions, as francophones quickly asked for a subtitled version of the speech. As of today, 118,000 people have viewed the video on YouTube.
- The Montreal Facebook Town Hall: we decided to stream this event on Facebook rather than on Cover It Live in order to make it easier for youth to participate. The live event took place at Presse Café on Parc, in Montreal, with Michael Ignatieff, Justin Trudeau, and other local candidates. While the event was being streamed live, participants could comment on the Facebook page and ask questions. I moderated the questions and picked out about half a dozen of them (and not just the easy ones!), which I read to Mr. Ignatieff. Since his answers were broadcast live on Facebook, there was no need for me to type them in, but I frantically wrote short translations, as the questions came in both French and English. I thought the combination of live event and Facebook stream was really exciting.
As part of my work as a research assistant for Dr. Andrew Clement at the University of Toronto’s Faculty of Information, I have been facilitating participatory design workshops. Here is our first research video, made of our first “low-fidelity” workshop held at the Knowledge Media Design Institute in December 2010.
The Future Workshop was developed for the participatory design of information systems by Jungk and Müllert in 1987 (McPhail et al., 1998). It consists of three phases – critique, fantasy and implementation – which enable potential users of an information system to expose flaws in their current system, articulate their ideal system or features, and decide what they would like to see prototyped.
On February 19, 2011, Alexandra Hall and I went to Peterborough, ON to lead a Future Workshop with staff, volunteers and other stakeholders at Kawartha Heritage Conservancy, a land, biodiversity and cultural heritage conservation organization. Our objective was to engage in a participatory design session which would shed light on our development of an Intranet to support the organization’s knowledge management strategy.
Although we intended to begin the workshop with an icebreaker, our time was limited, so we moved straight into the critique phase, during which participants pointed to weaknesses in their current file organization and sharing system. For the fantasy phase, we were inspired by a session organized by Terry Costantino at Usability Matters earlier this month. The participants imagined their ‘dream’ information system and wrote down each idea on a post-it, which they then put on the wall. Afterwards, we had intended to use a “dot-mocracy” exercise for the implementation phase, where participants would each vote for their favourite post-its using 5-10 dot stickers. However, again in the interest of time, we ended up holding an open discussion of the proposed features instead. All in all, it was a very informative session and I look forward to showing them the first prototype.