At Government 2.0 Camp, participants discussed what would constitute a successful transparency push by the Obama Administration.
They came up with an interesting list: open data; a central repository of Freedom of Information Act requests and the responses to them; open meetings; open government research; and searchable, crawlable, accessible data.
This desire for raw data, data that can be easily read and manipulated by computers, is a natural outcome of the growth of internet technologies and mashup culture.
When the government produces data that isn't easily readable by computers, it takes individuals a lot of legwork to read and understand it. But given the government's size, no human can ever fully grasp the reams of data it produces annually. That is why we must insist on getting government data in formats readable by computers.
While an investigative journalist poring over piles of inscrutable data might find some correlation between campaign contributions and vote positions, doing so could take months of tedious research. Add a computer to the equation, and someone can program it to cross-reference every campaign contribution with every vote tally and surface voting patterns in hours instead of months.
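To give a feel for how little code that kind of cross-referencing takes once the data is machine-readable, here is a minimal sketch. The legislator names, industry labels, bill name, and column layout are all hypothetical stand-ins; real inputs would come from sources like FEC filings and roll-call records published as structured data.

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample data standing in for FEC contribution records
# and congressional roll-call votes in machine-readable CSV form.
contributions_csv = """legislator,industry,amount
Smith,telecom,5000
Smith,telecom,2500
Jones,telecom,500
"""

votes_csv = """legislator,bill,vote
Smith,TelecomDeregAct,yea
Jones,TelecomDeregAct,nay
"""

def load_rows(text):
    return list(csv.DictReader(io.StringIO(text)))

# Total telecom-industry contributions per legislator.
totals = defaultdict(int)
for row in load_rows(contributions_csv):
    if row["industry"] == "telecom":
        totals[row["legislator"]] += int(row["amount"])

# Pair each legislator's vote on the bill with their contribution total.
pattern = {
    row["legislator"]: (totals[row["legislator"]], row["vote"])
    for row in load_rows(votes_csv)
    if row["bill"] == "TelecomDeregAct"
}

print(pattern)  # {'Smith': (7500, 'yea'), 'Jones': (500, 'nay')}
```

With real datasets the joins and totals get bigger, but the shape of the program stays this simple; the hard part today is getting the data into a parseable format at all, which is exactly the point.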
Consider watchdog.net. Taking Ron Paul's page as an example, you can see data aggregated from Project Vote Smart, Wikipedia, the Congressional Biographical Directory, Google Maps, the FEC, VoteView, GovTrack.us, and OpenSecrets.
With raw data in hand, there is no limit to what creative-minded developers/designers/activists can accomplish.