Tumblelog by Soup.io

April 15 2013

11:00

San Francisco, a City That Knows Its Faults

Low vacancy, so many homeless people, beautiful old buildings, shuttle buses to Silicon Valley ... and warning, I'm going to talk about earthquakes. If it gets scary, stick with me: There's good news at the end, ways to better understand the specific risks facing San Francisco, and some easy places to start.

Let's Talk Numbers

After the 1989 Loma Prieta earthquake, 11,500 Bay Area housing units were uninhabitable. If there were an earthquake today, the current estimate (from Spur) is that 25% of SF's population would be displaced for anywhere from a few days to a few years. However, San Francisco's maximum shelter capacity can serve only roughly 7.5% of the overall population, and that is only for short-term stays in places like the Moscone Center. So where would the remaining 17.5% of the population go?

  1. Some people may decide to leave the city and start over somewhere else (something called "outmigration," which is not ideal for the economic health of a city).
  2. Some people may take longer-term housing in vacant units around the city. But this is particularly tough in San Francisco because vacancy is currently at an all-time low of about 4.2%.
  3. This brings us to the ideal scenario: staying put -- something referred to in the emergency management world as "shelter-in-place."
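
As a quick gut check on those numbers, here is a back-of-the-envelope sketch in Python. Only the percentages come from Spur's estimate; the total population figure is an assumption added for illustration.

```python
# Rough gut check of the Spur figures above. The city population is an
# assumed round number for illustration, not a figure from this post.
sf_population = 800_000      # assumption
displaced_share = 0.25       # Spur estimate: ~25% displaced after a major quake
shelter_share = 0.075        # ~7.5% short-term shelter capacity

displaced = sf_population * displaced_share
sheltered = sf_population * shelter_share
unsheltered = displaced - sheltered

print(f"Displaced residents:         {displaced:>9,.0f}")
print(f"Short-term shelter capacity: {sheltered:>9,.0f}")
print(f"Still needing housing:       {unsheltered:>9,.0f} "
      f"({unsheltered / sf_population:.1%} of the city)")
```

With those inputs, roughly 140,000 people (the 17.5% above) would need somewhere other than an official shelter to go.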

ground-shaking-map.jpg

ground-shaking-key.jpg

What is Shelter-in-Place?

Shelter-in-place is "a resident's ability to remain in his or her home while it is being repaired after an earthquake -- not just for hours or days after an event, but for the months it may take to get back to normal. For a building to have shelter-in-place capacity, it must be strong enough to withstand a major earthquake without substantial structural damage. [...] residents who are sheltering in place will need to be within walking distance of a neighborhood center that can help meet basic needs not available within their homes."

Spur's recent report "Safe Enough to Stay" estimates that San Francisco needs 95% of its housing to have shelter-in-place capacity. But currently there are 3,000 addresses (housing roughly 15% of the population) that fall into a scary category called "soft-story buildings."

LomaPrieta-Marina.jpeg

A soft-story building is characterized by a story with a lot of open space. Parking garages, for example, are often soft stories, as are large retail spaces or floors with a lot of windows.

Live in SF? What you can do:

  1. For starters, find out if your house is on the list of soft-story buildings, here. The SF Board of Supervisors also recently voted to pass a "Mandatory Seismic Retrofit Program," which will require residents, or their landlords, to fix these buildings. Might as well check your block while you're at it. If you are a renter, contact your landlord. If you're an owner, look into seismic retrofitting.
  2. Check the map above to see what sort of liquefaction zone you're in. If you're in one of the better zones, plan to stock what you need for at least 72 hours while the bigger emergencies are dealt with.
  3. Sign up here to join other San Franciscans looking for better tools to deal with these issues and we'll keep you up to date. At Recovers, we're trying to help San Francisco prepare -- and prepare smartly.

Have an idea or question? Get in touch. We want to help.

sfrecovers.jpg

Emily Wright is an experience designer and illustrator. She studied at Parsons School of Design and California College of the Arts. Before joining Recovers, Emily was a 2012 Code for America Fellow focused on crisis response and disaster preparedness. She likes pretzels, and engaging her neighbors through interactive SMS projects.

August 28 2011

15:33

LNR - #Irene - A San Francisco perspective

Examiner :: San Francisco residents are following the course of Hurricane Irene and learning in today's top news that the hurricane has now reached the New Jersey shoreline, where winds are expected to reach 75 mph. "It's a very scary situation," says San Francisco resident Katelynn Masters.

Additional media: photos and video footage of Hurricane Irene and New York City, where winds are currently at 75 mph. Click on the image to jump to the Examiner page and watch the slideshow.

Irene-san-francisco-slideshow-jpg

Continue reading Sheila O'Connor at www.examiner.com

June 04 2011

04:38

Google vs. Groupon: Groupon-like Google Offers begins testing in Portland

New York Times :: Last year, Google tried to buy Groupon but failed. So Google now hopes to beat Groupon. Eric E. Schmidt, Google’s executive chairman, said Tuesday that the company would begin testing Google Offers, a Groupon-like service delivering discounts from small businesses, starting on Wednesday in Portland, Oregon. The test will be expanded to San Francisco and New York this summer.

As the company indicated earlier this month, Google Offers will be tied to Google Wallet, a mobile application that allows people to use their phone to pay for purchases.

Continue reading Miguel Helft at bits.blogs.nytimes.com

March 15 2011

16:46

How Dotspotting Began with Colored Dots on Drains in San Francisco

sewer-250.jpg

In some ways you could say that the Dotspotting project started with San Francisco's sewage and drain system. A few years ago I started noticing some strange dot markings on the curbs of city sidewalks, directly above the storm drains like the one you see on the left.

But on closer inspection, it turned out they weren't just single dots. They seemed to be dots that had been applied, rubbed off a bit, and reapplied. Like Roman palimpsests, the curbs above drains looked like reusable canvasses -- but for dots, instead of edicts. The image below is one close-up example.

sewerclose-500.jpg

This is the kind of thing that infrastructure dorks like me obsess over for a while... and then inevitably move on to something else. (My fascination with the dots waned as I became intrigued by another piece of urban infrastructure, the cable car, which I've been using on my daily commute for about six months -- and mapping on Dotspotting).

Spotting the Dotter

A few years went by and one day I happened to see a man riding his bicycle down Hyde Street in the Tenderloin. His route and mine coincided in just the right way for me to see him stop at a storm drain, reach down and do something with a pole, make a note on a hand-held device, and slowly bike down to the next drain. Traffic being what it was on Hyde, I was able to catch up with him just in time to see him spraying different colored dots on the curb, over the ones that were already there! I was pretty excited.

After a quick conversation, I discovered that he worked on San Francisco's Mosquito Abatement Team. This is the part of the Public Utilities Commission that checks likely standing-water locations in the city and, if they find mosquitoes, deals with them. Of course, they track all of their work on GPS and so forth (stay tuned for some hopeful mappings of that data).

But in order to keep things simple -- and to make it possible to see how long it's been since a drain was inspected -- the Abatement Team coordinates their efforts so that they always know whether this inspection period is green, white, pink, and so forth. If a drain's most recent dot is the right color, it's been inspected. If not, it hasn't. It's a simple and elegant solution to a potentially onerous problem.
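
The check is simple enough to sketch in a few lines of Python. Everything specific here is a guess for illustration: the post only says that each inspection period has a known color and that the newest dot on the curb is the one that counts, so the quarterly rotation and the exact color order below are assumptions.

```python
from datetime import date

# Hypothetical color rotation; the real schedule and colors are assumptions.
PERIOD_COLORS = ["green", "white", "pink", "blue"]

def current_period_color(today: date) -> str:
    """Pick the color for the current inspection period (assumed quarterly)."""
    quarter = (today.month - 1) // 3
    return PERIOD_COLORS[quarter % len(PERIOD_COLORS)]

def is_inspected(dots_on_curb: list, today: date) -> bool:
    """A drain is up to date if its most recent dot matches this period's color."""
    return bool(dots_on_curb) and dots_on_curb[-1] == current_period_color(today)

# Example: a curb whose newest dot is pink, checked during the third quarter.
print(is_inspected(["green", "white", "pink"], date(2011, 8, 1)))  # True
```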

I love that the simple process of making the same mark over and over again, but making it slightly differently, results in this rich data set that can be understood and read if you've got the information and tools to interpret it. (As a side note, I also love having conversations with people who spray markings on the sidewalk. You think you're bothering them with your questions but I've found they're actually pretty pleased to talk to you.)

Visualizing Water Systems

Like a lot of other things, the water system is a gnarly beast that gets more interesting the more you poke at it. There's a ton of information available on the San Francisco utility website, including the awesome map of its wastewater system below -- it even includes the city's version of the Continental Divide:

Wastewater_System_Overview - 500.jpg

Back to Dotspotting

All of this is a long lead-in to the fact that Herb Dang, who runs the Operations department at SFWater, was a surprise visitor to the CityTracking Conference we've just finished cleaning up from at Stamen HQ. His presence sparked just the kind of conversation I had hoped would happen at the conference: developers interested in digital civic infrastructure talking directly with the people who hold the data and use it every day.

We learned a couple interesting things from Herb.

First, that there's a giant concrete wall around the city on the Bay side that channels all of the wastewater runoff down to the Southeast Treatment Plant (not to be confused with the Oceanside Treatment Plant, which was almost renamed the George W. Bush Sewage Plant. "Besides," locals joked, "if we name the local sewage plant after Bush, then what's left to name after Jesse Helms?")

Second, that any request for data about the location, diameter, and any other information about a public drain pipe in the city has to go through a technical review as well as a legal review. So, in addition to needing to verify that the information is correct, the water department also needs to verify that it's a legit request. You don't want people hoovering up information about drains that they could potentially slip bombs into, for example.

Every single building in San Francisco has its own set of records for water, drainage, and sewer connections, and getting information on each one of these generates its own review process. What that means is that if a team like Stamen wants to make a map of the water infrastructure near our office, we'll need to write a separate subpoena for each connection. For. Each. Connection.

Herb estimated that, within a stone's throw from the studio where the conference was held, there were about 40,000 sewer connections. "40,000 subpoenas" became a catch-phrase for the rest of the day.

How We're Implementing This Info

In any event, I'm still catching up with all of the interesting discussions that were had during the conference, but that's a pretty good representative sample. It's also a nice way to segue into the design work we've been doing on Dotspotting, which I'll be demoing briefly at South by Southwest this week.

There are a number of pretty substantial improvements, but what I'm most excited about at the moment are some big changes to the interface and overall visual look of the thing. Mapping 311 requests for Sewer Issues in District 10 used to look like the cropped image below.

dotspotting_old - 500.jpg

Now, with some swanktastic custom cartography that Geraldine and Aaron have been working on, and visual and interactive improvements that Sean and Shawn (I know, weird, right?) have been polishing and making right, it looks more like the shot below. We'll be pushing this work live some time before my SXSW presentation.

dotspotting_new - 500.jpg

So, onwards we go. Upload yer dots!

February 25 2011

16:38

Attend the San Francisco Interactive Media Summit at the University of San Francisco

Aspiration is partnering with the Department of Media Studies at the University of San Francisco to deliver a day of social media and new technology trainings at the San Francisco Interactive Media Summit, 5 March at USF.

The Aspiration team will facilitate sessions that include “Facebook and Twitter 101”, “Intro to Blogging”, “Beginning and Intermediate WordPress”, “Listening Online with a Social Media Dashboard”, “Managing Online Channels with a Publishing Matrix”, and “How to Build a Good Nonprofit Web Site for Almost Nothing.”

Visit the Summit event page to learn more and sign up.

February 17 2011

16:10

LocalWiki Tries "Open-From-The-Start" Development Process

It's time for another project update. We've been hard at work on the core of the software that will power LocalWiki. We've also been spending time running around meeting people passionate about local media and planning out many things to come.

Basic groundwork laid

whiteboard_small.jpg

Many of you know about the Davis Wiki, but what you may not know is that we developed the custom software that powers it ourselves. Back in 2004, there was just nothing else that could do everything you see on the Davis Wiki while being easy enough for most people to use. Developing the custom software was well worth the effort, but the more we learned along the way, the more we wished we could change some of the choices we had made early on and build on a better foundation.

When we got the opportunity to embark on the LocalWiki project, we knew this was our chance to take another look at the core of wiki software and rebuild it using today's technology and the lessons we learned from years of experience and analyzing our other wiki engines' code. At the most basic level, one of the things we learned was that providing even simple wiki features like editing and versioning pages was difficult and cumbersome. What's worse, if done wrong these things make it downright painful for developers to add more complex features. For instance, while there may have been lots of code to help save and track versions of pages, that code couldn't be used to help someone save and track versions of map points.

By laying a solid foundation for the LocalWiki software, we'll not only make it easier for others to create basic wiki-like systems, but also give ourselves the code to go farther with our vision of making the best software for local communities to collaborate on information. In the past couple of months we've written an extensive versioning system for the Django framework that will allow us to simplify later development; explored and refined ways to show changes between different objects, especially rendered HTML pages; begun work on our graphic editor interface; and done lots and lots of research on different technologies.
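
To make the "version anything, not just pages" point concrete, here is a minimal sketch of that pattern in plain Django. This is not the LocalWiki code or its API; every model and field name below is invented for illustration.

```python
# A minimal sketch of generic versioning in plain Django. NOT the LocalWiki
# code or API; the models and fields here are invented for illustration.
from django.db import models


class Version(models.Model):
    """One saved snapshot of some versioned object."""
    model_label = models.CharField(max_length=100)   # e.g. "pages.Page"
    object_id = models.IntegerField()
    number = models.IntegerField()                    # 1, 2, 3, ...
    data = models.JSONField()                         # field values at save time
    saved_at = models.DateTimeField(auto_now_add=True)


class VersionedModel(models.Model):
    """Abstract base: subclasses get a Version row written on every save."""

    class Meta:
        abstract = True

    def save(self, *args, **kwargs):
        super().save(*args, **kwargs)
        latest = (Version.objects
                  .filter(model_label=self._meta.label, object_id=self.pk)
                  .order_by("-number")
                  .first())
        Version.objects.create(
            model_label=self._meta.label,
            object_id=self.pk,
            number=(latest.number + 1) if latest else 1,
            data={f.name: str(getattr(self, f.name)) for f in self._meta.fields},
        )


class Page(VersionedModel):
    title = models.CharField(max_length=200)
    content = models.TextField()


class MapPoint(VersionedModel):
    """The same machinery versions map points, not just pages."""
    label = models.CharField(max_length=200)
    lat = models.FloatField()
    lon = models.FloatField()
```

The payoff of the abstract base is that Page and MapPoint get identical history handling for free, which is roughly the property the older page-only versioning code lacked.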

Opening up our development process

We want the LocalWiki project to have an open-from-the-start development process. As such, while the code isn't ready for casual contributors quite yet, we are fully opening our development process. While we have experience working in the open-source world, one thing we're new at is working full-time alongside other folks. We'll probably make some mistakes, but we want to get this right.

Are you an experienced developer who wants to get involved? Please sign up for our developer mailing list. We'll be sending out a super-geeky developer update in the next day or two.

A (tiny) space to call our own

philip_painting_small.jpg

A little over a week ago, we moved into a little hole-in-the-wall office space. After working out of a coworking space for the first two months, we felt we could be more productive without the distractions that come with sharing a space with so many (admittedly, incredibly nice and professional) people. The space in San Francisco's Mission District is tiny and barely fits two desks, but it's quiet, it's convenient, and we can stay here late into the night working. After spending a weekend furnishing it, the new space has made a huge difference in our comfort, communication, and ability to work for hours on end without interruptions. It also turns out to be cheaper than the coworking setup, which is a nice bonus.

What about the Kickstarter funds?

Several months ago, we faced a serious issue: The Knight Foundation committed funding to the software development aspects of the LocalWiki project, but essential outreach and education aspects were unfunded. With your help, we raised an absolutely essential fund through Kickstarter.com to support outreach and education in pilot communities.

Our plan for the Kickstarter fund is to hold on to it until we begin the outreach and education phase of the project, which will happen shortly after the first pilot community is selected.

February 09 2011

21:13

San Francisco GiveCamp Planning Meeting

 

GiveCamp is a national program sponsored by Microsoft, O’Reilly, Telerik and many other companies involved in software development. GiveCamp events are weekend-long hackathons where developers volunteer their time to create software solutions for nonprofits. We are currently in the planning stages for a GiveCamp event in downtown San Francisco. Anyone who is interested in participating should attend the planning meeting on the evening of Tuesday, February 15 at the Microsoft office on Market Street. Please see our Meetup group for details and to sign up for the meeting.

We’re looking for designers and developers regardless of your platform and specific expertise; if you don’t have software development skills to share you can still be a great help by becoming an assistant organizer or project manager. We also encourage non-profits to attend the planning meeting to make sure we can put the right team together to fit your needs. 

 

21:10

'Data and Cities' Conference Pushes Open Data, Visualizations

When I entered Stamen's offices in the Mission district of San Francisco, I saw four people gathered around a computer screen. What were they doing? Nothing less than "mapping the world" -- not as it appears in flat dimension, but how it reveals itself. And they weren't joking. Stamen, a data visualization firm, has always kept "place" central to many of their projects. They achieved this most famously through their crimespotting maps of Oakland and San Francisco, which give geographical context to the world of crime. This week they are taking on a world-sized challenge as they host a conference that focuses on cities, interactive mapping, and data.

The conference is part of Stamen's Citytracking project, funded by a Knight News Challenge grant: an effort to provide the public with new tools to interact with data as it relates to urban environments. The first part of this project is called Dotspotting, and it is startling in its simplicity. While still in an early beta stage, the project aims to create a baseline map by placing linkable dots on locations to yield data sets. The basic idea is to strike a balance between the free, but ultimately not-yours, nature of Google Maps and the infinitely malleable, but overly nerdy, open-source stacks that are out there.
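
To illustrate what a "dot" might look like as data, here is a toy sketch that turns a spreadsheet of locations into simple, linkable dot records. The CSV column names and the record structure are assumptions for illustration, not Dotspotting's actual upload format or schema.

```python
import csv

def rows_to_dots(path):
    """Read a CSV with latitude/longitude columns into simple dot records."""
    with open(path, newline="") as f:
        return [
            {
                "lat": float(row["latitude"]),
                "lon": float(row["longitude"]),
                "title": row.get("title", ""),
                "url": row.get("url", ""),  # the link that makes a dot "linkable"
            }
            for row in csv.DictReader(f)
        ]

# e.g. dots = rows_to_dots("sewer_311_requests.csv")  # hypothetical file name
```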

dotspotting crop.jpg

With government agencies increasingly expected to operate within expanded transparency guidelines, San Francisco passed the nation's first open data law last fall, and many other U.S. cities have started to institutionalize this type of disclosure. San Francisco's law is basic and seemingly non-binding. It states that city departments and agencies "shall make reasonable efforts" to publish any data under their control, as long as the data does not violate other laws, in particular those related to privacy. After the law was passed unanimously by the Board of Supervisors (no small feat in this terminally fractious city), departments have been uploading data at a significant rate to our data clearinghouse website, datasf. While uploading data to these clearinghouses is a first step, finding ways to truly institutionalize this process has been challenging.

Why should we care about open data? And why should we want to interact with it?

While some link the true rise of the open data movement to the most recent recession, the core motivation behind it has always been active citizenship, something inherent to the nature of a citizenry. Open data in this sense can mean the right to understand the social, cultural, and societal forces constantly in play around us. As simultaneously the largest consumers and producers of data, cities have a responsibility to engage their citizens with this information. Gabriel Metcalf, executive director of SPUR (San Francisco Planning and Urban Research), and I wrote more about this in our 2010 year-in-review guide.

Stamen's Citytracking project wants to make that information accessible to more than just software developers, at a level of sophistication that allows for both real analysis and widespread participation. Within the scope of this task, Stamen is attempting to bring together democracy, technology, and design.

Why is this conference important?

Data and Cities brings together city officials, data visualization experts, technology fiends, and many others who fill in the gaps between these increasingly related fields. Stamen has also designed the conference to have a mixture of formats, from practical demonstrations to political discussions to highly technical talks.

According to Eric Rodenbeck, Stamen's founder and CEO, "This is an exciting time for cities and data, where the literacy level around visualization seems to be rising by the day and we see huge demand and opportunity for new and interesting ways for people to interact with their digital civic infrastructure. And we're also seeing challenges and real questions on the role that cities take in providing the base layer of services and truths that we can rely on. We want to talk about these things in a setting where we can make a difference."

Data and Cities will take place February 9 - 11 and is invitation-only. In case you haven't scored an invitation, I'll be blogging about it all week.

Selected Speakers:

Jen Pahlka from Code for America - inserting developers into city IT departments across the country to help them mine and share their data.

Adam Greenfield from http://urbanscale.org/ and author of Everyware. Committed to applying the toolkit and mindset of interaction design to the specific problems of cities.

Jay Nath, City of San Francisco
http://www.jaynath.com/2010/12/why-sf-should-adopt-creative-commons, http://datasf.org

January 19 2011

18:30

“Gee, you guys are spending an awful lot of money”: The Bay Citizen editor on funding quality news

Seven months into its bid to reinvent the metro newspaper, The Bay Citizen, the San Francisco-based nonprofit news site, has so far raised a total of $14.5 million in philanthropic gifts, rolled out daily online news and culture coverage with a 26-person staff, and, during November, attracted a monthly audience of approximately 200,000 unique visitors. It’s on track to spend $4 million during its first year.

I interviewed editor-in-chief Jonathan Weber in The Bay Citizen’s downtown San Francisco office, and later by e-mail and over the phone, to find out what he’s learned from the site’s first half-year of operation — editorially and financially. This is the first in a two-part series.

“There is nothing especially virtuous about being broke”

In a world where many local nonprofit startups are shoestring operations run by refugees from downsized or shuttered metro papers, The Bay Citizen’s relatively large budget continues to attract scrutiny — and some hostility. (As a quick comparison, the national investigative nonprofit ProPublica spent approximately $9.3 million last year, and the local civic news outlet Voice of San Diego spent approximately $1 million.)

“I’m honestly mystified as to why so many journalist-commentators seem to think that spending real money on journalism is a bad thing,” Weber told me. “I’ve been there, and there is nothing especially virtuous about being broke.” Moreover, he said, “I would challenge anyone to take a hard look at what we do — and I mean really dive in in a serious way over a period of time — and tell me that we are wasting money.”

F. Warren Hellman, the San Francisco investor who provided $5 million in seed money for The Bay Citizen, initially described it as a journalistic mainstay during the “inevitable” demise of local newspapers, and said it “might put journalism, broadly defined, on a much more stable foundation.”

Since then, the outlet has emerged as a general interest site for the entire Bay Area: It provides lists of weekend events, covers breaking news, and has even commissioned local author and artist Dave Eggers to produce a series of whimsical sketches of a World Series game. Instead of focusing, as most sites do, on a smaller geographical area, or a content vertical (like the Gawker Media blogs, or NPR’s local, topic-based Argo blogs, which launched this fall), The Bay Citizen is assuming the entire portfolio of a print paper.

“Others might disagree, but I have never seen any critique related to what we actually do journalistically,” Weber said. “It’s sort of this abstract, ‘Gee, you guys are spending an awful lot of money’ — and that kind of criticism makes no sense to me.”

The latest debate over The Bay Citizen’s finances came late last month, after an item in the Chronicle detailing (and mocking) The Bay Citizen’s solicitation of $50 memberships implied that the outlet had spent all its $5 million in seed money — rather than the $4 million it had actually spent. (The Chron item also didn’t mention the additional $9.5 million the organization had raised.) Other journalists involved in smaller nonprofit and local news ventures tweeted their skepticism, including Howard Owens, publisher of the online-only Batavian in western New York, who wrote, “My question is, why do they need more than $1mill operational cost per year in SF?”

Weber responded that for a staff of 26, a $4 million budget was reasonable. (Steve Katz, publisher of the San Francisco-based nonprofit magazine Mother Jones, backed up that math.) But The Bay Citizen is also finding ways to amplify the work of its staff. Perhaps its most innovative step so far has been to position itself as a partner and umbrella site for the Bay Area’s many hyperlocal blogs.

“A different philosophical view about partnership”

The content on The Bay Citizen’s website is the product of a “range of different relationships,” Weber notes. On the front page, for instance, there are articles by staff reporters and paid freelancers. There is also content from the outlet’s community blog partners, who typically get paid $25 for every article The Bay Citizen re-posts from their sites. (The re-postings also appear on pages that are branded with the blog partners’ names and three additional links to articles on their homepages.) Weber has said repeatedly that he wants The Bay Citizen to be “a connector and a hub for an emerging ecosystem” of local blogs.

The site also features a Citizen Blog, which is open to pretty much anyone who wants to blog on local topics. (The Chron features a similar mix of content on its homepage, including citizen blog posts and stories from local partner sites, together with national wire stories, a “Daily Dish” of entertainment news, sports coverage, photo slideshows, and, of course, lots of advertising.) The Bay Citizen’s homepage features a single ad, as well as a jar of change with the slogan “$1 a week helps. Save Independent Reporting.”

The Bay Citizen’s local blog partnerships also include joint reporting projects between staffers and outside bloggers. The finished articles run both on the Bay Citizen and the local blog. They’re partnerships, Weber said, that can bring together the inside-baseball knowledge of local bloggers with the bigger-picture political perspective of staff reporters. “We have a different philosophical view about partnership and the role of non-staff people of various descriptions, and what role they play in the bigger project,” he notes. “I think traditionally mainstream media organizations have always had a religious view that ‘all news comes from here’ and ‘we don’t really publish other people’s news,’ and we definitely don’t.”

The Bay Citizen has also found “a sweet spot in mid-range enterprise news,” Weber said, as in its story about a payment scandal in the San Francisco Unified School District. These aren’t three-month, “capital I-investigative reporting” projects, as Weber put it, but quicker stories that might need only a single records request to pull together. (The Center for Investigative Reporting and its offshoot California Watch, which specialize in long-term investigative reporting projects, are right across the Bay in Berkeley.)

The value of business experience

While the idea for The Bay Citizen was conceived at a time when the San Francisco Chronicle was hemorrhaging millions and seemed close to shutting down, the outlet is now competing with a more stable Chronicle (whose print circulation, at last reporting, was 223,549 on weekdays) as well as a slew of other Bay Area news outlets, large and small. It’s doing so with the ambitious plan of leveraging its first few years of philanthropic funding into the kind of popular support that makes public broadcasting-style membership drives viable.

For all that, Weber said, employing a large staff — with business-side as well as journalistic expertise — makes sense. “The rationale on staff size is pretty simple,” he notes. “If you’re going to bite off something big and ambitious like doing daily and enterprise news and multimedia on a wide range of subjects for a large region, and producing 2 pages twice a week for The New York Times, you need the people to do it. ‘Big’ is a relative term. We have a big staff compared with New West or many other local start-ups, but we’re very small compared with any metro newspaper, and also smaller than ProPublica and CIR, as comparisons.”

While the $400,000 salary of Lisa Frazier, The Bay Citizen’s CEO, has generated particular criticism ever since it was announced last year, Weber has repeatedly said that “journalists tend to undervalue business experience.” And he told me that The Bay Citizen’s four-part revenue plan — which starts with large gifts and grants, and then aims to ramp up membership revenue over several years, bringing in additional money through syndication and underwriting — is complicated enough to need a sophisticated business manager. He also noted that The Bay Citizen’s ability to raise so much money in large gifts is indicative of the fact that major donors feel more comfortable giving to organizations with experienced businesspeople at the helm.

How does Weber expect it all to pay off? “By creating a great news operation that produces and supports important and interesting journalism and attracts a wide audience, which in turn will create financial support.”

October 29 2010

20:39

5Across: Politics in the Age of Social Media

news21 small.jpg

5Across is sponsored by Carnegie-Knight News21, an alliance of 12 journalism schools in which top students tell complex stories in inventive ways. See tips for spurring innovation and digital learning at Learn.News21.com.

As more people use social media such as Twitter and Facebook, politicians and campaigns need to put more time, energy and money into reaching people there. According to the E-Voter Institute, 80% of people who are avid social network users consider themselves to be occasionally or very active in politics. And 34% of them rely on social networks for general information, up from 29% last year. (You can get more statistics and data on social networking use and politics in this great MediaShift report from Anthony Calabrese.)



mediashift_politics 2010 small.jpg

So for this month's episode of 5Across, I brought together people involved in politics and social media, looking at it from many angles. A local San Francisco politician, Phil Ting, discussed what he calls "user-generated government" and how online discussions can help shape policy. We also talked about the importance of being authentic on social media, and we questioned why campaigns continue to spend billions of dollars on TV ads while barely spending anything online. Finally, we discussed the exciting advent of open data from local and federal governments in the U.S., and the rise of mobile apps in campaigning -- and even fixing potholes. Check it out!

5Across: Politics + Social Media

polishift.mp4

>>> Subscribe to 5Across video podcast <<<

>>> Subscribe to 5Across via iTunes <<<

Guest Biographies

Ngaio Bealum describes himself on Twitter as "a comedian, magazine publisher, juggler, musician, parent, activist, Sacramentan, and a great cook. I also like hard beats and soft drugs." Bealum has been actively supporting the California initiative, Proposition 19, to legalize marijuana in the state.

Marisa Lagos covers state politics and government for the San Francisco Chronicle, including elections, the legislature and issues such as prisons and welfare. Over the past year her coverage has ranged from stories on the attorney general race and budget crisis to sex offender laws and legislation aimed at making sure consumers know whether they are wearing faux fur or raccoon dog (seriously). Previously, she worked at the Los Angeles Times and SF Examiner. She has written exclusively for the web, blogged and used social media to promote her work.

As communications and media director, Mary Rickles spends her days writing about Netroots Nation and getting others to do the same. She has a unique background in both traditional and new media, having worked as a reporter and with campaigns, agencies, non-profits and corporate companies on projects ranging from brand development to community outreach. She previously was communications director for the grassroots powerhouse Democracy for America and in 2009 was named one of New Leaders Council's Top 40 Under 40 Emerging Leaders. Mary grew up in Birmingham, Ala., where she got her first taste of politics by volunteering for Don Siegelman's gubernatorial campaign.

As Assessor-Recorder of San Francisco, Phil Ting is a reformer whose efforts have enabled him to generate over $245 million in new revenue for San Francisco.
Ting began his career as a real estate financial advisor, working at Arthur Andersen and CB Richard Ellis. Prior to serving as the Assessor-Recorder, Ting also had a long history of civil rights advocacy -- he was the executive director of the Asian Law Caucus. He is past president of the Bay Area Assessors Association and has served on the board of Equality California Institute.

Theo Yedinsky started Social Stream Consulting, a social media and political strategy firm and is a partner in the Oakland-based social media firm, North Social. In 2009, Theo Yedinsky served as the new media director for San Francisco Mayor Gavin Newsom's campaign for Governor of California. At the time, Mayor Newsom's campaign boasted the largest Facebook and Twitter following for a non-presidential Democratic candidate in the country. Prior to joining the Newsom campaign, Theo served as the first executive director of the New Politics Institute, a think-tank designed to study the increasing impact of technology and new media in political campaigns. Prior to launching the New Politics Institute, he managed Simon Rosenberg's campaign to become chairman of the Democratic National Committee and worked extensively on Senator Kerry's campaign for President.

If you'd prefer to watch sections of the show rather than the entire show, I've broken them down by topic below.

User-Generated Government

Authenticity Online

The Power of Facebook

Buying Ads Online

Open Data and Mobile Apps

Credits

Mark Glaser, executive producer and host
Corbin Hiar, research assistant

Jason Blalock, camera

Julie Caine, audio

Location: Vega Project & Kennerly Architecture office space in San Francisco

Special thanks to: PBS and the Knight Foundation

Music by AJ the DJ

*****

What do you think? Which politicians are doing the best job of utilizing social media? Which mobile apps are helping you get local information? Share your thoughts in the comments below.

Mark Glaser is executive editor of MediaShift and Idea Lab. He also writes the bi-weekly OPA Intelligence Report email newsletter for the Online Publishers Association. He lives in San Francisco with his son Julian. You can follow him on Twitter @mediatwit.


October 27 2010

19:00

MediaBugs revamps its site with a new national focus

When it launched in public beta earlier this year, MediaBugs, Scott Rosenberg’s Knight News Challenge-winning fact-checking project, was focused on correcting errors found in publications in the Bay Area. Today, though, Mediabugs.org has undergone a redesign — not just in its interface (“just the usual iterative improvements,” Rosenberg notes), but in its scope. Overnight, MediaBugs has gone national.

Part of the site’s initial keep-it-local logic was that, as a Knight winner, the project had to be small in scope. (The News Challenge stipulates that projects focus on “geographically defined communities,” although this year they’ve loosened up that rule a bit.) But part of it was also an assumption that community is about more than geography. “My original thesis was that, first of all, it would be valuable to work on a small scale in a specific metropolitan area,” Rosenberg told me — valuable not only in terms of developing personal relationships with editors who oversee their publications’ correction efforts, but also as a way to avoid becoming “this faceless entity: yet another thing on the web that was criticizing people in the newsrooms.”

And while the community aspect has paid off when it comes to newsroom dealings — Rosenberg and his associate director, Mark Follman, have indeed developed relationships that have helped them grow the project and the cause — MediaBugs has faced challenges when it comes to “community” in the broader sense. “It’s been an uphill battle just getting people to participate,” Rosenberg notes. Part of that is just a matter of people being busy, and MediaBugs being new, and all that. But another part of it is that so much of the stuff typical users consume each day is regional or national, rather than local, in scope. When he describes MediaBugs to people, Rosenberg notes, a typical response will be: “Great idea. Just the other day, I saw this story in the paper, or I heard this broadcast, where they got X or Y wrong.” And “invariably,” he says, “the X or Y in question is on a national political story or an international story” — not, that is, a local one.

Hence, MediaBugs’ new focus on national news outlets. “I thought, if that’s what people are more worked up about, and if that’s what they want to file errors for,” Rosenberg says, “we shouldn’t stand in their way.”

The newly broadened project will work pretty much like the local version did: The site is pre-seeded (with regional and national papers, magazines, and even the websites of cable news channels), and it will rely on users to report errors found in those outlets and others — expanding, in the process, the MediaBugs database. (Its current data set includes not only a list of media organizations, their errors, and those errors’ correction status, but also, helpfully, information about outlets’ error-correction practices and processes.)

For now, Rosenberg says, the feedback loop informing news organizations of users’ bug reports, which currently involves Rosenberg or Follman contacting be-bugged organizations directly, will remain intact. But it could — and, Rosenberg hopes, it will — evolve to become a more self-automated system, via an RSS feed, email feed, or the like. “There isn’t really that much of a reason for us to be in the loop personally — except that, at the moment, we’re introducing this strange new concept to people,” Rosenberg notes. “But ultimately, what this platform should really be is a direct feedback loop where the editors and the people who are filing bug reports can just resolve them themselves.” One of the inspirations for MediaBugs is the consumer-community site Get Satisfaction, which acts as a meeting mechanism for businesses and the customers they serve. The site provides a forum, and it moderates conversations; ultimately, though, its role is to be a shared space for dialogue. And the companies themselves — which have a vested interest in maintaining their consumers’ trust — do the monitoring. For MediaBugs, Rosenberg says, “that’s the model that we would ultimately like.”
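
To make the "via an RSS feed" idea a little more concrete, here is a sketch of one way a per-outlet feed of open bug reports could be generated using only Python's standard library. The bug record fields and the feed layout are assumptions for illustration; nothing here is MediaBugs' actual implementation.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone
from email.utils import format_datetime

def bugs_to_rss(outlet, bugs):
    """Build a minimal RSS 2.0 feed of open bug reports for one outlet."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"MediaBugs reports for {outlet}"
    ET.SubElement(channel, "link").text = "http://mediabugs.org/"
    ET.SubElement(channel, "description").text = "Open error reports"
    for bug in bugs:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = bug["summary"]
        ET.SubElement(item, "link").text = bug["url"]
        ET.SubElement(item, "pubDate").text = format_datetime(bug["reported_at"])
    return ET.tostring(rss, encoding="unicode")

# Hypothetical bug report, just to show the shape of the data.
print(bugs_to_rss("Example Gazette", [{
    "summary": "Misspelled source name in city council story",
    "url": "http://mediabugs.org/bugs/123",           # made-up URL
    "reported_at": datetime(2010, 10, 27, tzinfo=timezone.utc),
}]))
```

An editor subscribing to a feed like this would see new reports without anyone at MediaBugs having to get in touch personally, which is the hands-off loop Rosenberg describes.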

To get to that point — a point, Rosenberg emphasizes, that at the moment is a distant goal — the MediaBugs infrastructure will need to evolve beyond MediaBugs.org. “As long as we’re functioning as this website that people have to go to, that’s a limiting factor,” Rosenberg notes. “We definitely want to be more distributed out at the point where the content is.” For that, the project’s widget — check it out in action on Rosenberg’s Wordyard and on (fellow Knight grantee site) Spot.us — will be key. Rosenberg is in talks with some additional media outlets about integrating the widget into their sites (along the lines, for example, of the Connecticut Register-Citizen’s incorporation of a fact-checking mechanism into its stories); but the discussions have been slow-going. “I’m still pretty confident that, sooner or later, we’ll start to see the MediaBugs widget planted on more of these sites,” Rosenberg says. “But it’s not anything that’s happening at any great speed.”

For now, though, Rosenberg will have his hands full with expanding the site’s scope — and with finding new ways to realize the old idea that, as he notes, “shining any kind of light on a subject creates its own kind of accountability.” And it’ll be fascinating to see what happens when that light shifts its gaze to the national media landscape. “That dynamic alone, I think, will help some of the publications whose sites are doing a less thorough job with this stuff to get their act together.”

September 10 2010

10:23

San Francisco Online Community MeetUp Wed. 9/22 with speaker Randy Farmer

For our September MeetUp, we are thrilled to have longtime online community expert, Randall Farmer, as our guest speaker. We'll also be joined by Bill Johnston, Head of Global Community at Dell, who will be facilitating the event.

Randy will be speaking about managing reputation systems in online communities. His talk will lead into an informal discussion moderated by Bill. Come join us for an evening of online community strategy and in-person networking with fellow online community enthusiasts.

read more

February 05 2010

12:00

Moving on After the Knight News Challenge

In 2008, the Open Media Foundation (then Deproduction) received a $380,000 Knight News Challenge award, and it was a major turning point for our organization. We added staff, formed new partnerships, and maintained a level of growth that had us approximately doubling in size each year over our first five years after forming in 2004.

The Open Media Project grant funds a four-part effort that began with rebuilding the software we developed to automate an unprecedented approach to user-generated and community-powered TV. The second phase saw our team implement this rebuilt Drupal software and business model in six additional public access stations across the country. Third, we took the lessons learned from the beta-test implementations and released an installation profile that incorporates the contributions and lessons from the seven beta-test sites.

The fourth and final phase has our team focused on content-sharing among these stations, enabling us to cooperate as a true network by sharing the top-voted content from each station, and building a collection of truly engaging content unlike anything else you can find on TV. As we tackle this fourth phase, we are also facing the challenge of sustaining this project (and our team) without ongoing support from the Knight Foundation.

Earned Income

From the beginning, we anticipated that the long-term sustainability of the Open Media Foundation would be based primarily on earned income. We hoped the success of the Open Media Project would generate a strong demand from public access TV stations and other organizations looking for support in implementing a similar model. This approach enabled Denver Open Media to thrive even without the general operating support most public access TV stations enjoy from their local government or cable operators.

Our first such client arose in San Francisco after the city drastically cut operating support for public access and then selected the Bay Area Video Coalition to launch their new public access TV stations, SF Commons. We have found a great partner in BAVC. They are now poised to set a new standard for participatory, community media, and are committed to be a part of an open source movement that has each of us benefiting from the investments of the others. The earned income from this project (and others to follow) will hopefully help our team sustain its success and continue to build upon the expertise we've gained over the past five years.

Cooperation and Partnerships

No successful open source project can be carried by a single organization. The Drupal modules we've developed have been downloaded by over 100 organizations, ranging from public access TV to community colleges. Several of these partners have contributed back to the software in ways that are benefiting the entire community. But this hasn't come about easily.

Over the past decade, many public access TV stations have developed open source software, but few projects are built in a way that enables the software to be truly useful in other environments. Our initial foray into Open Media Project tools included myopic code and assumptions that made the software more difficult to leverage in other environments than if we'd started from scratch.

Developing the code in a manner that makes it useful in diverse environments involves a sacrifice that few organizations have been willing or able to make. It requires investing resources in development that we hope will pay off in the future when partners use and contribute back to the code.

Early partners made the same mistake as us, investing hundreds of thousands of dollars into code that is practically useless in any setting other than their own. The Knight News Challenge award enabled us to take the time to better collaborate with the Drupal community, host code sprints, attend conferences, and, ultimately, back-track and re-design a more extensible code base.

With our grant period soon coming to an end, we have a number of partners poised to take the reins and collectively help ensure the continued growth of the project. Davis Media Access in California has devoted significant time to improving the code and is a clear success story. Their work has, among other things, extended the OMP code to integrate with a new broadcast server.

Our growing relationship with Tightrope Media Systems, and their recent commitment to open source software, can largely be credited to the efforts of Darrick Servis and Davis Media Access. Other successes and failures of the beta test process are equally valuable. Ongoing cooperation with Boston Neighborhood Network, Channel Austin, and others will continue to yield benefits to the project.

We're most excited about our newest partner: the Bay Area Video Coalition. They bring a commitment to open source collaboration that we've not yet seen in previous partners. Everything about their SF Commons effort gives us confidence that they will set a new example for the next generation of networked, user-driven public access TV. Though their operating support is meager, they have strong, visionary leadership in Ken Ikeda and Jen Gilomen. They also stand to benefit from their close proximity to organizations like Archive.org, Creative Commons, and the Wikimedia Foundation, all of whom inspired our software and business model from the beginning.

Even if the Open Media Foundation were to shut its doors, I'm confident that organizations like BAVC would keep the project alive and growing... of course, we're working on making sure that isn't the case.

Expanding the Open Media Project

While earned income has the potential to maintain the level of activity we've enjoyed here for the past two years, our true vision of building an entirely new kind of participatory media network is going to require a significant ramp-up of the project. The Broadband Technology Opportunities Program (BTOP) funding available through the stimulus plan represents a once-in-a-lifetime opportunity to do just that.

We partnered with Free Speech TV, the Alliance for Community Media, and 20 other public access TV stations across the country to apply for $2.2 million to expand the Open Media Project. The proposal addresses the many lessons learned from our Knight-funded beta test, and proposes a more self-contained and supported solution that can transform a wide range of public access TV stations into gateways for broadband adoption for disconnected communities.

Statistics show that the primary factor preventing individuals from using broadband is not a lack of infrastructure, but the perception that the Internet is not relevant to their lives. Our partner stations will encourage and support these communities by conveying the relevance of broadband access from the perspective of those communities. Together with Free Speech TV, we will collect the best of this content and provide national exposure to perspectives on broadband's relevance that simply haven't been seen before.

In case our first round application doesn't receive funding, we've invested heavily in planning an application in response to the second BTOP opportunity for funding. I encourage other Knight News Challenge grant recipients (and rejectees) to read the Notice of Funds Availability and investigate if their Knight News Challenge project would be a candidate for BTOP funding.

Regardless of future grants and funding, we are optimistic about the future of the project. We've had our share of pitfalls, but that's to be expected when you're pioneering new territory. The Knight News Challenge experience has opened doors and helped our organization grow in a way that will forever alter our work. If we can sustain the project beyond our KNC award, we'll be part of an entirely new kind of non-commercial media system, serving interests and engaging communities that are left out of the commercial media conversation.

Every change begins with a new conversation.


January 07 2010

20:53

A 20-DAY BLOG ABOUT THE APPLE TABLET LAUNCH

2010-01-07_2038

Tomorrow I am starting a 20-day blog in Spanish at www.lainformacion.com about the launch of the new Apple tablet.

It will end in San Francisco on January 27th, when Steve Jobs presents the tablet.

In less than 24 hours, Mario Tascon, Vanesa Jimenez, Jorge Martin Luengo and Antonio Pasagali from Diximedia made the miracle happen.

What a team!

And thanks, too, to Luis Grañena, who will draw one superb caricature every day just for the blog.

See above for the first one, about Steve Jobs.

January 06 2010

09:50

WHY THE APPLE iSLATE WILL CHANGE THE MOBILE INTERNET MEDIA MARKET

apple-table-ipad-itablet-macbook-touch9

If you need a few reasons to understand why the Apple iSlate is going to lead the new Mobile Internet Media Market Revolution, just read these four slides from a recent Morgan Stanley report:

1. MOBILE INTERNET IS THE NEXT BIG MEDIA TREND, SO MOBILE INTERNET DEVICES LIKE THE APPLE iSLATE WILL LEAD NEWS MEDIA CONSUMPTION

Tech Cycles Last 10 Years

2. THE NEW MOBILE INTERNET DEVICES WILL BE MORE THAN JUST PHONES

New Computing Cycles

3. AND MOBILE PHONE USAGE IS MORE AND MORE ABOUT DATA, NOT VOICE.

Mobile Phone Usage is Data not Voice

4. MOBILE INTERNET IS RIGHT NOW THE FASTEST GROWING COMMUNICATIONS TECHNOLOGY EVER LAUNCHED

Mobile Internet Adoption

So, if you are in the newspaper business, let me suggest reading this report.

Why?

Because you need to hold the Mobile Internet card if you want to play in the media games of the future.

It’s a key strategic issue.

And that’s the reason I will be in San Francisco on January 27th, when Apple launches the multimedia platform of the future.

Steve Jobs knows that this is going to be the most dramatic new product presentation of his life.

And this is why, yesterday, Apple announced the acquisition of Quattro, a leading mobile advertising company.

Get the full Mobile Internet Report here.

(Thanks to Kerry J. Northrup)

November 20 2009

14:00

The FTC should give nonprofit news a closer look

You know the old saying about how we’re from the government and we’re here to help you? That’s what came to mind as I read the Federal Trade Commission’s notice for its workshop on journalism in the digital age.

The notice makes the case that “news organizations,” which it notably does not attempt to define, are suffering at the hands of aggregators and other online actors that have drained the fun and profit from news gathering. Among the solutions the FTC wants to examine are some that would seem to support nonprofits — tax treatment and greater public funding, for example.

Memo to the FTC: No thanks.

It’s not that the FTC’s proposed solutions are so bad, though I don’t much like the idea of government funding non-broadcast news operations. It’s that they provide fresh fodder for misinformed critics who have come to the conclusion that nonprofits pose a threat to for-profit news sites and journalism generally.

Mention “nonprofit” to some of these folks, and you’re likely to get an allergic reaction. No sooner had San Francisco investor Warren Hellman ponied up $5 million for the Bay Area News Project than somebody complained errantly that the new venture would rely on unpaid college students, forcing other media to cut staff to remain competitive. News flash: Old media aren’t competitive in the online age, and that isn’t the fault of Warren Hellman or any nonprofit. Others fretted that donated money like Hellman’s comes with agendas and strings attached. And advertising dollars don’t?

But I digress. Nonprofits offer a viable solution to the decline of socially responsible journalism. By design, they put mission ahead of profit. And as a result, they will live or die based on their commitment to transparency. When the government gets involved, it introduces the appearance of special favors and the potential for political interference. That’s the death of transparency.

To be clear, I don’t object to the notion of government oversight. A little can go a long way — witness the FTC’s late-1990s antitrust investigation of Intel Corp. At the time, Intel dominated the computer chip market and, along with Microsoft Corp., seemed capable of devouring anything in its path, much as Google appears today. But just before trial began in 1999, Intel signed a settlement with the FTC in which it admitted no guilt and essentially agreed to be nicer to the smaller kids in the technology sandbox.

Based on this experience, we can assume that what the FTC workshop really hopes to accomplish is to once again nudge the bullies into being nicer. I would submit that there are better ways to accomplish this goal. One might be to bring in witnesses who can explain how the nonprofit model works and how it complements the work of for-profits in journalism and other sectors.

My nomination would go to Duke’s Jay Hamilton, author of All the News That’s Fit to Sell, which is cited in the FTC notice. In the book, Hamilton makes the case that journalism is becoming a public good. He writes:

The point here is that since individuals do not calculate the full benefit to society of their learning about politics, they will express less than optimal levels of interest in public affairs coverage and generate less than desirable demands for news about government.

I do agree with the FTC that the stakes are high because unlike the great oil and steel trusts of old, the big powerhouses of the Internet are in the business of ideas. As Bill Kovacic, then a law professor at George Washington University and now an FTC commissioner, told me during the Intel case: “I think the impact is so important because its impact on information services affects everything we do.”

The FTC workshop will be held in Washington Dec. 1-2.
