At Facebook, Axten isn't some
fringe employee doing unmentionable work. The 26-year-old Stanford grad is one
of some 150 people the young company employs to keep the site clean—out of a
total head count of 850. Facebook describes these staffers as an internal
police force, charged with regulating users' decorum, hunting spammers and
working with actual law-enforcement agencies to help solve crimes. Part hall
monitors, part vice cops, these employees are key weapons in Facebook's efforts
to maintain its image as a place that's safe for corporate advertisers—more so
than predecessor social networks like Friendster and MySpace. "[They were]
essentially shanghaied by pornography and sexual displays," says David
Kirkpatrick, author of the forthcoming book "The Facebook Effect."
It's a tricky job: by insisting that users sign up under real names and refrain
from posting R-rated photos, Facebook hopes to widen its user base to include
upscale professionals, but at the same time it's aware that too much
heavy-handed censorship could upset its existing members. "If [Facebook]
got polluted as just a place for wild and crazy kids, that would destroy the
ability to achieve the ultimate vision, which is to create a service for
literally everyone," Kirkpatrick says—and then its potential for profits
would disappear, too.

Internet companies have long
grappled with illicit postings. As far back as 1993, AOL's "community
action teams" were reviewing e-mail and chat-room activity. Craigslist has
long been beset by ads for prostitution; in November, the site began cooperating
with attorneys general to curb posts to its "Erotic Services"
section, and last month Boston police apprehended a med-school student later
charged with murdering a woman who'd placed a "massage services" ad
on the site. In 2005, as user-generated content platforms exploded at sites
like YouTube, Flickr and Digg, the need to screen content grew rapidly as well,
increasing demand for online cops.