
Sat, 17 Apr 2021

Content Moderation Case Study: Friendster Battles Fakesters (2003)
Furnished content.


Summary: While the social media/social networking space today is dominated by Facebook, it's interesting to look at how Facebook's predecessors dealt with content moderation challenges as well. One of the earliest social networks to reach mainstream recognition was Friendster, founded by Jonathan Abrams in 2002 and launched in early 2003, gaining millions of users who signed up to connect with friends. Originally built as a dating site, it quickly expanded beyond that.

One of the first big content moderation questions the site faced was whether or not to allow "fakesters." As the site grew rapidly, one popular use was setting up fake accounts -- accounts for completely made-up fictional characters (e.g., Homer Simpson), concepts (e.g., Pure Evil), random objects (e.g., Giant Squid), or places (e.g., New Jersey). Researcher danah boyd catalogued the different types of fakesters and studied the phenomenon of fake accounts on the site.

However, Abrams quickly decided that fakesters went against the ethos of the site he envisioned. In a 2003 SF Weekly article discussing the fakester issue, Abrams makes it clear that such accounts do not belong on the site, even if some people find them amusing:
In early July, Friendster's affable chief operating officer, Kent Lindstrom, told me the only fakesters that the company would likely remove would be ones it received complaints about. (On Friendster, users can flag somebody's profile for the company to review, and write comments about why it offended them.) But Abrams shakes his head emphatically when I mention this. "No. They're all going," he says, his voice steely. "All of them."
As the article notes, the fakesters were often the most active users on the site, but that did not change Abrams' mind about whether or not they belonged there:
Though they are some of Friendster's most ardent fans (many spend several hours a day on the site), fakesters do everything they can to create anarchy in the system. They are not interested in finding friends through prosaic personal ads, but through a big, surreal party where Jesus, Chewbacca, and Nitrous are all on the guest list. To fakesters, phony identities don't destroy the social experience of Friendster; they enrich it.

But fakesters aren't hosting this gig. Jonathan Abrams, the 33-year-old software engineer who founded Friendster to improve his own social life, is, and he abhors the phony profiles. He believes they diminish his site's worth as a networking tool and claims that fakesters' pictures (often images ripped off the Web) violate trademark law. Abrams' 10-person Sunnyvale company has begun ruthlessly deleting fakesters and plans to eventually eradicate them completely from the site.
A few months later, an article in Salon laid out the growing conflict between those who found the fakesters fun and Abrams, who remained adamantly against them.
Giant Squid is not alone: Among the "Fakesters" who've signed up for Friendster are Jackalope, God, Beer, Drunk Squirrel, Hippie Jesus, Malcolm X and more than a dozen Homer Simpsons. Just like regular users, they post their photos, blab on bulletin boards and collect friends like so many baseball cards. Some, staying in character, even write gushing testimonials about their friends: What higher endorsement could there be than a few complimentary words from Homer himself? "Better than a cold can of Duff beer ..."

But while it may be amusing to invite God himself into your pool of friends and get back the message, "God is now your friend," the founder of the site says that such chicanery only distorts his system.

"Fake profiles really defeats the whole point of Friendster," says entrepreneur Abrams, interviewed by cellphone as he waited to catch a plane in Los Angeles. "Some people find it amusing, but some find it annoying. And it doesn't really serve a legitimate purpose. The whole point of Friendster is to see how you're connected to people through your friends," he says.
Decisions to be made by Friendster:

Questions and policy implications to consider:

Resolution: While Friendster continued its fight against fakester profiles, apparently the company's vehement stance against such profiles did not apply to monetization opportunities. In the summer of 2004, some people noticed an advertising campaign on Friendster for the Anchorman movie with Will Ferrell, in which his character, Ron Burgundy, was suggested as a friend to users.
When asked about this, Friendster tried to frame the situation as different from the fakester issue, saying that it was a new paradigm.
"What Friendster is doing with these movie-character profiles is actually a brand-new paradigm in media promotion," Friendster spokeswoman Lisa Kopp said. "We are working directly with a number of production houses and movie studio partners to create film-character profiles, or 'fan' profiles, that allow our users to share their enthusiasm about the film with their friends."
The company also claimed that it wasn't fake because it was done in partnership with the movie studio DreamWorks, which had the rights to the character:
"The issue here is actually about consumer protection," said Kopp. "We do, as a policy, strongly discourage fake profiles. A rogue user hiding behind a Jesus profile, for example, has the potential to abuse the service or users in many ways. In the case of the Anchorman characters, DreamWorks owns the rights to the characters and there is nothing fraudulent about it."
Of course, many of the fakester profiles didn't involve anyone else's intellectual property, so this excuse wouldn't apply to accounts like Pure Evil and Giant Squid.

Friendster struggled to grow, in part because its own success overwhelmed its technical infrastructure, and the site was quickly overtaken by MySpace and then Facebook. Since then, other sites, including Facebook, have struggled with the question of whether or not accounts should use real names. While many have argued that such policies discourage bad behavior, studies on this point have suggested otherwise.

Originally posted to the Trust & Safety Foundation website.

Read more here

