A lawsuit filed against Facebook Inc. on behalf of terrorism victims in Israel illustrates some of the complications of going to court to remedy violent radicalism. Fortunately, there’s a better way to address the problem of militants exploiting social media.
Lawyers for the victims sued Facebook in Manhattan federal court on Monday, seeking $1 billion in damages. They alleged that the U.S. company allowed Palestinian militants affiliated with Hamas, branded by the U.S. government as a terrorist organization, to use the online service to plan attacks that killed four Americans and wounded another in Israel, the West Bank, and Jerusalem. "Simply put, Hamas uses Facebook as a tool for engaging in terrorism," the lawyers wrote. The suit alleged that Hamas has used Facebook to share operational information and instructions for carrying out attacks.
Whatever one's position on the Middle East conflict, it's fair to say that the suit against Facebook faces some serious legal hurdles. First, there is the so-called safe harbor provision of the Communications Decency Act. That measure protects online service providers, such as Facebook, from legal liability related to what their users say. The suit against Facebook argues that the 1992 Anti-Terrorism Act, which prohibits material support to terrorist groups, ought to trump the communications decency law.
The First Amendment's protection of free speech could present a further legal obstacle. Hamas hastened to wrap itself in the amendment's mantle. Mushir al-Masri, a senior Hamas leader, said by phone that “suing Facebook clearly shows the American policy of fighting freedom of the press and expression” and is evidence of U.S. prejudice against the group and “its just cause.”
In light of these potential legal obstructions, the application of effective blocking technology might work better than litigation, said Gabriel Weimann, an expert on online terrorism at Haifa University. The focus should be on developing faster ways to detect problematic messages so they can be blocked immediately, before they go viral, Weimann said. “Facebook isn’t the only platform,” he added. “There are plenty of others. What will you do? Sue them all?”
Facebook declined to comment immediately on the lawsuit but said, "there is no place for content encouraging violence, direct threats, terrorism or hate speech on Facebook." The social networking site clearly knows how to do what Weimann recommends. In March, the company took down a page advocating a new Palestinian uprising against Israel because it made "direct calls for violence." Better algorithms applied more aggressively could accomplish far more than long-shot, billion-dollar lawsuits. Indeed, Facebook would be wise to explore a settlement of this case built on a foundation of improved blocking technology aimed at violent fanaticism of all sorts.