Those keeping a close watch on jihadist threats and recruiting on social media sites say Twitter is largely ignoring calls to block tweets from Islamic extremists, and note that the social networking company has done little to stop a surge in calls for attacks around the July Fourth holiday and during Ramadan.

Counterterrorism groups point to a late June call for attacks from the spokesman for the Islamic State, Muhammad al-Adnani, as the latest example of how the Islamic State has been able to weaponize Twitter.

"We need not look further than the attacks in Kuwait, Tunisia, France and Egypt over the past week for evidence of how concerning this threat really is," said Mark Wallace, the CEO of the Counter Extremism Project, a nonprofit formed to combat the threat from extremist ideologies.

Last fall, CEP launched a social media campaign, using the hashtag #CEPDigitaldisruption, to target and shut down hundreds of terrorists' Twitter and Facebook accounts. In late June, the group opened offices in Berlin and Brussels to expand its reach and influence, and translate tweets and threats emanating from Europe.

But while the group says it has tried to engage Twitter, the social media giant has been "dismissive to the point of dereliction," Wallace said. Twitter has done little even after the Islamic State and other extremist groups have used the social media platform to circulate beheading videos, or brutal slayings such as the burning alive of a Jordanian pilot back in January, or the slow drowning of prisoners in a cage in late June.

June 29th marked the one-year anniversary of the Islamic State of Iraq and Syria's declaration of a caliphate straddling those two countries. CEP says the U.S. and its international coalition of partners are "nowhere near beating ISIS" and Twitter is doing little to combat the terrorists' use of its site to put out nearly 90,000 tweets a day recruiting new jihadis to their cause.

"We've seen little progress from Twitter on confronting this dangerous and pervasive problem," Wallace told the Washington Examiner. "The first step is admitting you have a problem — in Twitter's case, a terrorist problem."

"We can't even get them to sit down and have a conversation about the steps we can take together to combat the issue," he continued. "They have stuck their heads in the sand while terrorists continue to weaponize the platform and reach places and individuals they couldn't reach before."

A particularly vexing problem for social media services is the fact that users whose accounts are shut down can easily morph into similar-sounding new accounts and resume abusing others or spreading terrorist messages and threats.

Critics argue that's not an excuse — that the company needs to devote more time and resources to tracking and canceling these accounts, and to make the service less anonymous. Anyone can sign up for Twitter, they point out, using an email account, even if the account is fake and untraceable.

For instance, the two men killed by a police officer as they shot at a Prophet Mohammed cartoon contest in Garland, Texas in early May left an extensive Twitter trail. One of the gunmen, Elton Simpson, regularly posted calls for violence on Twitter in the weeks before the attack, posts that were shared with avowed enemies of Pamela Geller, the organizer of the cartoon contest.

One of his Twitter contacts was Mohamed Abdullahi Hassan, a Somali-American Islamic extremist recruiter now in Somalia who uses the name Mujahi Miski for his Twitter accounts. Hassan specifically called for jihadi violence at the Texas event.

Twitter had shut down Hassan's account several times only to have it spring back up with a similar-sounding moniker. On April 23, 10 days before the Texas attack, Hassan linked to the cartoon event in Texas and praised the January shootings at the Charlie Hebdo newspaper in Paris and called on jihadists in the United States to follow suit.

"The brothers from Charlie Hebdo attack did their part," he wrote. "It's time for brothers in the #US to do their part."

After the attack, Twitter took down at least one of Hassan's accounts.

In late June, as the Supreme Court was readying its decision on gay marriage, an Islamic extremist in France using the account name "Younes le deserteur" tweeted: "I would really like to push a homosexual from the top of a building."

CEP immediately launched a campaign to get other users to report him so Twitter would shut down the account. It took six days and the efforts of several human rights groups and LGBT activists before Twitter did so.

A Twitter spokesman defended the site's record and methods of attempting to shut down accounts that promote hate and violence.

"We review all reported content against our rules, which prohibit unlawful use, violent threats and the promotion of terrorism," the spokesman said in a statement to the Examiner.

The service has a process whereby people can report instances of hate speech or abuse, but critics argue that it can take five, 10 or even 30 days for Twitter to remove content and shut down accounts that are clearly being used to spew hate and violence.

Complaints aren't ranked or prioritized and are simply placed into a massive docket that Twitter employees respond to in days and weeks, not hours.

If Twitter were to take aggressive action against terrorists using its site to spread hate and violence, that action would be extremely effective, critics argue.

For instance, after a British newspaper revealed that one of the most popular ISIS-affiliated accounts, "Shami Witness," belonged to an Indian businessman with 18,000 Twitter followers, he immediately stopped his online jihad and Indian authorities started hunting him down.

Twitter didn't respond to a follow-up question about specific reforms to its process CEP and others are seeking.

At a House Foreign Affairs subcommittee hearing earlier this year, CEP offered several practical steps Twitter could take, including granting "trusted reporting status" to governments and outside groups to expedite pressing complaints against certain accounts.

So far, Twitter's most vigorous defense of its counterterrorism policies came after a bipartisan group of leaders on the House Foreign Affairs Committee sent the CEO a letter in early March asking Twitter to develop a program to streamline the reporting of terrorist activity so it can "quickly block content and accounts that support terrorism."

"We commend Twitter's strong commitment to free speech in the United States and around the world," they wrote. "However, when Twitter accounts are used to support terrorism, such content does not deserve protection."

In a response to that letter, Twitter said it shares the lawmakers' concerns about threats of violence on the Internet by terrorist organizations and said, "like many, we have been horrified by atrocities perpetrated by such extremist groups."

The letter alternated between stressing how difficult it is to manage the problem on an open platform with 280 million users worldwide sending one billion tweets a day and defending the company's rules and its attempts to shut down extremist accounts.

Twitter noted that it has teams around the world providing "24-7 coverage" that receive reports from its users and "appropriate law enforcement agencies."

"We take this job seriously and we have expanded — and continue to expand — these teams," Twitter said without saying exactly how many people are engaged in the process.

"Twitter's open platform for expression must always seek to strike a balance between the legitimate needs of law enforcement, the enforcement of our own Twitter rules, as well as the ability of users to share freely their views — including views that many people may disagree with or find abhorrent," wrote Vijaya Gadde, Twitter's general counsel.