'A threat to health is being weaponised': inside the fight against online hate crime – The Guardian
In the winter of 2002, nine months before Hanif Qadir unpacked his bag at a terrorist training camp in Afghanistan, a group of men walked into the London MOT testing centre he owned with his brothers. They were collecting money for civilians caught up in the US invasion of Afghanistan; hundreds of children had been orphaned by indiscriminate bombing, the men claimed. Could he help? The appeal resonated with Qadir, who had lost his father when he was seven. He made a donation.
The men returned regularly. Each time, they asked for more money, before gradually changing the subject to Qadir's faith. Eventually they invited him to a meeting at a local house to discuss the war in Afghanistan more freely. "I felt they were sincere and genuine," Qadir recalls. At the meeting, the men encouraged Qadir to visit websites that claimed to show photographic evidence of violence against Afghan civilians by western troops.
Qadir browsed hundreds of distressing images, among them scores of orphans, each accompanied by extended captions that described the way in which the child's family had been killed. One girl's story has remained with him. The website claimed she had lost 21 members of her family to a stray US missile. The caption explained it had taken locals three days to scrape their remains from the walls of the girl's home. The more he saw, the closer Qadir became to the men who were, unbeknown to him, recruiters for al-Qaida.
If someone calls for genocide against Muslims, they've essentially tattooed a swastika to their forehead online
Qadir grew up in Thornaby-on-Tees, a small town in North Yorkshire. After his father died, he had disengaged from school, leaving at 14 and moving to London. After a few odd jobs, he founded a business with his brothers, buying, repairing and selling cars. By the early 2000s, the business was profitable enough that he was able to donate generously to charitable community causes, a reputation that, he believes, led the recruiters to his door.
The suggestion that Qadir travel to Afghanistan was seeded gently. "When a person is radicalised they become suggestible," he tells me. "We discussed that, in order to prevent more loss of life, we needed to be prepared to fight." On 2 December 2002, he flew to Islamabad in Pakistan. A few days later, he crossed the border into Afghanistan.
Soon after he arrived at a training camp, Qadir saw a man measuring up children who lived there. "I thought they were being tailored for new clothes," he recalls. Then he heard one of the leaders telling the children they would soon be reunited with their dead parents. They were being fitted for suicide vests. "I felt sick and angry," he says. "I wanted to walk away."
But in the middle of a desert compound patrolled by armed guards, any attempt to defect could be fatal. Qadir was trapped. "I knew that if I asked to leave things would end badly." He had to think carefully.
***
In 2002, when Qadir was being radicalised, the internet was not yet ubiquitous. There was no Twitter, no Facebook; websites looking to groom people into supporting extremist causes were obscure. Nearly two decades later, the digital landscape has been transformed. As the All-Party Parliamentary Group on Hate Crime wrote last year, the internet has become a key breeding ground for extremism and hate speech, emboldened by the increasing ease of dissemination, anonymity and, thanks to outdated legislation, a lack of meaningful consequences.
Perpetrators of terrorist attacks now routinely leave online statements or manifestos to justify their actions, hoping their words might encourage others. The 28-year-old gunman who killed 51 mosque-goers in Christchurch, New Zealand, last year posted a 73-page white nationalist rant to the fringe web forum 8chan and livestreamed the attack on Facebook.
But now, just as Facebook and Twitter have become the prodigious muck-spreaders of our age, a handful of clandestine startups are using technology to stem the flow. Moonshot, whose office is at a secret location in London, is, at five years old, a veteran in this emerging industry. Its premises have the feel of a typical Silicon Valley operation: distressed floorboards, glass-fronted offices, beanbags by an open fireplace, exposed brickwork, a snug for breathers. There are a few clues that the company's business, using technology to disrupt violent extremism, is different from that of the fitness app developers, social media influencers and virtual reality speculators with whom it shares an aesthetic. The posters are not vintage prints but disquieting infographics revealing, for example, that after 22 people were shot dead in an El Paso Walmart last August, there was an 82% rise in the Google search term "how to murder Mexicans". There is also a bomb-proof door.
Cofounder Vidhya Ramalingam set up the EU's first intergovernmental research initiative to investigate far-right terrorism in the aftermath of the 2011 murder of 77 people by Anders Breivik in Norway. She describes Moonshot's work as "experimental programming". The company employs 50 people, and uses a mixture of software and human judgment to identify individuals on the internet who, like Qadir, appear interested in extremist propaganda. They then attempt to serve them counter-messaging.
The technology uses a database of indicators of risk. An individual is awarded risk points according to their online behaviour. You score one point for showing curiosity about the Ku Klux Klan or National Socialist Movement. Activity that indicates sympathy with a violent movement or ideology (eg Googling "white pride worldwide") earns three points, while showing a desire to join, send money to, or commit acts on behalf of a violent extremist group or individual earns six.
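To make the tiering concrete, here is a minimal sketch in Python of what an additive scoring scheme like the one described above might look like. The three weights (one point for curiosity, three for sympathy, six for intent) come from the article; the indicator names, the mapping and the function itself are illustrative assumptions, not Moonshot's actual system.

```python
# Hypothetical illustration of tiered, additive risk scoring.
# Weights follow the article: curiosity = 1, sympathy = 3, intent = 6.
CURIOSITY, SYMPATHY, INTENT = 1, 3, 6

# Assumed mapping from observed online behaviours to tier weights.
INDICATOR_WEIGHTS = {
    "searched_group_name": CURIOSITY,       # e.g. looking up the KKK
    "searched_movement_slogan": SYMPATHY,   # e.g. "white pride worldwide"
    "searched_how_to_join": INTENT,         # e.g. trying to join a group
    "searched_how_to_donate": INTENT,       # e.g. trying to send money
}

def risk_score(behaviours: list[str]) -> int:
    """Sum the weights of every recognised indicator seen for one individual."""
    return sum(INDICATOR_WEIGHTS.get(b, 0) for b in behaviours)

print(risk_score(["searched_group_name", "searched_movement_slogan"]))  # 4
```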
Home Office initiatives such as Prevent have traditionally focused on training teachers and other leaders to identify people likely to be drawn to violent extremism within their communities, but these methods risk introducing discriminatory practices. "In France, for example, there were posters telling people their sons might be at risk of violent extremism if they grow a beard, start speaking Arabic or stop eating baguettes," explains Ross Frenett, Moonshot's cofounder. "That is obvious bullshit."
By contrast, Frenett says, if someone makes a post glorifying Hitler, or calls for genocide against Muslims, there is a high degree of certainty that they fall into a high-risk category. "They've essentially tattooed a swastika to their forehead in the online space," he says. "So our level of confidence when identifying individuals who are vulnerable to radicalisation is way higher online than it could ever be offline. And it sidesteps some of the discriminatory, stigmatising practices we've seen in an offline setting."
Moonshot, founded in September 2015, is a for-profit company that earns its income from government contracts in the UK, US, Canada, Australia and across western Europe. It does not limit its work to any particular strain of radicalism; in addition to the far right and jihadism, Moonshot's work covers everything from Buddhist extremism in south Asia to Hindu nationalism and incel terrorism in Canada.
The skill is in finding out what raised a person's interest in extremist ideology. You can't redirect them until you do
At the broadest level, Moonshot runs what it refers to as "redirection material": advertising that is designed to get in front of extremist material in Google's search results. Google has granted Moonshot dispensation to advertise against banned search terms such as "join Isis". If a user clicks on one of Moonshot's camouflaged results, they are taken to, for example, a mental health website with relevant downloadable guides and a chat option. (These sites are run by partnered mental health organisations and groups that have experience dealing with gang violence. As Frenett puts it, they have "appropriate risk protocols, and connections with law enforcement, should they be required".) So long as the search terms are carefully calibrated (advertising against "white power" is useless, Frenett explains, as you end up competing with power-tool companies), this can be an effective first contact.
Success is measured in much the same way as for any company seeking to advertise on Google: via click conversions. ("We pay for advertising just like any commercial advertiser does," Ramalingam says. "We don't get special rates. I wish we had a better story on that front.") A key metric is "search impression share", which records how often the at-risk audience actually saw the ad. "We've had campaigns that have run with only 50%, and that's not good enough," Ramalingam says. "So we work hard to get that up to 98% where possible." For this reason, as well as mental health practitioners and ex-police officers, Moonshot also employs marketers. "Most of our work is analytics, marketing and social work," Frenett says. "It just happens to be marketing, analytics and social work related to terrorism."
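In Google Ads terms, search impression share is the number of impressions an ad actually received divided by the number it was eligible to receive. The snippet below is a hedged sketch of that calculation; the 50% and 98% figures above are Moonshot's, while the example numbers here are invented.

```python
def search_impression_share(impressions_served: int, eligible_impressions: int) -> float:
    """Share of eligible searches on which the counter-messaging ad was actually shown."""
    return impressions_served / eligible_impressions if eligible_impressions else 0.0

# Invented example: shown on 4,900 of 10,000 eligible searches -> 49%,
# well short of the ~98% the team says it works towards.
print(f"{search_impression_share(4_900, 10_000):.0%}")
```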
Occasionally the company will identify an individual who is too high risk for its interventions. "That's where, depending on the country we're working in, we refer a user to the police," Frenett says. In Australia, for example, Moonshot identified someone at the top of a network of around 200 at-risk individuals who was considered so risky "we couldn't intervene". A few days later, the local police arrested the man, who was subsequently convicted on terror charges.
There are deeper kinds of intervention. One of Moonshot's advertisements for, say, bomb manuals will take the searcher to a WhatsApp chat manned by a specialist trained in deradicalisation techniques. The company may also identify someone on a particular social media platform openly espousing pro-extremist or pro-terrorist views. Then a trained social worker, typically from a charitable partner organisation, contacts that individual via Twitter direct message or Facebook Messenger. Choosing the right person to make this kind of contact, which may be perceived as invasive, is essential. In many cases, the right person is a former extremist: someone like Hanif Qadir.
***
When Qadir realised the children in the Afghan training camp were being measured for suicide vests, his first instinct was to exact revenge on the people who had manipulated him. "But I only had a knife, no gun," he says. "And I knew that I couldn't tell anyone I wanted out."
He stepped outside the gate of the camp to consider his options. There he spotted the driver of a pick-up truck with whom he had talked a few times. The men did not share a first language, but Qadir gambled. He pulled out 50 and waved it at the man, asking if he could hitch a ride to Turkham, on the border between Pakistan and Afghanistan. The driver nodded. Qadir climbed into the passenger seat. "I didn't even collect my bag," he recalls.
Qadir says he cried during the flight to London. "I kept asking myself: 'What the hell have I done?'" When he arrived home, he and his brothers attempted to find his recruiters, but they had disappeared; the word was that they had moved to Manchester. Qadir decided he no longer wanted to run the car business and convinced his brothers to sell up. "I just wanted to stay at home with my children."
After a period of recuperation, he and his brothers opened a gym in a disused nightclub, which became a place where local youths, many of them young Muslims, would congregate. "We'd talk," he says. "I'd ask them questions about Afghanistan. I saw a lot of anger and questioning. It was clear to me that all it would take is for one person to manipulate them emotionally and they would get straight on a plane to fight. Or maybe they would do something here."
Eager to communicate this to someone in a position of power, Qadir started attending local council meetings. A police inspector, Ian Larnder, took him for a coffee, hoping to better understand why this former mechanic seemed so passionate about the subject. "Until then, I had told nobody about what had happened," Qadir recalls. "Ian was the first person I opened up to." A week later, Larnder was appointed to the police's national community tension team. He took Qadir with him to speak to forces around the country about his experiences.
Today, with a number of other former extremists, Qadir works with Moonshot, where he provides training for online interventions. "The skill is in finding out what has raised a person's interest in extremist ideology," he explains. "You can't redirect a person until you understand this. It's no good asking something so broad as: 'What do you think about what is happening in India?' It has to be specific and personable. So instead you might say: 'Is it permissible to seek revenge for the loss of a loved one?'"
This sort of broad line of questioning, and the fact that an anonymous dialogue might tail off without scope for any follow-up, can seem frustratingly opaque for anyone trying to measure Moonshot's success. It's a criticism the company is used to fielding. "The struggle with preventive work is that, very often, it's unscientific and we have to ask people to take it on trust," Frenett says. "It's easy for a military contractor to come in and say, 'I installed a big, high fence and a man with a gun and that reduced terrorism.' Likewise, the army can come along and state: 'We killed 200 Taliban this week.'
"But it's much harder to say: 'OK, we invested $1m here and we prevented this much terrorism.' Our long-term aim is to start to change that calculation. Then we'll be able to say: 'If one dollar in every 100 spent on military hardware went towards targeted, community-focused preventive work, it would be better value, and probably better for the world.'"
***
In the corner of a chilly room at the end of a corridor in Cardiff University's Glamorgan Building, a flood of racial slurs, misogyny, antisemitism and far-right slogans flows across a PC screen. "Imagine you had a crystal ball in which you could watch someone perpetrating every hate crime as it occurred somewhere out there, on the streets," explains Matthew Williams, director of HateLab. "That's what you're looking at here, except the hate is happening online."
While Moonshot and Qadir intervene with individuals who are vulnerable to extremism, HateLab's aim is to provide a more accurate picture of hate speech across the internet. It is, Williams says, the first platform to use AI to detect online hate speech in real time and at scale.
Moonshot's ads for, say, bomb manuals take the searcher to a WhatsApp chat manned by a deradicalisation specialist
Online hatred is so commonplace that the majority of incidents go unreported. According to British government data, 1,605 hate crimes occurred online between 2017 and 2018, a 40% increase on the previous year. But the Home Office admits this figure is probably a gross underestimate.
"Unlike the police, we don't have to wait for a victim to file a report," Williams says. "The program reflects a true indication of the prevalence of online hatred."
It offers a granular indication, too. Williams specifies a date range, then picks from a filter of potential target groups: Jews, homosexuals, women, and so on (misogyny is by far the most prevalent form of hate speech on Twitter, he says). He selects "anti-Muslim", and a heat map of the UK lights up in red blotches showing geographical hotspots. Elsewhere, the dashboard reports the average number of hateful posts per minute and the peak times of day (hate speech, the group has found, is most prevalent during the daily commute, when people read and react to the day's news).
A word cloud indicates the most-used anti-Muslim slurs, while a spiderweb visualises a network of perpetrators, identifying the "thought leaders" who are generating the most retweets, and how they are linked via online accounts. "HateLab gives situational awareness to hate speech on Twitter at any given time," Williams says.
Early last month, HateLab identified three forms of coronavirus-related hate speech: anti-Chinese or anti-Asian; antisemitic, focused on conspiracy theories; and Islamophobic, focused on accusations of profiteering. "What we are seeing is a threat to health being weaponised to justify targeting minority groups, no matter how illogical the connections may seem," Williams explains.
(Moonshot has monitored similar rises in hate speech targeting Chinese nationals. The hashtag #ChinaLiedPeopleDied was tweeted 65,895 times in March, while #coronavirustruth, implying that the pandemic is a hoax, was used 77,548 times. The company also picked up tweets showing old videos of Muslim men leaving mosques, accompanied by text claiming the footage was filmed during quarantine, a seemingly deliberate attempt to create anti-Muslim sentiment.)
Williams, author of a forthcoming book titled The Science Of Hate, is a professor of criminology at Cardiff, but his interest in the field is not purely academic. In 1998, he travelled to London with friends to celebrate a birthday. At some point during the evening, he stepped out of the gay bar in which the group was drinking. Three young men approached. One asked if Williams had a light. As he handed over his Zippo, the man punched him in the face. Williams returned to his friends but said nothing, fearing that they would want to retaliate. Eventually, one of them noticed blood on his teeth and urged him to report the attack. "I said no," Williams recalls. "At that time my parents didn't know I was gay. My siblings didn't know, and neither did most people from my town. I didn't want to come out to the police."
But Williams returned to Wales a changed person. "Any attack on your identity has a profoundly destabilising effect," he says. "I became angry and depressed. I modified my behaviour. I stopped holding my boyfriend's hand. I still won't show affection in public." He was not alone in failing to report his attackers; based on the combined 2015/16 to 2017/18 Crime Survey for England and Wales, only 53% of hate crime incidents came to the attention of the police. "People are fearful of secondary victimisation," Williams says.
As domestic internet use became more commonplace, Williams noticed the hate speech he encountered on the streets reflected online. The difference was that it was there for everyone to witness. Fellow academics were initially sceptical of his preoccupation with online behaviour, but by 2011 "everyone knew hate speech was the key problem of the internet". That year, Williams received a lottery grant of more than half a million pounds to accelerate his research.
Every social media platform represents a torrent of information too deep and wide to sift by hand. Williams and his team began by taking a random sample of 4,000 tweets from a dataset of 200,000. The trove was then handed to four police officers, trained to recognise racial tensions, who each evaluated whether every tweet was discriminatory. If three of the four officers concurred, the tweet was classified as hate speech. Over a four-week period, the officers identified around 600 tweets they deemed discriminatory, data that formed the gold standard against which the AI would test whether a message was malignant or benign.
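As a rough illustration of that labelling step, the sketch below applies the three-of-four agreement rule the article describes to build a gold-standard set; the data structures, function names and example tweets are illustrative assumptions, not HateLab's actual pipeline.

```python
# Hypothetical sketch: build gold-standard labels from four annotators' votes,
# counting a tweet as hate speech only if at least three of the four agree.
from typing import Dict, List

def gold_standard(annotations: Dict[str, List[bool]], min_agree: int = 3) -> Dict[str, bool]:
    """Map tweet id -> True when at least `min_agree` annotators flagged it."""
    return {tweet_id: sum(votes) >= min_agree for tweet_id, votes in annotations.items()}

sample = {
    "tweet_001": [True, True, True, False],    # 3 of 4 agree -> hate speech
    "tweet_002": [True, False, False, False],  # 1 of 4 -> benign
}
print(gold_standard(sample))  # {'tweet_001': True, 'tweet_002': False}
```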
You have to engage and create conversations, but direct them positively: allow for grievances to be heard and discussed
On the afternoon of 22 May 2013, when fusilier Lee Rigby was killed by two Islamist converts in Woolwich, London, the software had its first live test. Within 60 minutes of the attack, Williams and his team began harvesting tweets that used the keyword "Woolwich". As the software sifted the data, the team was able to examine the drivers and inhibitors of hate speech, and identify accounts spreading anti-Muslim rhetoric. The team found that hate speech peaked for 24-48 hours and then rapidly fell, while the baseline of online hate remained elevated for several months. Astonishingly, this was one of the first times a link between terror attacks and online hate speech had been demonstrated. And importantly, an increase in localised hate speech both anticipated the attack and, in the aftermath, shadowed it, showing that it might be possible to predict real-world attacks.
The data fascinated social scientists, but Williams believed it was more than interesting: it could have a practical application in helping counter these narratives. In 2017, he began a pilot scheme with the National Online Hate Crime Hub, which was set up to coordinate the reporting of online hate crime. The hub now uses the HateLab dashboard to gauge ebbs and flows in the targeting of particular groups, as well as nuances in local tensions. This information can then inform operational decisions, helping direct frontline police work.
There are obvious privacy concerns, and HateLab must comply with data protection regulations. The platform depends on the willingness of Twitter to make its data available to third-party applications. (Facebook closed down open access in 2018, so independent organisations cannot screen its posts.) Twitter shares data on the proviso that HateLab does not identify individual accounts via its dashboard. "In that sense, we can only provide the 10,000ft view," Williams says. The dashboard can highlight patterns, target groups and geographical hotspots, but connecting with individuals is outside its remit.
Meanwhile, Qadir and the other former extremists working alongside Moonshot recognise the power that hate speech can have, and know firsthand that a conversation can steer someone down a more positive path. "You can only change people if you can reach them via conversation," he tells me. "Violent extremists do this very cleverly, and evidence shows that it works for them, so I based all my programmes on this concept. You have to engage and create conversations, but direct them positively: allow for grievances to be heard and discussed."
Since Moonshot was founded, there has been a radical shift in the perception of technology's role when it comes to extremist terrorism. "Five years ago, there were still people inside the government who thought tech was for the kids," Frenett says. "There was a sense that it was almost amusing that terrorists were on the internet. You don't get that any more. Likewise, five years ago there were some great organisations doing great work on the violent far right, but again it was almost seen as niche. That's no longer the case."