The history of scams: Part 2

Group-IB
6 min read · Mar 1, 2024

In part 1 of Group-IB’s series about the history of scams, we described how disjointed thefts of social media and gaming accounts gradually turned into an organized criminal market. In part 2, we look at how this activity evolved further, giving rise to two types of scams, an entire scammer subculture with gangsta rap songs, and wars between scam groups.

Problem

The arrangement in which scammers handled social engineering and phishing while organizers provided phishing links and monitored scammers’ activity via admin panels worked quite well. Scammers were paid for doing their part, and organizers resold stolen accounts without having to conduct phishing attacks themselves. The scheme had one major downside, however: instead of stealing money directly, it yielded inexpensive online accounts, which then had to be sold to turn a profit. Even after creating a phishing website that imitated a bank, payment system, or online shop, organizers were unable to lure victims to it. Scammers, meanwhile, numerous as they were, were skilled at nothing but social engineering.

Dating scams

A solution was found in 2018, when threat actors started to create forum threads looking for people willing to participate in dating scams. Phishing webpages now imitated private cinemas, and threat actors searched for victims on dating apps and social media posing as women looking to date. Social media accounts and photos of women were sold at low prices on the same forums and were readily used by scammers.

The organizers’ activities did not change much: now, instead of reselling stolen accounts, they had to quickly cash out stolen money or convert it into cryptocurrency. The rest remained the same: scammers received phishing links and acted as they saw fit. All participants began taking greater care of their own security: the legal consequences of fraud and theft are far more severe than those for stealing online accounts.

Dating scams led to changes in underground forums and existing groups, finally introducing a profitable modus operandi. One theft brought in about $100. The market for related services (e.g., provision of accounts, proxies, and virtual cards) started to grow, as did demand for software engineers capable of creating new phishing websites, whose services organizers could now afford to pay for.

During that time, terminology describing the roles in scam groups took shape: an organizer was called a “TS” (short for “topic starter”), scammers were “workers”, a fraud group as a whole was a “team”, and work chats were “confs”. A “support” role also emerged, which involved helping the organizer with administrative and technical tasks.

Skilled workers generated a lot of profit for organizers but were difficult to find. This shortage led various groups to compete for workers using a common strategy: paying more. Big teams could afford to pay workers 70–80% of stolen sums even in cases when the money was blocked after the victims’ complaints. Naturally, this attracted more people wanting to make money.

By 2019, numerous established dating-scam groups were operating on underground forums. Group members understood that they were clearly committing fraud, but arrests were rare. The security tools they relied on (VPNs and cryptocurrencies) proved quite effective.

Group organizers were still quite limited in the ways they could make money, relying only on social engineering, but they were now prepared to quickly adapt to any market. Putting together a group did not require much effort: buying phishing website templates, renting a server and a few domain names, and creating a forum thread was all it took. There were many scammers willing to participate, both newcomers and ex-members of existing groups.

Classifieds

The next step in the evolution of scams was a scheme called Classiscam, which involved classifieds platforms. This was a major leap forward for threat actors. Instead of static template-based phishing websites, they now used scripts that generated web pages based on ads posted on online marketplaces. Fraudsters had to carefully extract and transfer photos, descriptions, and prices of advertised goods and recreate the functionality of the interface to trick victims into believing they were using the original platform. These efforts paid off many times over.

Phishing that imitated Steam targeted niche gaming communities, and dating scams affected only dating app users; now, however, fraudsters’ opportunities grew tenfold if not a hundredfold. Each classifieds website covered numerous markets for various goods in many cities. A single phishing page template worked for any ad, from a 50-dollar pair of boots to a 500-dollar smartphone. This is when threat group members started calling themselves scammers.

Scam groups grew quickly in both size and number. Workers were limited only by the availability of SIM cards needed to register on marketplaces and talk to victims. The problem was solved by services that rented out phone numbers. The rest — photos of goods and ad templates — was easily available on the marketplaces themselves.

Since raising workers’ share to more than 80–90% was not viable, groups started improving working conditions to attract more scammers. For instance, threat actors began using Telegram bots, which replaced admin panels. Now, everything necessary for a worker’s job was available in a single messaging app. After creating an ad, the worker sent a link to the bot and immediately received a finished phishing page. After successfully tricking someone, the worker received a notification from the bot showing their share, which they could immediately request from the organizer.

The scheme was simple, effective, and transparent for everybody. It also required more and more people. The competition for workers started manifesting itself not only in market mechanisms, but also in other activities: organizers hired designers to style their forum threads, conducted competitions among workers, and ordered custom gangsta rap songs.

Restructuring

Over time, organizers faced a growth problem. Some teams had hundreds of workers who, despite the automation of thefts, required a great deal of administrative effort. Organizers were bogged down replacing blocked domains, handling payouts, settling minor conflicts, and providing consultations. Some groups solved the problem by creating administrative departments of support specialists who took on routine tasks.

Other groups restructured their activity into affiliate programs, in which several organizers — each managing a separate team — operated under a single head organizer. Every team had their own forum thread, chat, Telegram bots, and phishing domains. All teams used the same server with a script that generated phishing webpages.

In affiliate programs, successful workers became organizers. They knew the inner workings well and could build their own teams and onboard newcomers. The cut taken by workers-turned-organizers was not large: after paying 80% to the worker and 10–15% to the affiliate program owner, little was left. However, they profited from every single theft committed by their team.

The composition of affiliate programs was not announced, but it was evident from phishing domain and server infrastructure shared by several groups, as well as from messages on forums. Affiliates commented on each other’s threads and profiles and recommended new organizers as “reliable and trustworthy”.

Phishing mechanisms gradually evolved: feedback forms were integrated, and a new role, the “returner”, emerged. Returners posed as marketplace tech support employees and tried to swindle victims out of money a second time, claiming there was a problem with the purchased item and the victim had to apply for a “refund” (in reality, the same amount was charged to the victim’s card again).

Further evolution

In spring 2020, a new scheme emerged, which targeted sellers rather than buyers. It spread from classifieds to other popular online platforms, such as a travel agency and a transportation service.

The market for related services grew to include document and photo forgery, automated ad parsers, bulletproof hosting, and online shops selling accounts. Many scripts for generating phishing links emerged, which could be not only purchased but also rented. An organizer new to the job no longer had to deal with hosting services or install scripts: all they had to do was specify their domain and the card for receiving stolen funds in a dedicated Telegram bot.

Concurrently, a subculture developed around this activity. Groups created numerous niche memes ridiculing victims (called “mammoths” in scammer slang) and sneering at failed fraud attempts and arrested threat actors. The main platform for such content was Telegram. Dozens of scam groups created their own media infrastructure in which threat actors advertised services, recruited workers, clashed with rivals, and deanonymized each other.

Scammers did not stop there and moved on to expand globally, which we will discuss in the upcoming part 3 of this series.


Group-IB is a leading creator of cybersecurity technologies to investigate, prevent, and fight digital crime.