Dating App Bots: A New Dystopia

Worldwide restrictions on movement and overall lack of social contact have prompted many to flock to dating apps.


Image source: New York Times
The Unoriginal Opportunist

A couple of weeks ago I received a match on Tinder: within a minute of matching, she sent me a username and the words “add me”. No mention of the platform in question, no greeting, no manners – just a directive.

“Are you a bot?” I asked her.
“No,” she replied.
“Prove it,” I responded.
“Don’t need to be so-“

But the response was incomplete. We both expected a final word, but there wasn’t one to be found.

I looked for the unmatch button, navigating through the refreshed UI (why is it a shield icon, anyway?) but didn’t need to.

To our mutual benefit, she could unmatch faster than she could formulate a witty response.

Catfish casting fishing nets

It’s not exactly new. Women have been using Tinder to build their Instagram followings since 2014 – a full year before official integration between the two apps arrived in 2015. Once a match has become an Instagram follower, the woman will ghost or promptly unmatch.

Such is the current situation: desperate and unwitting dating app users, converted into numbers. Numbers that feed a growing, artificial metric of individual value for a sea of talentless, aspiring ‘influencers’ (punctuation barely contains my disdain).

Back when I worked in the insurance sector in Hong Kong, saleswomen were known to use Tinder and WeChat’s proximity feature to expand their pool of potential clients. It’s an ongoing practice, and one that isn’t likely to stop anytime soon (much like whaling in Japan).

Men who took the bait would realise it only too late – hustled by an attractive young catfish discussing insurance premiums rather than a potential future together.

Misdirection will always be a tool at the forefront of sales and advertising: mislead the gullible, obtain currency, discard. Rinse and repeat.

There’s a reason why advertising executives are widely seen as the least trusted profession. (The 2014 article about converting Tinder matches to Instagram followers? Published under the Advertising category.)


Dating App Bots: How they’re used today

However, this interaction sparked my curiosity in a new direction: in what capacity are bots being used on dating apps today? With the prevalence of artificial intelligence, are we seeing more sophisticated bots in this space (presumably deployed to the same ends)?

Easier than you think

Writing chatbot scripts is easier and more accessible than ever before. The increased use of AI also means that chatbots are increasingly trained to ‘think like humans’, able to simulate human behaviour ever more convincingly.
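To illustrate just how low the bar is, here’s a minimal sketch of a scripted dating-app bot in Python – every pattern, reply and handle below is invented for illustration, and the real bots on these platforms are presumably more elaborate:

```python
import random
import re

# A minimal scripted "dating bot". All patterns, replies and the
# Instagram handle are invented for illustration.
RULES = [
    (re.compile(r"\b(hi|hey|hello)\b", re.I),
     ["Hey you ;)", "Hi! How's your day going?"]),
    (re.compile(r"\bbot\b", re.I),
     ["Haha no, I'm real! Why do you ask?"]),
    (re.compile(r"\b(pic|photo|insta)\b", re.I),
     ["I barely use this app – add me: example_handle"]),
]
DEFAULT_REPLIES = ["Aww, tell me more!", "You're funny :)"]

def reply(message: str) -> str:
    """Return a canned reply for the first matching rule."""
    for pattern, responses in RULES:
        if pattern.search(message):
            return random.choice(responses)
    return random.choice(DEFAULT_REPLIES)
```

A couple of dozen lines of keyword matching is enough to reproduce the “add me” bot from earlier – no AI required.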

What are dating app bots used for, anyway?
  1. Romance Scams
  2. ‘Fembot’ Armies
  3. Phishing, Adware & Malware
  4. Automating “The Game”

1. Romance Scams

Let’s begin with the macro view: online romance scams. Incredibly lucrative, the practice shows no signs of slowing down. The ACCC reports that these scams cost Australians more than $28.6 million in 2019, with an average loss of more than $19,000.

Image credit: dimitrisvetsikas1969 (Pixabay)
For Your Money

In the past, scammers commonly acted in groups, creating fake profiles for coordinated ‘around-the-clock’ operations and taking turns to message victims from the same fake profile at any hour of the day.

These days, it’s far more likely that scam rings are leveraging bots (partially, if not mostly) throughout the process. Less manpower and fewer people involved means a larger cut of the earnings for everyone left.

Complicated stories are woven to explain why the scammer can’t meet in person – and why the victim needs to send money, say, for a plane ticket to visit. In some cases, victims are even pressured into opening bank accounts on the scammer’s behalf, which are then filled with funds.

A more common practice in the last few years is to request gift cards, which are much harder to trace.

For Your Data

Phishing: if they can’t obtain currency from you directly, they can still extract Personally Identifiable Information. This can either be sold on the dark web or utilised for identity theft (more on phishing later).

For Laundering Money

A late 2020 report revealed an Australian victim being used for money laundering, and the ACCC has warned that such victims can face jail time. While the complexity of this operation points to human orchestration rather than a scripted bot’s doing, these are the workings of organised crime – not the average small-time scammer.

On a related note, (bot-assisted) ad fraud has been one of organised crime’s key earners for years. As bots become increasingly complex (and accessible), it’s well within the realm of possibility for bots to play a role in money laundering.


2. ‘Fembot’ Armies

Image source: Gizmodo (‘How Ashley Madison Hid Its Fembot Con From Users And Investigators’)

Extramarital affair website Ashley Madison is no stranger to accusations of filling its site with fake female profiles. While the company repeatedly denied these allegations, it eventually fell victim to a massive cyber-attack in July 2015 by a group known as “The Impact Team”.

The Impact Team’s demands were clear. Shut down Ashley Madison and sister site “Established Men”, or face the consequences. They didn’t budge.

Wrong move: the full database (over 60GB in size) was released on the dark web in mid-August across two leaks, distributed via BitTorrent.

Image taken from Newitz’s Gizmodo article ‘Ashley Madison Code Shows More Women, and More Bots’

Analysis of the leaked data culminated in a report revealing that more than 70,000 female bots had been created to send male users millions of fake messages.

Existing to feed an ‘illusion of a vast playland of available women’, these bots would chat with men and encourage them to buy more credits to continue the conversation:

…20 million men out of 31 million received bot mail, and about 11 million of them were chatted up by an automated “engager”…these robo-encounters could come roughly every few minutes.

Annalee Newitz, ‘Ashley Madison Code Shows More Women, and More Bots’ (Gizmodo)

Verbatim examples of the bot scripts in the article seem anything but sophisticated – yet sophisticated enough to pass the Turing Test, according to Claire Brownell.

The second leak revealed email exchanges evidencing Ashley Madison employees paying companies to create fake women’s profiles.

Closer inspection revealed a special bot service titled “RunChatBotXmppGuarentee.service.php” would run for customers who paid $250 for a “guaranteed affair”:

…this bot would chat up the man, urge him to pay credits, and then pass him along to what’s called an “affiliate.” Likely the affiliate is a third party that provides a real person for the man to chat with. It might also be connecting him to an escort service.

Annalee Newitz, ‘Ashley Madison Code Shows More Women, and More Bots’ (Gizmodo)

The ‘fembot army’ is still alive and well: able to speak 31 languages, and servicing 53 countries.

While Ashley Madison’s IPO was delayed in 2015 as a result of the breach, the company still operates, seemingly without consequence. The same goes for sister sites Established Men and CougarLife, all under the rebranded parent company “ruby”.

Map taken from Newitz’s Gizmodo article, via Jake Perkowski
3. Phishing, Adware & Malware

Alicia, Haley and Cherry. No, these aren’t names of escorts.

They’re names of bots identified on Tinder by security firm BitDefender back in 2014. These bots would engage in chats with users before sending dubious links sporting a ‘tinderverified.com’ domain.

UK users would be sent to fraudulent surveys (phishing) and competitions for big brands (Asda and Tesco), while US users would be brought to a download page (adware) for mobile game Castle Clash.

Survey links are particularly effective because they require even less work from the scammer: if you’re volunteering your name, email, address, DOB and card details yourself, the hard work is already done.

Image credit: Lifewire (‘Could Your Tinder Match Be a Scam Bot?’)

You’d imagine that most men would be skeptical of links sent by attractive women – but according to research by Inbar Raz of PerimeterX, the click-through rate is an astounding 70%.

While a “.com” domain may create a sense of legitimacy, it’s all too easy to buy one for the sole purpose of redirecting visitors to a phishing page. These pages then harvest passwords, credit card details and other personal information from whatever victims type into their fake forms.

Malware can also be delivered by way of links. Such links are difficult for humans to identify: it usually takes a remote site-scanning tool to flag malware threats or infected files. Otherwise, it’s down to how up-to-date your antivirus definitions are.
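As a rough idea of what automated link-checking involves, here’s a toy heuristic in Python that flags lookalike domains trading on a brand name. The “trusted” list is an assumption for illustration only; real scanners also handle redirects, URL shorteners and punycode tricks:

```python
from urllib.parse import urlparse

# Toy lookalike-domain check. The trusted list below is an
# assumption for illustration; real scanners do far more than this.
TRUSTED_DOMAINS = {"tinder.com", "gotinder.com"}
BRAND = "tinder"

def registered_domain(url: str) -> str:
    """Crudely take the last two labels of the hostname."""
    host = (urlparse(url).hostname or "").lower()
    return ".".join(host.split(".")[-2:])

def looks_suspicious(url: str) -> bool:
    """Flag domains that trade on the brand name without being it."""
    domain = registered_domain(url)
    return BRAND in domain and domain not in TRUSTED_DOMAINS
```

By this measure, ‘tinderverified.com’ contains the brand name but is not tinder.com – exactly the trick BitDefender flagged.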

4. Automating “The Game”

Programmers are now gamifying the online dating experience: automating not only swiping based on your preferences, but also sliding into DMs and conversing on your behalf.

Robert Winters is one such opportunistic programmer, showing us just how easy it is to customise a bot to find your next potential relationship.

Image Credit: Mashable.com

He did this using a program called Tinderbox (later renamed Bernie A.I.). Bernie the bot is just one of many, available open-source and completely free on GitHub.

But before Winters there was Jeffery Li – a data scientist at DoorDash – who began his Tinder automation project with a specific goal:

“The seed of it came from saying ‘Hey, I want to improve my dating life, however, how can I do that in the most lazy way possible?’”

Yes, gamifying romance is officially a thing. And yes, nerds absolutely have the upper hand…at least where automation is concerned.

Li began by analysing the way Tinder’s algorithm worked. For the AI to build a profile of his dating preferences, it needed data from profiles he had swiped right on.

As he didn’t swipe right often, there was a shortage of data. He supplemented it by feeding the model images of women he found attractive (taken from Google), so that it could learn his preferences.
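The ‘learn my preferences’ step can be sketched with a toy perceptron – standing in for the image classifier a project like Li’s would actually use, with made-up feature vectors in place of real photos:

```python
# Toy stand-in for the preference model: a perceptron trained on
# made-up feature vectors (a real project would run a CNN on photos).
def train(samples, epochs=20, lr=0.1):
    """samples: list of (features, label) pairs, label 1 = 'like'."""
    weights = [0.0] * len(samples[0][0])
    bias = 0.0
    for _ in range(epochs):
        for features, label in samples:
            score = sum(w * x for w, x in zip(weights, features)) + bias
            error = label - (1 if score > 0 else 0)
            weights = [w + lr * error * x for w, x in zip(weights, features)]
            bias += lr * error
    return weights, bias

def would_swipe_right(weights, bias, features) -> bool:
    return sum(w * x for w, x in zip(weights, features)) + bias > 0
```

Once trained, the model scores each new profile and the swipe decision is reduced to a threshold check.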

The algorithm quickly became pickier than its creator:

“It would actually reject some of the profiles that I actually thought were okay,”

Really, a 32% probability of disliking Scarlett Johansson? Bad bot!
Image Source: Jeffery Li

The second step was to write an automated opening message that the bot could tailor to each individual match. The bot handled the screening process for Li: set to swipe 100 times per day, it estimated he would like around 20 of them.
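The screening loop itself needs little more than a daily cap and the model’s verdict. The `client` object below is purely hypothetical – a stand-in for whatever (unofficial, ban-risking) API wrapper such a project would rely on:

```python
# Sketch of the daily screening loop. `client` is a hypothetical
# wrapper exposing next_profile(), like(p) and pass_(p) – any real
# equivalent would be unofficial and against the platform's terms.
def auto_swipe(client, would_like, daily_cap=100):
    """Swipe up to `daily_cap` profiles; return how many were liked."""
    liked = 0
    for _ in range(daily_cap):
        profile = client.next_profile()
        if profile is None:
            break
        if would_like(profile):
            client.like(profile)
            liked += 1
        else:
            client.pass_(profile)
    return liked
```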

In the end, however, Li found no success. He never met anyone serious through the bot, attributed his poor match rate to not having “a good profile”, and eventually stopped using it.

Winters, on the other hand, took it a step further: programming the bot to do the talking for him. By mapping conversation trees, he created charts of dialogue that would branch in different directions depending on the match’s responses.
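A conversation tree of this kind can be sketched as nested branches keyed on keywords in the match’s reply – the dialogue below is invented, and Winters’ real charts were far larger:

```python
# Sketch of a branching conversation tree. The dialogue is invented;
# each branch is chosen by a keyword found in the match's reply.
TREE = {
    "text": "Hey! Coffee or cocktails person?",
    "branches": {
        "coffee": {"text": "Nice – know any good cafes around here?",
                   "branches": {}},
        "cocktail": {"text": "Good taste! What's your go-to drink?",
                     "branches": {}},
    },
    "fallback": {"text": "Haha fair enough. How's your week going?",
                 "branches": {}},
}

def next_line(node, match_reply: str):
    """Follow the branch whose keyword appears in the reply."""
    for keyword, branch in node["branches"].items():
        if keyword in match_reply.lower():
            return branch
    return node.get("fallback")
```

Scale this up to a few hundred nodes and a bot can hold a plausible opening conversation with no human at the keyboard.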

“At one point, the bot was having maybe 200 conversations at a time…I think Tinder knew this and they banned me, of course, from the platform.”

His efforts, predictably, got him banned from Tinder. Developers are rarely pleased with the introduction of third-party software.


Clearly the Age of the Nerd is only just beginning, and a background in programming may actually net you more matches than approaching women IRL. As an IT postgrad, I’m still not quite sure how I feel about this.

Published by Tech Neck Nick

I'm a cybersecurity major postgrad student from Sydney, Australia. Support my fight against Writer's Block.
