Is my site hacked with spammy FIFA coin links?

Topic summary

Issue Identified:
Shopify store owners discovered spam URLs appearing in Google Search Console, primarily related to FIFA coin sites they didn’t create. These URLs exploit search query parameters and vendor fields to generate spammy backlinks.

Root Cause:
The spam results from bots manipulating search terms and vendor query strings (e.g., /collections/vendors?q=fifa-coins). Shopify support initially characterized these as harmless “spam backlinks” from external search queries, not actual security breaches.

Community Solutions:
User Jizo_Inagaki provided code snippets to add noindex meta tags:

  • For vendor pages: Check if path is /collections/vendors with zero products
  • For search pages: Check if path is /search/ with zero results

Both solutions insert <meta name="robots" content="noindex"> in theme.liquid after the <head> tag.

Official Response:
Months later, Shopify deployed a platform-wide fix generating 404 pages for unknown vendors based on query strings, blocking Google indexing. However, the rollout is gradual.

Ongoing Concerns:
Some users report recurring spam pages despite fixes. Questions remain about whether to use Google’s disavow tool (designed for external backlinks, not on-site spam) and whether the solutions fully prevent future attacks.

Summarized with AI on November 16. AI used: claude-sonnet-4-5-20250929.

Hi Jen,

this makes it trickier, and you may need a web developer for it. The search pages are blocked by Shopify's robots.txt, so Google cannot read any noindex tag you add to them. And since search is an important function, Shopify can't simply return a 404 error on those pages.

What I’d do:

  1. Request removal of all URLs starting with https://yourpage.com/search?q= - first make sure that you don't have any important search-query URLs indexed. You can check by typing site:https://yourpage.com/search?q= into Google. The removal will expire after a few weeks or months, which is why the further steps are needed.
  2. Where pages are indexed but also blocked by robots.txt, delete the blocking statements from the robots.txt (the relevant default line is shown below) and start the fix validation in Search Console. This will probably take a few days, depending on the number of pages.
  3. After the validation is done, add the deleted statements back to the robots.txt.
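
For reference, the relevant line in Shopify's auto-generated robots.txt is the one below; it is what keeps Google from crawling search pages, and therefore from ever seeing a noindex tag on them (an excerpt only - the full file contains further rules):

User-agent: *
Disallow: /search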

Hi Jen.

I think the following code will output noindex when there are 0 search results.
However, it is not thoroughly verified, so if you use it, please check it carefully yourself.
If you are not sure, please consult a Shopify partner or expert.

{% if request.path == '/search' and search.results_count == 0 %}
  <meta name="robots" content="noindex">
{% endif %}

Notes:
Shopify may add a process that returns a 404, which could soon make this unnecessary.

Details are on my blog, but it's in Japanese, not English.

https://webutubutu.com/webdesign/11116


It goes in the theme code (theme.liquid) - Line 4, right after the <head> tag.
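
For anyone unsure about the placement: the top of theme.liquid typically looks something like the sketch below (themes vary, so treat this as an illustration, not your exact file), with the snippet pasted right after the opening <head> tag:

<!doctype html>
<html lang="{{ request.locale.iso_code }}">
<head>
  {% if request.path == '/search' and search.results_count == 0 %}
    <meta name="robots" content="noindex">
  {% endif %}
  ...rest of the head...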

ありがとう! (Thank you so much!!!) I'm going to dig into this now and will come back to let you all know how it went. Also, I'll Google Translate your blog. The other code you provided has worked like a charm, so I expect this will do the same.

Total rock star! Also, if you are available for hire, I’d work with you in an instant. You can look me up on LinkedIn if interested (Jen Degtjarewsky / Jennifer) - Jen@MediaLabOne.com

Hi Everyone - Have spoken to Shopify, who checked with their dev team, and they say these are the preferred solutions. I'll post both below for anyone having these issues:

NOTE: Both go in the theme code (theme.liquid) - Line 4, right after the <head> tag.

#1 - Solution for attack on VENDORS page:

{%- if request.path == '/collections/vendors' and collection.all_products_count == 0 -%}<meta name="robots" content="noindex">{%- endif -%}

#2 - Solution for attack on SEARCH RESULTS PAGES that return zero results:

{% if request.path == '/search' and search.results_count == 0 %}
  <meta name="robots" content="noindex">
{% endif %}

CREDIT: Both solutions were created by Jizo_Inagaki, who is a rock star in my humble opinion. 💪


Hi JenDeg.

There was a problem with my answer, and I thought it would be better to put the code together, so I created the following.

Problems with my solutions:

  • Shopify's default robots.txt sets the entire search results area to Disallow
  • So search engines cannot see a noindex tag on the search results pages

Necessary Action:

  • Remove Disallow for search results from robots.txt
  • Output noindex to necessary pages

■ Caution!

I am not good at writing English, so there may be some writing errors.
The steps are below, but doing them incorrectly can cause serious problems.
It is also possible that my code is incorrect.
So if you are not confident in your judgment or skills, please consult a Shopify partner or expert.

■ How to remove Disallow for search results from robots.txt

Be sure to check the following carefully:

Create robots.txt.liquid and add two lines to the original (default) code, as follows:

# we use Shopify as our ecommerce platform
{%- comment -%}
# Caution! Please read https://help.shopify.com/en/manual/promoting-marketing/seo/editing-robots-txt before proceeding to make changes to this file.
{% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent -}}

  {% for rule in group.rules %}
    {%- unless rule.directive == 'Disallow' and rule.value == '/search' -%}
      {{- rule -}}
    {%- endunless -%}
  {% endfor %}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}

Code part added:

  • {%- unless rule.directive == 'Disallow' and rule.value == '/search' -%}
  • {%- endunless -%}
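
If the override works, opening https://yourstore.com/robots.txt (yourstore.com being a placeholder for your own domain) should show the default groups without the search rule. A rough sketch of the expected output - the exact default rules vary by store and may change over time:

# we use Shopify as our ecommerce platform
User-agent: *
Disallow: /cart
Disallow: /checkout
...other default rules, minus Disallow: /search...
Sitemap: https://yourstore.com/sitemap.xml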

■ How to output noindex to necessary pages

As an example, you can consolidate the noindex code in the head tag as follows:

{%- liquid
  assign flag_noindex = false
  case request.path
    when '/search'
      assign flag_noindex = true
    when '/collections/vendors'
      if collection.all_products_count == 0
        assign flag_noindex = true
      endif
  endcase
-%}
{% if flag_noindex %}
  <meta name="robots" content="noindex">
{% endif %}

Operating conditions:

  • Outputs noindex on search results pages (even when there are 1 or more results)
  • Outputs noindex on the vendors page when it has 0 products

However, if a 404 status code is returned when the vendors page has 0 products, the noindex on that page is not very useful.
It may still serve as a small precautionary measure, though.

Just to be sure, check for yourself that the code works as intended!


Hi Jizo,

Have just printed this out and will dive into this in the morning as it’s after 8pm in California right now. Thank you very much for being so thoughtful and kind to help me. I appreciate it so very much. I’m learning quite a lot through working on this and it’s great to know there are solutions to foil these hackers and bots! I’ll reply after I get it implemented. ~Jen


I also like the solution from Jizo_Inagaki. If you already use Google Search Console, you can go to Index → Pages; there may be an "Indexed, though blocked by robots.txt" entry at the bottom. After you have done what Jizo_Inagaki suggested, click on that entry, then "Show details" and "Review" - this will make the process of removing those links faster.

Adding /search back to the robots.txt might make you vulnerable to those kinds of attacks again. On the other hand, if you don't disallow the search in your robots.txt, it can waste crawl budget. If you are not sure whether to do this, you can go to Settings → Crawl stats after you have fixed it, for further insight into what Googlebot is crawling.

What about using the removal tool to remove all links with the /collections/vendors prefix for 6 months? Would this mean your site would look ‘cleaner’ to Google?

Also, the affected site I’m dealing with did have an app installed around the time the issues began so I have deleted that.

https://search.google.com/search-console/removals

That would definitely clean the index. However, I'd consult your Google Analytics data before doing this, to make sure you don't have any vendor URLs actually bringing in traffic and/or sales to your site before you scrap them all from the index.

Unfortunately this doesn't remove them permanently, only for a few months - then they get re-crawled.

Hi, this does the trick to make the pages not show up again, via a 404 error page:

{%- if request.path == '/collections/vendors' and collection.all_products_count == 0 -%}

{% elsif request.path == '/404' %}
{%- endif -%}
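
For context: theme Liquid cannot change the HTTP status code, so a theme snippet can only send visitors to the 404 page, not return a real 404 to crawlers (that part needed Shopify's platform-side fix). A client-side variant of the above, as an unverified sketch, might be:

{%- if request.path == '/collections/vendors' and collection.all_products_count == 0 -%}
  <meta name="robots" content="noindex">
  <script>window.location.replace('/404');</script>
{%- endif -%}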

I do believe there is, somewhere in the theme code, a bot that is crawling more pages, as I see Google Search Console picking up more of these spammy URL pages under "not indexed" (because they get blocked). If I knew where it was located, I could delete it. I think this because multiple people from Asia got access to the theme code before (for Fiverr / Upwork gigs), so I suspect one of them left this nice little crawler bot behind. Although I have no proof of it yet, my gut feeling says so.

Sorry, probably a very dumb question, but the colours of my "robots" and "noindex" are black, not red, and "content" is green. Does this matter? I'm afraid this is one thing I struggle hugely with! I'm beginning to wish I hadn't looked at my Search Console!

Have also noticed that there are now different Chinese spammy URL links, starting like the example below.

These spammy SEO backlinks are now being targeted at the 'search bar'. Has anyone else noticed that as well? …/search?q=… Almost all of them are blocked, but a handful went through and are being indexed. I have also received mail from some Gmail account under the name 'Chris Parker', mentioning that the site has errors. But as almost none of this spam is being indexed, it will of course not be searchable or visible to people searching for or visiting the site. So most likely it's the person who runs these bots.

https://yourdomain.com/search?q=2号站娱乐官方-✔️排名代做访问➡️liuhen.vip⬅️-彩宝app买彩票赚钱是真的吗-2号站娱乐官方-2号站娱乐官方-✔️排名代做访问➡️liuhen.vip⬅️-彩宝app买彩票赚钱是真的吗-2号站娱乐官方

Hi, many thanks for this, as it really has helped me when I did not know where to turn. I had this problem on my website and applied the suggested fix. The dodgy URLs have dropped away over time, although I am concerned that this hack could also be used against the search bar, as I have noticed 2 weird searches from the last week in Google Search Console, showing as indexed but blocked by robots. Could this be possible, and if so, would I need to add code for this also? Elaine

Has anyone encountered this where the link is an RSS feed? Would Jizo’s solution still work in this case?

Can confirm I have the same issue. Added the code and it works for those particular links.

However, I also have one created like this:

my.domain/collections/all/diablo-4-gems-for-sale,best-store-%E2%9C%94%E2%AD%95%E2%9D%A4x4gm%C2%B7com%E2…etc.

Any thoughts on how to tackle this one? I can't see it in the admin anywhere.

However, the hack does not solve the problem; it only prevents indexing. Moreover, it does not cover cases where the request.path is in another language.

@Shopify_77: When will you finally solve the vendors hack problem in a sustainable way?

In a multilingual environment, the code does not work, because localized URLs carry a language prefix, so request.path is not exactly '/collections/vendors'.

Here is the changed code:

{%- assign targetPath = '/collections/vendors' -%}
{%- if request.path contains targetPath and collection.all_products_count == 0 -%}
<meta name="robots" content="noindex">
{%- endif -%}
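
Building on this, a consolidated, multilingual-aware version of Jizo_Inagaki's snippet might look like the sketch below (not verified; note that contains does substring matching, so it will also catch localized prefixes like /fr/..., but could in theory match other paths that happen to include these strings):

{%- liquid
  assign flag_noindex = false
  if request.path contains '/search'
    assign flag_noindex = true
  endif
  if request.path contains '/collections/vendors' and collection.all_products_count == 0
    assign flag_noindex = true
  endif
-%}
{% if flag_noindex %}
  <meta name="robots" content="noindex">
{% endif %}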

If anyone wants to dig deeper for the person responsible: I spotted this person's handle in spam links, and it took me to this page. I doubt he's the only one responsible, but it may be a hot lead. I'd do something myself, but have no idea where to start…

https://telemetr.io/en/channels/1989706247-kunghac/publish