Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Is this campaign of spammy links to non-existent pages damaging my site?
-
My site is built in WordPress. Somebody has built spammy pharma links to hundreds of non-existent pages. I don't know whether this was motivated by malice or was an attempt to inject spammy content.
Many of the non-existent pages have the suffix .pptx. These now all return 403s. Example: https://www.101holidays.co.uk/tazalis-10mg.pptx
A smaller number of spammy links point to regular non-existent URLs (not ending in .pptx). WordPress 302-redirects these to my homepage. I've disavowed all domains linking to these URLs.
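For context, this is roughly how I'd check what each URL pattern actually returns (a quick sketch using PHP's curl extension, run outside WordPress; the second URL is just an invented example of the non-.pptx pattern, not a real page):

```php
<?php
// Report the raw status code for each URL without following redirects,
// so a 302 shows up as a 302 rather than as the homepage's 200.
$urls = [
    'https://www.101holidays.co.uk/tazalis-10mg.pptx',   // currently returns 403
    'https://www.101holidays.co.uk/some-made-up-page/',  // invented non-.pptx example
];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request, no body needed
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false); // don't follow 302s
    curl_exec($ch);
    echo $url . ' => ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . PHP_EOL;
    curl_close($ch);
}
```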
I have not had a manual action or seen a dramatic fall in Google rankings or traffic. The campaign of spammy links appears to be historical and not ongoing.
Questions:
1. Do you think these links could be damaging search performance? If so, what can be done? Disavowing each linking domain would be a huge task.
2. Is 403 the best response? Would 404 be better?
3. Any other thoughts or suggestions?
Thank you for taking the time to read and consider this question.
Mark
-
Thanks, Alex. You make some good points.
-
1. I don't think it will; Google has become very good at ignoring these spammy sites. Creating large disavow lists isn't technically that hard, but I wouldn't spend the time doing it, seeing as you haven't seen any impact.
2. I don't think either of the response codes you're returning is appropriate.
A 403 indicates that the client doesn't have permission, so it could be inferred that the file does actually exist and that the link is valid. That's definitely not something you would want Google to think.
While you have disavowed the links you are 302'ing, I still don't think 302 is the right response. For a start, 302 has been superseded now anyway, but more importantly it signals that the page has moved temporarily. That is certainly not the case: the page doesn't exist and never has. The only reason to 302 would be if you were expecting traffic from these links, but I think that also sends a bad message to Google.
I would definitely suggest 404 for both cases.
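If it helps, something along these lines is how I'd approach it within WordPress itself. This is a rough, untested sketch for a small mu-plugin or the theme's functions.php; the .pptx pattern is only based on what you've described, and if the 403 is actually being issued by the web server before WordPress loads, the equivalent rule would need to go in the server config instead:

```php
<?php
/**
 * Sketch only: serve a genuine 404 for requests that match no real content,
 * including the spammy .pptx-style URLs, instead of a 403 or a 302 home.
 */
add_action('template_redirect', function () {
    global $wp_query;

    $request_uri = $_SERVER['REQUEST_URI'] ?? '';
    $path        = parse_url($request_uri, PHP_URL_PATH) ?: '';

    // Pattern assumed from the question: the spam targets end in .pptx.
    $matches_spam_pattern = (bool) preg_match('/\.pptx$/i', $path);

    if (is_404() || $matches_spam_pattern) {
        $wp_query->set_404();
        status_header(404);
        nocache_headers();
        include get_404_template();
        exit;
    }
}, 0); // early priority, so it runs before anything that might redirect
```

That said, if a plugin is the thing generating the 302s to the homepage, it may be simpler to just switch that behaviour off in the plugin's settings.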