Spider 404 errors linked to purchased domain
-
Hi,
My client purchased a domain based on the seller's promise of "lots of traffic". Subsequent investigation showed it was a scam and that the seller had been creative in Photoshop with some GA reports.
Nevertheless, my client had redirected the acquired domain to their primary domain (via the domain registrar).
From the point at which the acquired domain was redirected until we removed the redirect, the web log files showed a high volume of spider/bot 404 errors relating to an online pharmacy - viagra, pills etc.
The account does not seem to have been hacked. No additional files are present and the rest of the logs seem normal. As soon as the redirect was removed the spider 404 errors stopped.
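For reference, this is roughly how the bot 404s can be pulled out of a raw access log. It's a minimal sketch - the log path, the combined log format and the user-agent fragments are assumptions, so adjust them for your own server.

```python
import re
from collections import Counter

LOG_FILE = "access.log"  # assumed path to the raw server access log
BOT_HINTS = ("googlebot", "bingbot", "slurp", "spider", "crawl")  # assumed user-agent fragments

# Matches the request, status and user-agent fields of an Apache/nginx "combined" log line
LINE_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

bot_404s = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE_RE.search(line)
        if not m:
            continue
        agent = m.group("agent").lower()
        if m.group("status") == "404" and any(hint in agent for hint in BOT_HINTS):
            bot_404s[m.group("path")] += 1

# The most-requested 404 paths show what the bots are actually looking for
for path, hits in bot_404s.most_common(25):
    print(f"{hits:6d}  {path}")
```

In our case the paths at the top of that list were all the pharmacy-related URLs.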
Aside from the advice about acquiring domains promising traffic which I've already discussed with my client, does anybody have any ideas about how a redirect could cause the 404 errors?
Thanks
-
Hard to say. I doubt you would be penalized for incoming links alone if your site is clean, but you may always have that history. You could add the site to Webmaster Tools and submit a reinclusion request explaining the situation. They may be able to "wipe the slate clean" for you.
-
Hi Steve, Many thanks for replying.
I checked the acquired domain in OSE and there are 302 redirects from an online pharmacy (which, according to the Wayback Machine, existed back in 2009) to pharmacy pages on the acquired domain. In turn, OSE shows the linking site has a single link from a dodgy-looking Russian site called picto.ru. So I guess the spiders are simply following the links.
Assuming my client wants to use the acquired domain, will they get penalised for the existing spammy links?
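For reference, if the links do turn out to be a problem, here is a rough sketch of how the spammy referring domains could be pulled out of an OSE/Ahrefs export and compiled into a disavow file for Webmaster Tools. The file name, column name and spam patterns are all assumptions - match them to your own export.

```python
import csv
from urllib.parse import urlparse

EXPORT_FILE = "linking_pages.csv"  # assumed: CSV export of linking URLs from OSE/Ahrefs
URL_COLUMN = "Source URL"          # assumed column name - check your export's header row
SPAM_HINTS = ("pharm", "viagra", "pills", "picto.ru")  # assumed patterns for the spammy sources

spam_domains = set()
with open(EXPORT_FILE, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = (row.get(URL_COLUMN) or "").strip()
        if any(hint in url.lower() for hint in SPAM_HINTS):
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host:
                spam_domains.add(host)

with open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# Spammy domains linking to the acquired domain\n")
    for domain in sorted(spam_domains):
        out.write(f"domain:{domain}\n")

print(f"Wrote {len(spam_domains)} domains to disavow.txt")
```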
-
The purchased domain used to have those pharmacy pages, so when you redirected it, the engines followed the old URLs and went looking for those pages on your domain - hence the 404s.
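Registrar forwarding typically keeps the requested path, so a request for one of the old pharmacy URLs gets bounced to the same path on the primary domain, where no such page exists. You can watch the chain happen with something like this (a minimal sketch: the domain and path are made up, and it uses the third-party requests library):

```python
import requests  # third-party: pip install requests

# Hypothetical names - substitute the acquired domain and one of the old pharmacy paths
OLD_URL = "http://acquired-domain-example.com/buy-cheap-pills-online.html"

response = requests.get(OLD_URL, allow_redirects=True, timeout=10)

# Print each hop: the registrar's redirect, then the final response on the primary domain
for hop in response.history:
    print(f"{hop.status_code}  {hop.url}")
print(f"{response.status_code}  {response.url}")  # expected to be a 404 on the primary domain
```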
Related Questions
-
Strange 404 Pages Appearing in Google Analytics
Hi, My client has some strange URLs appearing in GA which lead to error pages. Please see the following image: https://imgur.com/a/6TPO8yL e.g. URLs like /h/6445738.html. I've used Screaming Frog to see if these pages exist on the website and I can't find them anywhere. So how are they coming up in GA? If anyone could please help, I'd really appreciate it.
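One way to narrow this down is to check whether those URLs were ever actually requested from the server. A rough sketch, assuming a raw access log is available (the log path is an assumption; the pattern comes from the example URL above). If the paths never appear in the log, the pageviews are probably being sent straight to GA rather than coming from real requests to the site.

```python
import re

LOG_FILE = "access.log"                   # assumed path to the raw access log
PHANTOM_RE = re.compile(r"/h/\d+\.html")  # pattern based on the example URL /h/6445738.html

matches = 0
with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
    for line in log:
        if PHANTOM_RE.search(line):
            matches += 1
            print(line.rstrip())

# Zero matches suggests the hits never touched the server and were reported to GA directly
print(f"{matches} matching requests found")
```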
Reporting & Analytics | SolveWebMedia
-
Get anchor text, nofollow info etc from a list of links
Hi everybody. I'm currently doing a backlink audit for a client and I've hit a small problem. I'm combining data from Ahrefs, OSE, Webmaster Tools and Link Detox. I've got around 27k links in total now, but the issue is that WMT does not provide data on target page, anchor text and nofollow/dofollow. This means I have around 1k links with only partial information. Does anyone know of a way that I can get this data automatically? Thanks!
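One DIY option for the 1k gap is to fetch each source URL yourself and record the anchor text and rel attribute of any link pointing at the client's domain. A minimal, standard-library-only sketch - the input file name and target domain are assumptions, and it has no crawl-delay or retry handling:

```python
import csv
from html.parser import HTMLParser
from urllib.request import Request, urlopen

TARGET_DOMAIN = "client-example.com"  # assumed: the domain the backlinks point at
SOURCE_URLS = "wmt_links.txt"         # assumed: one linking URL per line (e.g. the WMT export)

class LinkGrabber(HTMLParser):
    """Collects (href, rel, anchor text) for every <a> element on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._current = None  # (href, rel, [text fragments]) while inside an <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            self._current = (attrs.get("href") or "", attrs.get("rel") or "", [])

    def handle_data(self, data):
        if self._current:
            self._current[2].append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current:
            href, rel, text = self._current
            self.links.append((href, rel, "".join(text).strip()))
            self._current = None

rows = []
with open(SOURCE_URLS, encoding="utf-8") as f:
    for source in (line.strip() for line in f if line.strip()):
        try:
            req = Request(source, headers={"User-Agent": "link-audit-script"})
            html = urlopen(req, timeout=15).read().decode("utf-8", errors="replace")
        except Exception as exc:
            print(f"skipped {source}: {exc}")
            continue
        parser = LinkGrabber()
        parser.feed(html)
        for href, rel, anchor in parser.links:
            if TARGET_DOMAIN in href:
                follow = "nofollow" if "nofollow" in rel.lower() else "dofollow"
                rows.append([source, href, anchor, follow])

with open("link_audit.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["source_url", "target_url", "anchor_text", "follow"])
    writer.writerows(rows)
```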
Reporting & Analytics | Blink-SEO
-
Why does a selection of sites I have written guest posts on not come up on my link analysis?
I have done a few guest posts on different sites and they are not coming up in my link analysis report. We created an infographic on one particular site and that site isn't coming up on the link analysis report either. Would there be a reason for this? I ran a check on the site's code and it doesn't contain "nofollow", which I originally thought was the problem. Here is an example of our work on a site that isn't coming up on the analysis report: http://www.electriciansblog.co.uk/2013/10/energy-saving-using-led-lighting/ Thanks
Reporting & Analytics | meteorelectrical
-
Accidental Link not being removed by Google WMT
I operate two sites for a client. One is a local business and one is their national business. I used the same template for both sites (with changes) but accidentally left a link in the footer to the local site. Now the local site is showing 12k backlinks from the national site. I removed the link over 2 weeks ago but it still shows up in Google WMT in the "Links to your Site" section. It goes to a coupon page and not a "targeted" page but 12k links to the local site is 6 TIMES what I had before. My question is: "Is there a way to get Google to remove the link from Google WMT?" More specifically force it. Like I said the link has been removed for over 2 weeks but it still shows up in the Local site's Incoming Links section of WMT. Thanks.
Reporting & Analytics | DarinPirkey
-
Google Analytics Site Search to new sub-domain
Hi Mozzers, I'm setting up Google's Site Search on a website. However, this isn't for search terms; it will be for people filling in a form and using the POST action to land on a results page. This is similar to what is outlined at http://support.google.com/analytics/bin/answer.py?hl=en&answer=1012264 ('Setting Up Site Search for POST-Based Search Engines'). However, my approach is different, as my results appear on a sub-domain of the top-level domain. E.g. the user is on www.domain.com/page.php, fills in the form and submits, and gets taken to results.domain.com/results.php. The issue is with the suggested code provided by Google. Firstly, I don't use query strings on my results page, so I would have to create an artificial page, which shouldn't be a problem. But what I don't know is how the tracking will work across a sub-domain without the _gaq.push(['_setDomainName', '.domain.com']); code. Can this be added in? Can I also add Custom Variables? Does anyone have experience of using Site Search across a sub-domain, perhaps to track quote form values? Many thanks!
Reporting & Analytics | panini
-
Count of links
I am using the free link API to get the total number of internal links, external links, follow links and nofollow links by using the http://lsapi.seomoz.com/linkscape/links/* URL. I have given the following attributes along with the URL: SourceCols=4, TargetCols=4, Scope=page_to_page, Sort=page_authority, Limit=1000, Filter=nofollow. By implementing this, I get an array for each filter and, by counting the array size, I can get the count of total links. This is a long procedure and the SEOmoz link API is taking a long time to respond. Is there any way I can get the total number of links for each filter directly? Or is there any other alternative?
Reporting & Analytics | Ravi_Pathak
-
Phantom urls causing 404
I have a very strange problem. When I run SEOmoz diagnostics on my site, it reveals URLs that I never created. It seems to combine two slugs into a new URL. For example, I have created the pages http://www.naplesrealestatestars.com/abaco-bay-condos-naples/ and http://www.naplesrealestatestars.com/beachwalk-naples-florida/ and now the URL http://www.naplesrealestatestars.com/abaco-bay-condos-naples/beachwalk-naples-florida/ exists in addition to the two I created. There are over 100 of these phantom URLs and they all show a 404 error when clicked on or crawled by SEOmoz. Anybody know how to correct this?
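Purely as a guess at the mechanism: combined URLs like that are often produced by relative links, because an href without a leading slash resolves against the directory of the page it sits on. A quick illustration of the resolution rule (the URLs are just the examples from the question):

```python
from urllib.parse import urljoin

base = "http://www.naplesrealestatestars.com/abaco-bay-condos-naples/"

# A relative href (no leading slash) resolves against the current "directory"...
print(urljoin(base, "beachwalk-naples-florida/"))
# http://www.naplesrealestatestars.com/abaco-bay-condos-naples/beachwalk-naples-florida/

# ...while a root-relative href resolves against the domain root.
print(urljoin(base, "/beachwalk-naples-florida/"))
# http://www.naplesrealestatestars.com/beachwalk-naples-florida/
```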
Reporting & Analytics | DanBoyle76
-
Tracking Effects of Internal Linking
Can anyone suggest the best way to track the effectiveness of internal linking on a website? Thanks!
Reporting & Analytics | RishadShaikh59