Google Sitemaps & punishment for bad URLs?
-
Hoping y'all have some input here. This is a long story, but I'll boil it down:
Site X bought the URL of Site Y. 301 redirects were added to direct traffic (and help transfer link juice) from URLs on Site X to relevant URLs on Site Y, but two days before a "change of address" notice was submitted in Google Webmaster Tools, an auto-generated sitemap somehow added URLs from Site Y to the sitemap of Site X, so the sitemap contained URLs that did not belong to Site X.
Is there any documentation out there suggesting Google would punish Site X for having essentially unrelated URLs in its sitemap by downgrading its organic search rankings, on the theory that Google might view that mistake as a black-hat (or otherwise evil) tactic? I suspect this because the site continues to rank well organically in Yahoo and Bing, yet is suddenly nonexistent on Google.
Thoughts?
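For reference, the 301 setup described above boils down to a mapping from old paths to new URLs. A minimal sketch as a WSGI app follows; the domain names and paths are invented placeholders, not the actual sites (in practice this would usually live in the web server config, e.g. an Apache RewriteRule or an nginx `return 301`):

```python
# Hypothetical redirect map from old Site X paths to the matching
# Site Y URLs. All hostnames/paths below are made-up placeholders.
REDIRECT_MAP = {
    "/old-page": "https://site-y.example/new-page",
    "/about": "https://site-y.example/about-us",
}

def app(environ, start_response):
    """Minimal WSGI app: 301 known old paths, 404 everything else."""
    target = REDIRECT_MAP.get(environ.get("PATH_INFO", ""))
    if target:
        # 301 (permanent) is what passes link equity, as the question notes.
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```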
-
If it was only two days, quickly fix the sitemap and resubmit it. I don't think Google will punish your site's rankings that quickly.
I would also perform an in-depth audit of links to the new site to see whether it has some bad links and was about to be hit with a penalty for those.
You can also view this post on 301 redirects:
http://www.seomoz.org/q/can-penalties-be-passed-via-301-redirects
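Before resubmitting, it can help to sanity-check that the sitemap only lists URLs on the site's own host, which is exactly the mistake described in the question. A minimal sketch using only the standard library; the sitemap XML is an invented example:

```python
# Flag sitemap <loc> entries whose host differs from the site's own host.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def foreign_urls(sitemap_xml, own_host):
    """Return sitemap URLs that do not belong to own_host."""
    root = ET.fromstring(sitemap_xml)
    locs = [el.text.strip() for el in root.iter(NS + "loc")]
    return [u for u in locs if urlparse(u).hostname != own_host]

# Made-up example: a Site X sitemap that has picked up a Site Y URL.
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://site-x.example/page-1</loc></url>
  <url><loc>https://site-y.example/page-2</loc></url>
</urlset>"""

print(foreign_urls(sitemap, "site-x.example"))
# -> ['https://site-y.example/page-2']
```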
-
How long has the site been non-existent? By non-existent, do you mean none of the pages are in Google's index, or that they are simply not ranking as high as they used to? If Site X has been de-indexed, try resubmitting a corrected sitemap. It might take a little while for Google to crawl and re-index, but I'd think it would be fine after that. If you truly have been removed from Google's index, fix the sitemap and file a reconsideration request. I would think Google would recognize this as an honest mistake.
Good luck. I hope that helps a little.
Related Questions
-
Is there proof that disavowing backlinks in GSC helps to boost rankings in Google?
Hi guys, let's say you have a website and you have some questionable or lower-quality backlinks. Does anyone have proof that disavowing backlinks helped rankings or had some positive effect? I am concerned that Google will instead place our website on their radar and possibly demote it or something. Lastly, if disavowing is the way to go, what criteria do you use to decide which backlinks to disavow? And if you pick up questionable backlinks over time, should you disavow on an ongoing basis as well? If so, how often? Cheers, John
White Hat / Black Hat SEO | whiteboardwiz
-
Massive Google Search Spam
We have come to know that one of our client's competitors is spamming Google search results on a massive scale. If we search with keywords like "iphone spy apps", "text messages spy", etc., most of the results from the 3rd or 4th page onwards show totally irrelevant sites, but when we click on those results they all redirect to either http://topspysoft.com/ or http://www.mspy.com/. They have been doing this on a massive scale for the last few months against hundreds of queries, populating hundreds of search results. If we use a country-specific Google site, hundreds of results again come from totally irrelevant country-specific domains (au, nz, uk, etc.), and they all redirect to topspysoft.com or mspy.com. Can you tell how they are doing it, and how they manage it on such a massive scale without being noticed by Google? Is there any way to report this issue to Google, given that the current form only allows one link? Here are some of the spam URLs to give you an idea:
www.crcincva.com/doc/20-best-iphone-spy-apps/
chefitupkids.com/top-10-spy-apps-for-iphone/
jarestaurant.com/text-spying-apps-iphone/
www.lisamishler.com/qn/phone-spy-apps-uk
tigerdenus.com/spy-apps-for-iphone-no-jailbreak
palmhousestl.org/templates/phone-location/iphone-spy-apps-uk.html
I'm also attaching a couple of images showing that almost 80% of the results on those pages are spam pages.
White Hat / Black Hat SEO | shaz_lhr
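On the mechanics asked about above: doorway pages like these typically serve keyword-stuffed HTML to crawlers and then bounce human visitors onward with a meta refresh or a JavaScript redirect. A rough detector sketch follows; the HTML sample is invented, and the regexes only catch the simplest redirect patterns:

```python
# Scan fetched HTML for meta-refresh or JavaScript location redirects,
# the usual mechanism behind doorway pages like those described above.
import re

REDIRECT_PATTERNS = [
    r'http-equiv=["\']refresh["\'][^>]*url=([^"\'>]+)',        # <meta> refresh
    r'(?:window\.)?location(?:\.href)?\s*=\s*["\']([^"\']+)',  # JS redirect
]

def find_redirect_targets(html):
    """Return redirect target URLs found in the page source."""
    targets = []
    for pat in REDIRECT_PATTERNS:
        targets += re.findall(pat, html, flags=re.IGNORECASE)
    return targets

# Invented page source, redirecting to one of the sites from the question.
page = '<meta http-equiv="refresh" content="0; url=http://topspysoft.com/">'
print(find_redirect_targets(page))
# -> ['http://topspysoft.com/']
```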
-
How to transform an Excel file into a .txt file to send to Google Disavow
I have a disavow file made in Excel, with lots of columns of information. I want to turn it into a .txt file by saving it from Excel, but the resulting file is unreadable. Can someone help me transform an Excel file into the Google Disavow file format for the final upload?
White Hat / Black Hat SEO | maestrosonrisas
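One way to handle this conversion, assuming the spreadsheet is first exported from Excel to CSV (File > Save As > CSV) and has a column named "url" (that column name is an assumption about the sheet's layout): the disavow format is plain text, one URL or `domain:` entry per line, with `#` for comments.

```python
# Convert a CSV export of the spreadsheet into disavow .txt content,
# keeping only the URL column and dropping the extra columns.
import csv
import io

def csv_to_disavow(csv_text, url_column="url"):
    """Build disavow-file text from CSV rows (url_column is assumed)."""
    rows = csv.DictReader(io.StringIO(csv_text))
    lines = ["# disavow file generated from spreadsheet"]
    lines += [row[url_column].strip() for row in rows if row.get(url_column)]
    return "\n".join(lines)

# Invented two-column sheet: a full URL and a whole-domain entry.
sheet = (
    "url,notes\n"
    "http://spammy.example/page,bad link\n"
    "domain:linkfarm.example,whole domain\n"
)
print(csv_to_disavow(sheet))
```

Save the returned text with a `.txt` extension (UTF-8) before uploading it to the disavow tool.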
-
Google penalty for having bad sites, while working on one good site?
I have a list of websites that are not spam; they are OK sites, it's just that I need to work on the content again, as it might not be 100% useful for users. They are not bad sites with spammy content; I just want to rewrite some of the content to make truly great websites. The goal would be to have great content that earns natural links and a great user experience. I have 40 sites, all travel sites related to different destinations around the world. I also have other sites that I haven't worked on for some time. Here are some of them:
www.simplyparis.org
www.simplymadrid.org
www.simplyrome.org
etc. Again, these are not spam sites, but they're not as useful as they could become. I want to work on a few sites only to see how it goes. Will the sites I am working on be penalised if I have other sites with average content, or content that's not as good? I want to make great content, good for link bait 🙂
White Hat / Black Hat SEO | sandyallain
-
Need advice on best strategy for removing these bad links.
Here's the scenario: we recently took on a new client whose previous SEO company had engaged in some dodgy link-building tactics. They appear to have done some blog comment spam, very poorly. The situation we are now in is this: we have a site with an internal page deemed more important than the homepage (the homepage has 60 linking root domains and the internal page 879). It looks as though the previous SEO company submitted a disavow request; there's a message in Webmaster Tools from a few weeks back saying it had been received, but no further correspondence. I have doubts as to whether this disavow request was done correctly. Plus, I'm not sure Google has issued the site a warning yet, as it is ranking position one for the keyword on the internal page. Our client wants us to handle this in the correct manner, whether that means simply ignoring it and waiting for Google to send a warning about the links, removing the offending internal page and leaving a 404, or trying to disavow the links that Google doesn't know about yet from 800+ websites. Suggestions for the best practice for dealing with this situation? Any advice is much appreciated. Thanks, Hayley.
White Hat / Black Hat SEO | Silkstream
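For the "links Google doesn't know about yet" option above, a simple starting point is to diff the current linking domains (e.g. exported from Webmaster Tools) against the domains already covered by the submitted disavow file. A sketch with invented file contents:

```python
# Find linking domains not yet covered by an existing disavow file.

def disavowed_domains(disavow_text):
    """Parse 'domain:' entries from a disavow file, skipping comments."""
    return {
        line.split(":", 1)[1].strip()
        for line in disavow_text.splitlines()
        if line.startswith("domain:")
    }

def not_yet_disavowed(linking_domains, disavow_text):
    """Return linking domains missing from the disavow file, sorted."""
    return sorted(set(linking_domains) - disavowed_domains(disavow_text))

# Invented examples: the previously submitted file and a fresh link export.
disavow_file = "# submitted earlier\ndomain:spam-one.example\n"
links = ["spam-one.example", "spam-two.example", "goodsite.example"]
print(not_yet_disavowed(links, disavow_file))
# -> ['goodsite.example', 'spam-two.example']
```

The remaining domains would still need a manual quality review before being added; a set diff only tells you what the earlier request did not cover.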
-
Dust.js Client-Side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO-friendly, because all the content is served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this being "safe".
Read about LinkedIn switching over to Dust.js:
http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates
http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more
Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."
Basically, on the backend of our site we would be detecting the user-agent of all traffic, and once we found a search bot we would serve our pages server-side instead of client-side so the bots can index our site. Server-side and client-side content will be identical, and there will be NO black-hat cloaking going on. But isn't this technique cloaking?
From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."
Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
White Hat / Black Hat SEO | Bodybuilding.com
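The user-agent check the question describes might be sketched like this; the bot list is a small, incomplete sample for illustration, and real deployments usually also verify crawler IPs via reverse DNS rather than trusting the header alone:

```python
# Decide whether to render server-side based on the User-Agent header.
import re

# Incomplete sample of crawler tokens; real lists are much longer.
BOT_PATTERN = re.compile(r"googlebot|bingbot|slurp|duckduckbot", re.IGNORECASE)

def wants_server_side_render(user_agent):
    """True if the request likely comes from a search engine crawler."""
    return bool(BOT_PATTERN.search(user_agent or ""))

print(wants_server_side_render(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # -> True
print(wants_server_side_render("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))
# -> False
```

The key point from the question stands regardless of the implementation: this only avoids being cloaking if the markup rendered server-side for bots is genuinely identical to what browsers assemble client-side.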
-
Do bad links "hurt" your ranking, or just not add any value?
Do bad links "hurt" your ranking, or just not add any value? By this I mean: if you have links from link farms and bad neighbourhoods, would they effectively pull you down in search engine rankings, or is it just a waste of time to get these links because they add no value? Are Google saying to avoid them because they will not have a positive effect, or because they will have a negative effect? I am of the opinion that they will not harm, but will not help either. I think this because at the end of the day you are not 100% in control of your inbound links: any bad site could link to you, and if a competitor, God forbid, wanted to play some black-hat games, couldn't they just add you to thousands of bad sites to pull your ranking down? Interested to hear your opinions on the matter, or any "facts" if they are out there.
White Hat / Black Hat SEO | esendex
-
Do backlinks with good anchor text from bad sites help?
Hi, in the Netherlands, the SEO competition for terms like "loans" is fierce. I see a website in this industry that seems to be doing very well based on links with good anchor text from sites that seem quite worthless to me, such as http://www.online-colleges-helper.com/ and http://www.alohapath.com/. My question is: is it worth pursuing this type of link? I assume these must be paid links, or am I wrong? I'd really rather not go down this route, but I don't want to be outranked by someone who is using these types of links... Many thanks in advance for any insight! Annemieke
White Hat / Black Hat SEO | AnnemiekevH