Google reconsideration request processed - but same story.
-
We have been getting the same response from Google after several reconsideration requests.
THE SITUATION:
Our site displays 3 distinct product lines. We separated each product line onto its own SUB-DOMAIN. All 3 product lines are integrated into the main NAVBAR. The product list pages run off the SUB-DOMAINS; however, product detail pages run off the MAIN-DOMAIN.COM.
Google has taken manual action because the MAIN-DOMAIN.COM links to PRODUCT-A.DOMAIN.COM on every single page. I have attempted several times to explain this, but without success. Only one SUB-DOMAIN is causing a problem; the other 2 SUB-DOMAINS are set up the exact same way without issue. This week we simply added a NO-FOLLOW on the link to the SUB-DOMAIN causing the issue; we will see if this helps. Has anyone else ever experienced this?
-
Any chance you can screenshot and redact the domain - I am not sure that this is being read right...
-
MAIN-DOMAIN.COM account in GWT
- Within the MANUAL ACTIONS section, there is a MORE link under "Affects: Some incoming links"; on click, a popup displays the message: "Affected URL matches, Pointing to: PRODUCT-A.DOMAIN.COM".
-
"Google has taken manual action because the MAIN-DOMAIN.COM links to PRODUCT-A.DOMAIN.COM on every single page."
How do you know that this is the problem? Google is not usually that specific when they give out penalties. Can you copy and paste the message you are getting from Google and also what you see in your manual actions viewer (Search Traffic --> Manual Actions)?
-
Hi there,
I haven't seen this personally - would you be able to include (or PM me) the email from the web spam team saying that it is the subdomain that is causing the problem?
Nofollowing the link could do the trick, as nofollow is meant to make Google essentially ignore the link, but I am curious about their rationale behind a manual penalty for linking to your own subdomain.
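For clarity, the nofollow change being described is just an attribute on the sitewide navbar anchor - a minimal sketch, with placeholder URLs standing in for the real domains:

```html
<!-- Sitewide navbar link to the problem sub-domain, with nofollow added -->
<a href="http://product-a.domain.com/" rel="nofollow">Product A</a>

<!-- The other two sub-domain links can stay as normal, followed links -->
<a href="http://product-b.domain.com/">Product B</a>
<a href="http://product-c.domain.com/">Product C</a>
```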
Related Questions
-
Mass Removal Request from Google Index
Intermediate & Advanced SEO | ioannisa
Hi, I am trying to cleanse a news website. When this website was first made, the people that set it up copied all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. The site has lots of junk, but all of that junk was in the initial backup, i.e. before 1st June 2012. So, by removing all the mixed content prior to that date, we can have pure articles starting 1st June 2012! Therefore:
- My dynamic sitemap now contains only articles with a release date between 1st June 2012 and now.
- Any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag instead of the actual content of the article.
The question is how I can remove all this junk from the Google index as fast as possible - it is no longer on the site, but still appears in Google results. I know that for individual URLs I can request removal at https://www.google.com/webmasters/tools/removals - the problem is doing this in bulk, as there are tens of thousands of URLs I want to remove.
- Should I put the articles back in the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong; as far as I know it will cause problems, because search engines will try to access non-existent content that the sitemap declares as existent, and will return errors in Webmaster Tools.
- Should I submit a DELETED ITEMS SITEMAP using the <expires> tag (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing)? I think this is for custom search engines only, not for the generic Google search engine.
The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but instead ugly GET params, and a folder-based pattern is impossible since all articles (removed junk and real articles alike) are of the form http://www.example.com/docid=123456
So, how can I bulk remove the junk from the Google index... relatively fast?
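For reference, the "noindex" setup described above is typically just a robots meta tag in the head of the custom 404 page. A minimal sketch - the markup and copy are placeholders, not the poster's actual template:

```html
<!-- Custom 404 page returned for any article dated before 1st June 2012 -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Article not found</title>
  <!-- Asks search engines to keep this URL out of their index -->
  <meta name="robots" content="noindex">
</head>
<body>
  <p>This article is no longer available.</p>
</body>
</html>
```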
-
Google Custom Searches with site CSS
Intermediate & Advanced SEO | csfarnsworth
Anyone good with GCS? I want to add Google Custom Search to my site, but with my site's CSS. I need the results from GCS but want to display them with my website's CSS. The website is in osCommerce and PHP.
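As a rough illustration, the Custom Search Element is normally embedded with Google's standard snippet and then restyled by overriding its gsc-* classes in your own stylesheet. A sketch only - the cx ID is a placeholder and the exact class names can vary between versions of the element:

```html
<!-- Google Custom Search Element embed (placeholder cx ID) -->
<script async src="https://cse.google.com/cse.js?cx=YOUR_CX_ID"></script>
<div class="gcse-search"></div>

<style>
  /* Override the element's default look with the site's own styles */
  .gsc-control-cse      { font-family: inherit; border: none; background: transparent; }
  .gsc-search-button    { background-color: #2a6496; }
  .gsc-result .gs-title { color: #2a6496; }
</style>
```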
-
Why, oh why does Google hate us?
Intermediate & Advanced SEO | candylotus
My URL is: www.drupalgeeks.org
I have tried my very best to cover all the usual SEO items... but nothing. We are a legitimate company offering a legitimate service. Ideally we would come up in the results for "Drupal Developers", "Drupal Development" and "Drupal Designers", yet we cannot break the top 50 for any of these. We have:
- Optimized meta tags
- Written quality content
- Maintained a social presence on Twitter, LinkedIn, Facebook, Pinterest, YouTube, and Google Plus
- Blogged with some consistency
- Guest blogged
- Canonicalization
- And dozens of other things
And we are all-around nice people and a good company... Why, oh why? Can anyone take a look and see if there is something blatantly obvious I am missing? Are we using "drupal" too much? Thank you in advance for any assistance. Candice
-
Google Places Multiple Location
Intermediate & Advanced SEO | SixTwoInteractive
Hi everyone, I have a client with multiple locations in the same city. I would like to have their Google Places listings show up under the main website listing. Currently, one of the Google Places listings is being pulled in directly below the main website, but not the other. The Zagat rating is being pulled in as well. I would like to have both locations show up when you type in the name of the business. Any ideas how to do this?
-
Google Penguin Winners and Losers?
Intermediate & Advanced SEO | Aran_Smithson
Hi All, Just wondering, out of the SEOmoz community, who has come out on top after Penguin and who has been hit, and why? Personally, my site has come out on top. I started working on the site back in December and NOTHING had been done - no link development, no on-page, nothing; a virginal website. The site was chock-a-block with issues, both technical and in content. After 4 months of hard work we have climbed from 100+ to the top ten on most of our phrases, and post-Penguin we have climbed even higher as some of our competitors were dragged down into the murky depths. So I think that's a win (for now). My focus has been on guest posting, social outreach, reviews and getting my on-page right (still a ways to go, but our CMS is clunky to say the least). A little humour attached 😉 (Why has no one yet stuck Matt Cutts' head on a penguin?) Are you a Penguin winner, or have you experienced the wrath of the Penguin? keep-calm-and-deoptimise.jpg
-
Google bot vs Google mobile bot
Intermediate & Advanced SEO | ReneReinholdt
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, because I am kinda stuck 😕 Situation: A client of mine has a webshop located on a hosted server. The shop is made in a closed CMS, meaning that I have very limited options for changing the code - limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to the specific folder where the webshop is located. The webshop has 2 "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and Googlebot-Mobile. In the Default.asp (ASP classic) I test the user agent and redirect the user to either the main domain or the mobile sub-domain. All good, right? Unfortunately not - now we arrive at the core of the problem. Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot; hence it never sees the Default.asp (or outright ignores it), and this causes, as you might have guessed, a huge pile of duplicate content. Normally you would just place some user-agent detection in the page head and either throw Google a 301 or a rel-canonical, but since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm kinda running out of options quickly, so if anyone has an idea as to how the BEEP! I can get Google to index the right domains for the right devices, please feel free to comment. 🙂 Any and all ideas are more than welcome.
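One option that only needs head-level HTML (which the poster says is available inside the CMS) is Google's annotation pattern for separate mobile URLs: the desktop page points to its mobile equivalent with rel="alternate", and the mobile page points back with rel="canonical", so Google knows which version to index for which device without any server-side changes. A hedged sketch, with placeholder URLs standing in for the real shop and mobile sub-domain:

```html
<!-- In the head of the desktop page, e.g. http://www.example.com/shop/widget -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/shop/widget">

<!-- In the head of the corresponding mobile page -->
<link rel="canonical" href="http://www.example.com/shop/widget">
```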
-
Have we suffered a Google penalty?
Intermediate & Advanced SEO | ukss1984
Hello, In January we started a new blog to supplement our core ecommerce website. The URL of the website is www.footballshirtblog.co.uk and the idea behind it was that we would write articles related to our industry to build a community, which would ultimately boost our sales. We would add several posts per day - a mix of shorter news stories of around 150 words and more detailed content pages of around 500 words. Everything was going well: we were making slow but sure progress on the main generic keywords and were receiving several thousand visitors a day, mostly finding the posts themselves on Google. The surge in traffic meant we needed to move server, which we did around 6 weeks ago. When we did this, we had a few teething problems with file permissions, etc., which meant we were temporarily unable to add new posts. As our developers were tied up with other issues, this continued for a 7-10 day period with no new content being added. In this period, the site completely dropped from Google, losing all its rankings and traffic, to the extent that it now doesn't even rank for its own name. This is very frustrating, as we have put a huge amount of work and content into developing this site. We have added a few posts since, but not a huge amount, as it is frustrating to do so with no return and with the concern that the site has been banned forever. I cannot think of any logical reason why this penalty has occurred, as we haven't been link spamming, etc. Does anyone have any feedback or suggestions as to how we can get back on track? Regards,
David
-
Site: on Google
Intermediate & Advanced SEO | Artience
Hello, people. I have a quick question regarding search on Google. I use the search operator [site:url] to check the indexing status of my site. Today I was checking indexing status and found that Google shows a different number of indexed pages depending on the search settings:
1. At the default setting (10 results shown per page), I get about 150 pages indexed by Google.
2. With 100 results shown per page, I tried again and get about 52 pages indexed by Google.
Of course I used the same URL both times. I really want to know which figure is accurate. Please help, people!!