Disavow both www and non-www versions of site?
-
I just submitted my disavow file to Google after several months of work. A few months ago I saw a partial-match unnatural links penalty in the www version of my site's Search Console account. The penalty has since expired.
Should I also upload the file to the non-www property in Search Console? No penalty ever appeared there.
-
Hi there.
A quote from Google's Webmaster Tools documentation:
"Note: When looking at the links to your site in Search Console, you may want to verify both the www and the non-www version of your domain in your Search Console account. To Google, these are entirely different sites. Take a look at the data for both sites."
I'd do it for both versions, just to be safe, even if you have proper redirects from one to the other. If you don't have proper redirects - gotta set them up right meow!
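For reference when uploading to the second property: the disavow file itself is just a UTF-8 text file, one entry per line. A minimal sketch (the domains below are placeholders for illustration, not real entries):

```text
# Whole domains where link removal requests failed
domain:spammy-directory.example
domain:paid-links.example
# Individual URLs can be disavowed too
http://blog.example.com/bad-guest-post.html
```

The same file can be uploaded unchanged to both the www and non-www properties.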
Hope this helps.
Related Questions
-
My site is being deindexed for an unknown reason
A few days ago I noticed that my site gusty.se was not showing up in Google, only the subpages. There is no message in Google Search Console. I requested the site to be reindexed, and about a day later it was showing up in Google again. Now another day has passed and the site is once again not indexed. The question is: why is the site being deindexed?

I have worked a bit on getting backlinks to the site, and I recently gained 3 backlinks within a few days (about a week has passed since I gained these links). Still, I can't believe Google would count this as unnatural link building, especially since I would guess it takes some time for Google to detect new incoming links.

Another thing I've noticed, though, is that about two weeks ago my site got a high number of incoming links from different spam sites with .gq TLDs (see the attached screenshot). The majority of these sites have, however, not linked to my main page but to a subpage, which is still indexed by Google. Can all these spam links be the reason Google has deindexed the main page of my site? I've read that Google in general ignores links from spam sites; still, I have taken action against these spam sites by submitting a disavow text file containing all these spam domains. I submitted this file about 2 days ago.

I have now again requested the site to be reindexed, so perhaps it will soon be listed again. Still, I can't keep having my site deindexed and reindexing it every second day. I would really appreciate it if someone could give me some insight into this problem.
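One thing worth double-checking before blaming the spam links: a malformed disavow file can have entries silently rejected. A quick format sanity check (valid lines are comments, blanks, `domain:` entries, or absolute http(s) URLs) can be sketched in Python; the sample entries are hypothetical:

```python
def validate_disavow_lines(lines):
    """Return (valid, invalid) line lists for a disavow file.

    Valid lines: comments (#...), blank lines, `domain:example.com`
    entries, or absolute http(s) URLs. Everything else is flagged.
    """
    valid, invalid = [], []
    for raw in lines:
        line = raw.strip()
        if not line or line.startswith("#"):
            valid.append(raw)
        elif line.startswith("domain:") and "." in line[len("domain:"):]:
            valid.append(raw)
        elif line.startswith(("http://", "https://")):
            valid.append(raw)
        else:
            invalid.append(raw)
    return valid, invalid

sample = [
    "# spam .gq domains",
    "domain:spam-site.gq",
    "http://spam-site.gq/links.html",
    "spam-site.gq",  # missing the domain: prefix, so it gets flagged
]
ok, bad = validate_disavow_lines(sample)
print(len(ok), len(bad))
```

Any flagged line (like a bare hostname without the `domain:` prefix) should be corrected before resubmitting the file.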
Intermediate & Advanced SEO | Grodan2
-
Having problems with multiple ccTLD sites: SERPs showing different sites in different regions
Hi everyone, We have more than 20 websites for different regions, and all the sites have their own specific ccTLD. The thing is, we are having conflicts in the SERPs for our English sites, and almost all the English sites have the same content; I would say 70% of the content is duplicated. Despite having proper hreflang, I see .co.uk results in Google US, and not only .co.uk but also other sites showing up (xyz.in, xyz.ie, xyz.com.au).

The tags I'm using are below. If the site is for the US, I'm using canonical and hreflang tags:

<link rel="canonical" href="https://www.xyz.us/" />
<link rel="alternate" href="https://www.xyz.us/" hreflang="en-us" />

and for the UK sites:

<link rel="canonical" href="https://www.xyz.co.uk/" />
<link rel="alternate" href="https://www.xyz.co.uk/" hreflang="en-gb" />

I know we have ccTLDs, so we don't have to use hreflang, but since we have duplicate content, just to be safe we added hreflang, and from what I have heard/read there is no harm in having hreflang (of course, if implemented properly).

Am I doing something wrong here? Or is it conflicting due to canonicals for the same content in different regions, so we are confusing Google (and Google shows the most authoritative and relevant results)? Really need help with this. Thanks,
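A common hreflang pitfall in setups like this is missing reciprocity: every version must list every other version plus itself, and each canonical must be self-referencing. A hedged sketch using the xyz placeholder domains from the question (the exact set of regional sites is assumed):

```html
<!-- On https://www.xyz.us/ : self-referencing canonical + full hreflang set -->
<link rel="canonical" href="https://www.xyz.us/" />
<link rel="alternate" href="https://www.xyz.us/" hreflang="en-us" />
<link rel="alternate" href="https://www.xyz.co.uk/" hreflang="en-gb" />
<link rel="alternate" href="https://www.xyz.com.au/" hreflang="en-au" />
<link rel="alternate" href="https://www.xyz.com/" hreflang="x-default" />

<!-- On https://www.xyz.co.uk/ : identical alternate set, canonical pointing to itself -->
<link rel="canonical" href="https://www.xyz.co.uk/" />
<link rel="alternate" href="https://www.xyz.us/" hreflang="en-us" />
<link rel="alternate" href="https://www.xyz.co.uk/" hreflang="en-gb" />
<link rel="alternate" href="https://www.xyz.com.au/" hreflang="en-au" />
<link rel="alternate" href="https://www.xyz.com/" hreflang="x-default" />
```

If any version omits the return link to the others, Google may ignore the whole hreflang cluster and choose a version itself, which would produce exactly the cross-region SERP mixing described.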
Intermediate & Advanced SEO | shahryar89
-
Google cache shows a 3rd party's site for the HTTP version and is correct for HTTPS
If I search Google for my cache I get the following:

cache:http://www.saucydates.com -> returns the cache of netball.org (HTTPS page with a Plesk default page)
cache:https://www.saucydates.com -> displays the correct page

Prior to this, my HTTP cache was the Central Bank of Afghanistan. For most searches at present my index page is not returned, and when it is, it's the netball Plesk page. This is, of course, hurting my search traffic considerably.

I have tried many things; here is the current list:

1. If I fetch as Google in Webmaster Tools, the HTTPS fetch and render is correct.
2. If I fetch the HTTP version I get a redirect (which is correct, as I have a 301 HTTP-to-HTTPS redirect).
3. If I turn off HTTPS on my server and remove the redirect, the fetch and render for the HTTP version is correct.
4. The 301 redirect is controlled with the 301 Safe redirect option in Plesk 12.x.
5. The SSL cert is valid and with COMODO.
6. I have ensured the IP address (which is shared with a few other domains that form my site's network / functions) has a default site.
7. I have placed a site on my PTR record and ensured the HTTPS version goes back to HTTP, as it doesn't need SSL.
8. I have checked my site in the Wayback Machine for 1 year and there are no hacked redirects.
9. I have checked the netball site in the Wayback Machine for 1 year; mid last year there is an odd firewall alert page. If you check the cache for the HTTPS version of the netball site, you get another site's default Plesk page. This happened at the same time I implemented SSL.

Points 6 and 7 have been done to stop the server showing a Plesk default page, as I think this could be the issue (duplicate content).

Ideas:

- Is this a 302 redirect hi-jack?
- Is this a Google bug?
- Is this an issue with duplicate content, as both servers can have a default Plesk page (like millions of others)?
- Could a network of 3 sites with Plesk getting mixed up be a clue?

Over to the experts at Moz, can you help? Thanks, David
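The shared-IP / default-page angle can be closed off at the web-server level. A minimal sketch, assuming Apache (which Plesk manages) and using the hostname from the question; the catch-all vhost name is a placeholder:

```apache
# First vhost on the IP acts as the catch-all default: any hostname
# not matched by a later vhost lands here instead of a Plesk page.
<VirtualHost *:80>
    ServerName default.invalid
    Redirect 404 /
</VirtualHost>

<VirtualHost *:80>
    ServerName www.saucydates.com
    # Single 301 from every HTTP URL to its HTTPS equivalent
    Redirect permanent / https://www.saucydates.com/
</VirtualHost>
```

With an explicit default like this, a request for the bare IP (or a stale DNS entry from another site) can never return a duplicate Plesk default page that Google could associate with the wrong domain.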
Intermediate & Advanced SEO | dmcubed
-
Plural versions of keywords
I am a magician optimising my site mainly for local SEO. I imagine this affects any local business, but I would like some input from the community on how to handle variations of the same keyword.

E.g., I am a magician based in Hampshire, so I have created a page for Hampshire and, from previous keyword research, optimised it for "Magician Hampshire". However, there are many more variations of the keyword, e.g. "Magicians Hampshire", "Magicians in Hampshire", "Hampshire Magician", "Magician in Hampshire", "Close up Magicians in Hampshire", "Hampshire Magicians", "Magicians near Hampshire", "Local Hampshire Magicians". The list goes on, but I cannot find an answer on whether I should have a separate page/blog post for each of these keywords or include them all in the same page, and if I include them in the same page, how do I keep it from being spammy?

I am sure this is the same for electricians, dentists, hairdressers etc., so I am hoping there is a common answer to a problem that has been puzzling me for a long time. Many thanks, Roger
Intermediate & Advanced SEO | rnperki
http://www.rogerlapin.co.uk/
-
Duplicate Content www vs. non-www and best practices
I have a customer who had prior help on his website, and I noticed a 301 redirect in his .htaccess. The rule for duplicate content removal (www.domain.com vs domain.com) is:

RewriteCond %{HTTP_HOST} ^MY-CUSTOMER-SITE.com [NC]
RewriteRule (.*) http://www.MY-CUSTOMER-SITE.com/$1 [R=301,L,NC]

The result of this rule is that if I type MY-CUSTOMER-SITE.com in the browser, it redirects to www.MY-CUSTOMER-SITE.com. I wonder if this is causing issues in the SERPs. If I have some inbound links pointing to www.MY-CUSTOMER-SITE.com and some pointing to MY-CUSTOMER-SITE.com, I would think that this rewrite isn't necessary, as it would seem that Googlebot is smart enough to know that these aren't two sites.

- Can you comment on whether this is a best practice for all domains?
- I've run a report for backlinks. If my thought is true that there are some pointing to www.MY-CUSTOMER-SITE.com and some to MY-CUSTOMER-SITE.com, is there any value in addressing this?

Intermediate & Advanced SEO | EnvoyWeb
-
Site structure from an SEO standpoint
I am fortunate enough to be working with a client who is still building their website. From a site-structure standpoint, what should I look out for, with my SEO hat on, as they build their wireframes and storyboard their site? I want to make sure I don't miss any components that might be helpful short and long term.
Intermediate & Advanced SEO | StreetwiseReports
-
Migrating from a standalone site to a subdivision of a large .gov.uk site
The scenario: We've been asked by a client, a non-government organisation being absorbed by a larger government ministry, for help with the SEO of their site. They will be going from a reasonably large standalone site to a small sub-directory on a high-authority government site, and they want some input on how best to maintain their rankings. They will be going from the number 1 ranked site in their niche (current site domainRank 59) to being a sub-directory on a domainRank 100 site. The current site will remain, but as a members-only resource behind a paywall.

I've been checking to see the impact that this had on a related site, but that one has put a catch-all 302 redirect on its pages, so it is losing the benefit of its historical authority.

My thoughts:

1. Robust 301 redirect set-up to pass as much benefit as possible to the new pages.
2. Focus on rewriting content to promote the most effective keywords. I would suggest testing of titles, meta descriptions etc., but I'm not sure how often they will be able to edit the new site.
3. 'We have moved' messaging going out to webmasters of existing linking sites to try to encourage as much revision of linking as possible.
4. Development of link-bait to try to get the new pages seen.

Am I going about this the right way? Thanks in advance. Phil
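Point 1 above, the 301 set-up on the old domain, can be sketched in .htaccess; the domain and paths here are placeholders, since the real sites aren't named:

```apache
RewriteEngine On
# One-to-one 301s for the most-linked old pages first, so each maps
# to its closest equivalent rather than a generic landing page
RewriteRule ^about/?$ https://www.ministry.gov.uk/ngo-section/about [R=301,L]
RewriteRule ^research/?$ https://www.ministry.gov.uk/ngo-section/research [R=301,L]
# Catch-all: every remaining old URL lands inside the new sub-directory
RewriteRule ^(.*)$ https://www.ministry.gov.uk/ngo-section/$1 [R=301,L]
```

Using permanent 301s rather than the catch-all 302 the related site used is what lets the historical authority transfer; the per-page rules before the catch-all preserve topical relevance for the URLs that earned the links.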
Intermediate & Advanced SEO | smrs-digital
-
Site structure question
Hello everyone, I have a question regarding site structure that I would like to mastermind with everyone.

I am optimizing a website for a Ford dealership in Boston, MA. The way the site architecture is set up is as follows:

Home >>> New Inventory >>> Inventory Page (with search refinement choices)

After you refine your search (let's say we choose a Ford F150 in white), it shows a page with images, price information and specs (nothing the bots or users can sink their teeth into).

My thought is to create category pages for each Ford model with awesome written content and THEN link to the inventory pages. So it would look like this:

Home >>> New Inventory >>> Ford F150 Awesome Category Page >>> Ford F150 Inventory Page

I would work hard at getting these category pages to rank for the vehicle in our geo-targeted locations. Here is my question: would you be annoyed to first land on a category page with lots of written text, reviews, images and videos, and then link off to the inventory page? Or would you prefer to go right from the New Inventory page to the actual inventory page and start looking for vehicles?

Thank you so much, Bill
Intermediate & Advanced SEO | wparlaman