Switching to www from the non-www preference negatively hit the number of pages indexed
-
I have a client whose site did not use the www preference but rather the non-www form of the URL. We were having trouble seeing some high-quality inbound links, and I wondered whether the redirect to the non-www site from those links was making them hard for us to track.
After some reading, it seemed we should be using the www version for better SEO anyway, so I made the change on Monday, but by Thursday the number of pages being indexed had taken a major hit. It's freaking me out mildly.
What are people's thoughts? I think I should roll back the www change ASAP - or am I jumping the gun?
-
I agree 100% with Dan
You should essentially use all three big link tools; you can most likely find out, using just two of them, which version the majority of the links point to.
Here is a great reason why you should care:
http://blog.hubspot.com/blog/tabid/6307/bid/7430/What-is-a-301-Redirect-and-Why-Should-You-Care.aspx
http://www.opensiteexplorer.org/
Use it together with one or both of the others (if it were my site, I would want to see all the links pointing to it and how powerful they are, so I would purchase one month of service from each, or from just one of the other two in addition to Moz Open Site Explorer, simply because none of them has the entire link index).
If they point to the www version of your domain, then 301 redirect, and remember to add both the www (www.example.com) and non-www (http://example.com) versions.
Using a 301 redirect is discussed thoroughly in this link:
http://moz.com/learn/seo/redirection
and this great guide.
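The redirect itself normally lives at the web server or CDN level, but purely to illustrate the idea, here is a minimal sketch of forcing every request onto the www hostname with a 301 from inside a Python/Flask app. Flask and the example.com hostname are my assumptions for the sketch, not anything taken from this thread.

```python
# Minimal sketch (not the original poster's setup): force every request
# onto the preferred www hostname with a 301 redirect.
# Assumes Flask is installed; the hostname is a placeholder.
from flask import Flask, redirect, request

CANONICAL_HOST = "www.example.com"  # placeholder preferred hostname

app = Flask(__name__)

@app.before_request
def enforce_www():
    # Strip any port so "example.com:80" still matches the bare hostname.
    host = request.host.split(":")[0]
    if host != CANONICAL_HOST:
        # full_path keeps the query string; rstrip("?") drops the dangling "?"
        # that Flask appends when there is no query string.
        target = f"{request.scheme}://{CANONICAL_HOST}{request.full_path}".rstrip("?")
        return redirect(target, code=301)

@app.route("/")
def home():
    return "Served from the preferred (www) hostname"
```

The equivalent rule in your server config does the same job; the key detail is that the status code is a permanent 301, not a temporary 302.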
Then tell Google your choice in Webmaster Tools.
When you have found out which one has the most powerful, relevant links pointing to it, add both the www and non-www versions to Google Webmaster Tools; you can then select which one Google will index.
https://support.google.com/webmasters/answer/44231
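If you prefer to script the "add both versions" step, something like the sketch below may work. It assumes the google-api-python-client package, the Webmasters API v3, OAuth credentials you have already set up, and a placeholder domain; as far as I know, the preferred-domain choice itself still has to be made in the Webmaster Tools interface.

```python
# Hedged sketch: register both hostname variants as Webmaster Tools
# properties via the Webmasters API v3. Assumes google-api-python-client
# is installed and `creds` holds valid OAuth2 credentials with the
# webmasters scope; the domain is a placeholder.
from googleapiclient.discovery import build

def add_both_variants(creds, domain="example.com"):
    service = build("webmasters", "v3", credentials=creds)
    for site_url in (f"http://www.{domain}/", f"http://{domain}/"):
        # sites().add registers the property; verification still has to be
        # completed separately (e.g. DNS record or HTML file upload).
        service.sites().add(siteUrl=site_url).execute()
        print(f"Requested property: {site_url}")
```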
If it's too close to call, use marketing.grader.com to find out which one has more likes, tweets, and especially +1s from Google. Because 301 redirects do not pass on social shares, you can use this as a tiebreaker.
Sincerely
Thomas
-
Hi Brigitte
To echo some of the other answers here, simply having www vs. non-www does not directly affect rankings at all. What matters is choosing one and keeping it consistent. This would mean being consistent across:
- Internal links
- Always redirect from the non-preferred to the preferred
- Don't switch if you don't have to
- Try to get backlinks pointing at the preferred version
By the way, you need to register a separate Google Webmaster Tools account for the non-www version (it is treated as a different website in terms of some of the data).
I would choose the version with the most backlinks pointing at it, honestly, and then keep it that way forever.
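To make the "consistent internal links" point easy to audit, a rough standard-library-only script like this can flag links on a page that still use the non-preferred hostname. This is just my sketch with placeholder URLs, not something from Dan's answer.

```python
# Rough sketch (placeholder URLs): list internal links on a page that still
# point at the non-preferred hostname, using only the standard library.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

PAGE = "http://www.example.com/"   # page to audit (placeholder)
OTHER_HOST = "example.com"         # non-preferred hostname (placeholder)

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative hrefs against the page URL.
                    self.links.append(urljoin(PAGE, value))

parser = LinkCollector()
parser.feed(urlopen(PAGE).read().decode("utf-8", errors="replace"))

for link in parser.links:
    if urlparse(link).netloc.split(":")[0] == OTHER_HOST:
        print("Points at the non-preferred hostname:", link)
```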
-Dan
-
First off, if you are doing this just because you assume you will get more links (since people type www by default into a lot of things), I really would not change it for that reason. The only reason I would change it is if you are going to introduce some sort of software, like Google Page Speed, which needs a subdomain. Regardless, first make sure that you have actually set up a 301 redirect. Use this tool and put in your URL:
http://www.internetofficer.com/seo-tool/redirect-check/
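If that tool is unavailable, the same check is easy to script. The sketch below assumes the Python requests package and uses placeholder URLs; what matters is seeing a 301 (not a 302) with a Location header pointing at the version you chose.

```python
# Hedged sketch: confirm the non-preferred hostname answers with a true 301
# pointing at the preferred version. Assumes the `requests` package is
# installed; the URLs are placeholders.
import requests

def check_redirect(url):
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "(no Location header)")
    print(f"{url} -> {response.status_code} -> {location}")
    return response.status_code, location

check_redirect("http://example.com/")      # expect 301 -> http://www.example.com/
check_redirect("http://www.example.com/")  # expect 200, no redirect
```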
I would return the site to how it was, unless you have good reason to believe that you will actually acquire more links this way, or you already have more www links pointing at your site.
I do not believe that it is the end of the world by any means, but if you are having problems attracting links, I do not think you are going to solve anything by adding the www.
You need to work on various white hat methods of gaining links, not on changing around your website architecture.
If you decide that you do want to add the www, then by all means let Google know that you're making a change by telling them you are changing domains inside Google Webmaster Tools.
I know you are not changing the domain; however, you want to treat it just like you are. That way, Google will come back and index your site quite frequently, a lot more than it would otherwise.
When you change your link structure, treat it like a domain change:
http://moz.com/blog/domain-migration-lessons
http://moz.com/blog/seo-guide-how-to-properly-move-domains
https://www.distilled.net/blog/seo/changes-of-domain/
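In the spirit of those migration guides, it also helps to walk a list of your important old URLs and confirm each one 301s straight to its counterpart on the new hostname. This is only an illustrative sketch of mine, assuming the requests package and placeholder URLs.

```python
# Hedged sketch: for each old non-www URL, confirm a single 301 hop lands on
# the matching www URL. Assumes the `requests` package; URLs are placeholders.
import requests

OLD_URLS = [
    "http://example.com/",
    "http://example.com/about/",
    "http://example.com/blog/some-post/",
]

for old_url in OLD_URLS:
    expected = old_url.replace("http://example.com", "http://www.example.com", 1)
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    ok = status == 301 and location == expected
    print(f"{'OK  ' if ok else 'FAIL'} {old_url} -> {status} {location}")
```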
It is going to take over 10% of your link juice away from anything pointing to the non-www version, but if you have a lot of powerful links going to www, it might be worth it.
But I still think you are looking in the wrong place for links.
Make sure your site is being indexed, whether you change it or not.
https://www.distilled.net/blog/seo/indexation-problems-diagnosis-using-google-webmaster-tools/
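Alongside the Webmaster Tools checks in that post, you can also spot-check that each page's rel="canonical" tag points at the hostname you picked. The sketch below is my own illustration, assuming the requests and beautifulsoup4 packages and placeholder URLs.

```python
# Hedged sketch: verify each page's rel="canonical" points at the preferred
# www hostname. Assumes `requests` and `beautifulsoup4` are installed;
# the URLs are placeholders.
import requests
from bs4 import BeautifulSoup

PAGES = [
    "http://www.example.com/",
    "http://www.example.com/about/",
]

for page in PAGES:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag.get("href") if tag else "(no canonical tag)"
    status = "OK  " if canonical and canonical.startswith("http://www.example.com") else "WARN"
    print(f"{status} {page} -> {canonical}")
```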
Try press releases or other white hat methods.
all the best,
Thomas
-
There shouldn't be any problem with incoming links because of that.
As William said, though, you will see some changes, but you will recover. Sometimes it takes a long time to get Google to fully index the correct URLs, so don't jump the gun. Decide on one and stick with it.
-
You are basically 301 redirecting an entire site to a new URL (the "www" subdomain). So treat this like any other 301: you will dip, but it should recover for the most part.
In the future, I wouldn't recommend changing the www status after a site is established, even if the preference changes.