Links to www vs non-www
-
I was having speed issues when I ran my site through the Google PageSpeed test and, as a result, switched to Google PageSpeed Service. This meant I had to switch my site from the non-www to the www version. Since the switch my pages load faster, but my ranking has dropped.
What I'm trying to find out is whether the drop is due to all of my previous links pointing to the non-www version, or whether the site is being treated as new and this is more of a temporary issue. If it is a link issue, I will contact everyone I can to see who will update the site address.
Thanks everyone!
-
Andy, JCurrier and Derek
Thanks so much for all of your help. Much appreciated!
-
It doesn't seem like anyone has mentioned it, but just in case: make sure your rel="canonical" tags are published properly on every page. Here's a link for more if you need it:
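For reference, a canonical tag is a single line in the head of each page. This is just a sketch; example.com is a placeholder for your real domain, and the href should point at the preferred www URL of that specific page:

```html
<!-- In the <head> of every page: declare the preferred (www) URL
     so Google consolidates signals onto one version. -->
<link rel="canonical" href="https://www.example.com/current-page/" />
```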
-
If everything is in place Megan, then you should have no problems. Google will be able to work the links out without penalty. As JCurrier said, expect a little movement when you make changes, but it doesn't sound like anything much and should right itself. If it doesn't, then there may be something else going on as well.
-
Hi Megan,
As long as you have made all of the appropriate changes, you should be fine. The drop is probably due to the change, but you should see the rankings recover. I think you can expect some shuffling to occur when changes like these are made.
-
Hey Andy!
Thanks for the response. I do have a redirect in place and have made all the appropriate changes in Webmaster Tools. Google recognizes the site on www, but I've seen a drop (5 spots) since the change.
I'm wondering whether Google treats www and non-www links as the same or different, or perhaps the drop is just a temporary one due to the "new" domain.
Thanks in advance for any insight you can provide.
-
Hi Megan,
You shouldn't really have two live versions of your site. Apart from anything else, this causes duplication issues. What I would do is correct this by having someone create a redirect in the .htaccess file so that when anyone does hit the non-www version, they will automatically be forwarded to the www version.
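A minimal sketch of the redirect described above, assuming an Apache server with mod_rewrite enabled; example.com is a placeholder for the real domain:

```apache
# .htaccess: send all non-www requests to the www version.
# 301 (permanent) tells search engines to transfer link equity
# to the new URLs rather than treating them as duplicates.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

On non-Apache servers (nginx, IIS) the same 301 rule is configured differently, but the principle is identical: one live hostname, with the other permanently redirecting to it.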
That being said, you could always look at page caching to speed the load times up. Perhaps even look to have large images fed from an alternative source such as Flickr?
Andy