Www versus non-www: how to check it... for sure. No, really, for absolutely sure!!
-
Ok, I know it has been asked, answered, and re-asked but I am going to ask for a specific reason. As you know, anyone who is a graphic designer or web developer is also an expert in SEO....Right???
I am dealing with a client who is clinging to a developer but wants us to do the SEO on a myriad of sites, all of which connect to his main site via links, etc. The main site was just redeveloped by a developer who claims extensive SEO knowledge. The client who referred me to them gets over twenty times the organic clients this one does, and is in a JV with the new client. Soooo, I want to show them once and for all that they are wrong on the www versus non-www question.
When I do a site:NewClient.com search in Google, I get a total of 13 www.newclient.com URLs and 20 newclient.com URLs without the www. Oddly, none are dupes of the other. So, where www.NewClient/toy-boat/ is there, the other might be the non-www NewClient/toy-boat/sailing-green/
Even the contact page is at www.NewClient/contact versus the non-www NewClient/Contact-us/
But both pages seem to resolve to the non-www. (A note here: I originally instructed the designer to redirect non-www to www, because the page authority was on www.NewClient, and he did the opposite.)
With pages that are actually PDF files, if you try to use www.NewClient/CoolGuy.pdf it comes up 404.
When I check our sites using site:We-Build-Better.com, ours return all www.We-Build-Better/ URLs.
So, any other advice on how to verify whether these are correct or incorrect? Oddly, we have discovered that sometimes in OSE, even with a correct canonical redirect, it shows one without authority and the other with... we have contacted support.
Come on mozzers, hook a brother up!
-
Hi again Robert,
The God of All Things Code is away from the office for a while today, so we will need to wait a little longer for his input.
A couple of things that happened since my last post though:
Those twitching antennae just wouldn't stop nudging me to look a little further, as everything I see with this site is saying "template" to me. Add to that the URL rewrites which hide the actual URLs, and the broken PDF files... so I went digging a little further and... Aha!
Not a template, but a "Theme". The entire site is built in WordPress!
Now, I am pretty sure that the broken PDFs are the result of the WordPress URL rewrites changing the directory name, in combination with the hard-coded links. If this is the case, then it ought to be just a matter of adding a rule to the .htaccess file to deal specifically with the PDFs. The order in which the rules appear will determine whether the issue is resolved or not.
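If that turns out to be the cause, the fix could look something like this — a hedged sketch only, since I haven't seen the actual .htaccess. The rule simply lets requests for PDF files that really exist on disk bypass the WordPress rewrites, and it must sit above the WordPress block:

```apache
RewriteEngine On

# Sketch only: if the requested path is a real file ending in .pdf,
# serve it directly instead of handing it to the WordPress rewrite rules.
RewriteCond %{REQUEST_FILENAME} -f
RewriteRule \.pdf$ - [L,NC]

# ... the standard WordPress rewrite block follows here ...
```

Whether this works depends entirely on where the real PDFs ended up, so treat it as a starting point, not a definitive fix.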
I'll let you know as soon as I've confirmed the specifics with my Boss.
Hope that helps,
Sha
-
Well done, good point on pref setting in WMT. Thanks,
-
OK Ryan, you don't sleep and that was funny ;).
-
OK Robert,
First, I'm going to tip my hat to Ryan, who has perfectly explained that some of what you see in your site: search may be because the 301s have not yet been recognized by the search engine.
Second, an apology to Alan, as I went right to the LAMP solution because of prior knowledge from a previous thread or two that you were going to be talking about .htaccess.
Now...I will spell out a couple of things because I have a feeling that you are likely to come across them again in the future and quick recognition can often mean a lot of time saved.
So here goes.
When I first read your question, my little web developer antennae suddenly started twitching! When I hear that there are multiple versions of a file with different file names deployed on a server I generally suspect one of two things:
- The site has been developed from a standard Template package, or
- There has just been a little "untidiness" taking place in the development process.
In your example, the /contact.php was the original file deployed live to the server, then the /contact-us.php file was created to replace it (presumably for SEO purposes - debatable, but that is a whole other conversation). As I'm sure you can imagine, /contact is pretty common in template packages, although the biggest template producer out there is much easier to spot, as the pages in their templates are always in the format /index-1.htm etc. It may just be that the developer creates their own standard template from an original design and rather than pre-planning and creating the file names to maximize SEO, they create standard page names and change them later.
While there is nothing really wrong with either of these things (unless you are charging the client for an original design and buying a pre-designed template at a fraction of the cost), both methods do open up the way for mistakes and errors to occur. As a result, there are a few things to keep in mind if you are working this way -
- It is a much better idea to build on a development server so that none of the files that will become obsolete during the process will be indexed by search engines in the meantime. Tidy architecture, remove the obsolete files, test, then push to production.
- When changing file names it is ALWAYS better to re-name the existing file and do a global update of links rather than create a duplicate with a different name. As soon as you create two files, you open up the possibility of accidentally linking both files within the site. You could have /contact.php linked from the home page and contact-us.php linked from the footer for example. There is a danger here that should you decide to delete the unwanted file, you create broken links without knowing it, or you have duplicate content. Either way, you have to recognize the problem and either fix it, or put a 301 in place to catch it.
- NEVER hard code your links, because as soon as you change the name of the directory you placed your files in, you create a broken link! If you use relative links, the change of directory name will not matter.
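To make the hard-coded-link danger concrete, here is one way to hunt down hard-coded absolute links — a sketch only, with made-up demo files and the newclient.com domain standing in for the real one:

```shell
# Create two demo files standing in for a real site tree (placeholders).
mkdir -p demo
printf '<a href="http://www.newclient.com/pdfs/CoolGuy.pdf">PDF</a>\n' > demo/hardcoded.html
printf '<a href="/pdfs/CoolGuy.pdf">PDF</a>\n' > demo/relative.html

# List every file that hard-codes the full domain; these are the links
# that break the moment a host or directory name changes.
grep -rl 'http://www\.newclient\.com' demo
# prints: demo/hardcoded.html
```

Run against a real document root, the same grep gives you a quick inventory of exactly which files would need a global link update.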
I can see from Screaming Frog that some of the URLs for the PDF files have 301s in place, but it appears that the redirect URL may also be hard-coded to the /pdfs directory. The fact that they all return a 404 when the directory name is changed to match that section makes it pure guesswork as to what is happening here. It seems both www and non-www PDFs are returning 404s in the browser.
The picture is muddied a little by the fact that there appear to be internal URL rewrites in the mix as well (to produce those pretty URLs with trailing slashes). So, there are a few options as to why the PDFs are not accessible:
- They are not actually on the server at all (unlikely)
- The names of the PDFs themselves have been changed, so even if the URL rewrite is sending the request to the new directory, the file requested does not exist.
- The /pdfs directory has been named something completely different and the hard-coding is the problem
- The /pdfs directory has been moved to another location within the site architecture
I tried guessing a couple dozen of the obvious options, but no luck, I'm afraid.
There is one other possibility, in that the internal URL rewrites and 301 redirects could be creating a problem for each other. I am not clever enough to identify whether this is the case without a hint from the code, but will ask the God of All Things Code (my Boss) if he can answer that for me when daytime arrives 8D
OK....this is now so long that I really need to read the whole thread back to see if I have forgotten anything! If I find something I have missed, or can find anything else when help arrives, I'll be back!
Hope it makes some sort of sense and ultimately helps,
Sha
-
This info is really not browser dependent, just displayed differently.
But as I stated elsewhere, if you PM me the URL I can give you a site-wide report that will show you any canonical problems, or any problems for that matter.
-
Thanks for this Alan. I use Linux/Apache, but having the IE info is a big help. I usually have Chrome or Firefox up, but some real estate sites here only work in IE.
-
I want a sure way to know this ...person....did what they are telling their client they did.
Perhaps someone has more creativity than I do, but I do not know any means by which you can be 100% certain a sitewide 301 is implemented without seeing the file on the server. The "file" varies based on the server type. As you know, for Apache servers the .htaccess file is the right one.
Even if you saw the .htaccess file, it is possible for another file to overwrite the command. The way I always have verified is by looking at the site itself. Check the home page and a few other pages. If they are all 301'd properly, then I presume the developer performed their job correctly. It would actually be a lot more work for the developer to attempt to fool you by 301'ing part of the site but not all.
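For checking from the outside, the response headers tell the story. Here is a small sketch that classifies a redirect from raw headers; the newclient.com domain is a placeholder, and in practice you would feed the function live output, e.g. `curl -sI http://newclient.com/ | check_redirect`:

```shell
# Classify a redirect from raw HTTP response headers on stdin.
check_redirect() {
  # Pull the status code from the first line, e.g. "HTTP/1.1 301 Moved Permanently"
  status=$(head -n1 | tr -d '\r' | awk '{print $2}')
  case "$status" in
    301) echo "301 permanent redirect" ;;
    302) echo "302 temporary redirect" ;;
    *)   echo "no redirect (status $status)" ;;
  esac
}

# Canned example of the headers a correctly 301'd non-www home page returns:
printf 'HTTP/1.1 301 Moved Permanently\r\nLocation: http://www.newclient.com/\r\n' | check_redirect
# prints: 301 permanent redirect
```

A 301 passes the authority along; a 302 is the "useless" temporary kind mentioned elsewhere in this thread, so the distinction is exactly what you want to spot-check on the home page and a handful of deep pages.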
I also suggest ensuring your site's www or non-www standard appears correctly in your crawl report.
Is my assumption correct that if a 301 was done in .htaccess, there should be no www showing in a Google site: search?
That is not necessarily true. If you have a site which shows mixed URL results, then over time the results from a site: search will be standardized, but it will take time, as Google needs to crawl each and every page of the site and see the 301. Also, if any page is blocked by robots.txt, for example, then Google may not see the 301 for that page and will still list the old URL.
If you changed the Google WMT preferred domain setting, then it is true you will only see one version of the URL. I would specifically advise you NOT to change that setting in this case as it may cover up the developer's issue which you are trying to locate. As for now, you can wait 30 days and perform a site: search. Investigate any bad URLs you find.
-
If you want, Robert, PM me the URL and I will give you a site-wide check
-
I shot you a PM. Just don't want the other guy's info out. If it were my site and I had full control, I would tell all. Sha got one too. Thanks
-
We are Linux on all, though. So the .htaccess file is the bomb with a 301, and we follow up by setting the preference in Google Webmaster Tools.
-
Private message. OK, should have been obvious
-
Well, if Robert private-messages Sha, then you would be missing that message
-
Sha, what does PM stand for? Am I missing something?
-
Just a point: you don't need to do a 301 in the .htaccess file.
I work with Microsoft technologies, and we don't use them; .htaccess is a Linux/Apache thing.
-
Unfortunately, the other developer controls all. We develop sets of sites that are essentially micro sites advertising particular facets of our clients' professional practice. With our sites, when we have the main site and the micro sites, we make the 301 change in the .htaccess and then set the preference with Google per Webmaster Tools. We look first to see where the page authority lies and redirect from weak to strong, if just for www/non-www. With a new TLD, obviously, it is from old to new.
I want a sure way to know this ...person... did what they are telling their client they did. It does not appear so. With ours, when we do a site:OurSite search, we get what we assumed on every page of Google search. With this one, it is four pages with the 13 www and 20 non-www. Some www URLs resolve to the non-www and some do not. When I look in OSE, I see mention of a redirect from www to non-www, and the non-www URLs all have a PA of 1 and a DA of 15. For the www home page, PA is 25.
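A quick way to quantify that www/non-www split from any exported list of indexed URLs — a sketch with placeholder URLs standing in for the real export:

```shell
# Tally www vs non-www hosts in a list of URLs (one per line on stdin).
count_hosts() {
  # With "/" as the field separator, $3 is the host part of the URL.
  awk -F/ '{ if ($3 ~ /^www\./) print "www"; else print "non-www" }' \
    | sort | uniq -c | awk '{ print $2, $1 }'
}

# Sample list standing in for a real site: export (placeholder URLs).
printf '%s\n' \
  'http://www.newclient.com/toy-boat/' \
  'http://newclient.com/toy-boat/sailing-green/' \
  'http://newclient.com/contact-us/' | count_hosts
# prints:
# non-www 2
# www 1
```

Re-running the tally periodically gives you a simple trend line: once the 301s take hold, one of the two counts should drift to zero.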
Is my assumption correct that if a 301 was done in .htaccess, there should be no www showing in a Google site: search?
Thanks
-
No .htaccess file access... it is hard even to get to site pages to place links to the microsites.
I will PM the URL. Thanks Sha
You are correct: I want to know whether this other developer really did a 301 in the .htaccess file that will allow all weight to inure to one URL or the other.
-
Hi Robert.
Once you determine which version of a URL you would like to represent your site, the best method to enforce that decision is a 301 redirect. For example, direct all non-www traffic to the www version of the URL, the same way SEOmoz URLs appear. With this approach, 100% of your URLs will appear as the "www" version in SERPs and there will never be any confusion or conflict.
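On an Apache server, that redirect lives in the .htaccess file. A minimal sketch, with the newclient.com domain as a placeholder; the single-file rule at the end shows the same idea applied to a renamed page:

```apache
RewriteEngine On

# 301 every non-www request to the www host, keeping the path and query string.
RewriteCond %{HTTP_HOST} ^newclient\.com$ [NC]
RewriteRule ^(.*)$ http://www.newclient.com/$1 [R=301,L]

# Same principle for a single renamed page, e.g. an old contact URL.
Redirect 301 /contact.php /contact-us.php
```

To canonicalize in the other direction (www to non-www), you would simply swap the host in the condition and the target; which way you go should follow wherever the page authority already lives.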
I've heard people talk about using canonicals or setting the preferred domain in WMT. Neither step is necessary as long as the 301 is in place. The reason I still do both is I like to account for failures in a process. You never know when someone will make an error and modify an .htaccess file incorrectly and wipe out your redirect.
If you have the redirect in place, OSE and similar tools should clearly see the redirect and act appropriately every time. If the tool does not work correctly, I would examine the HTTP response headers of the page to ensure the 301 is working properly. If it is, then I would do the same as you did and report the bug.
If you do not take the proper steps to enforce a "www" or "non-www" structure, you will see the results which you described. Some users will visit and link to each version of the page which will lead to both versions of URLs being indexed. Google will index a version based on which was discovered first or which version it deems more important based on links and other factors. When you perform searches for a site, some URLs will appear with the "www" and some without it. The backlinks will be divided and, as you know, that is bad for SEO. The duplicate content issue will set off alarms for the SEOmoz crawler and similar tools, but Google will still index one version of the page.
I am not sure if this completely answers your question Robert. If I missed anything, feel free to ask.
-
The SEO Toolkit sees the same problems Bing sees. You need Windows, and you need to install IIS (add features) first.
http://www.iis.net/download/SEOToolkit
-
What is the SEO Toolkit that runs on Windows?
Best,
Christopher
-
Hi Robert,
OK, just to clarify...
- You want to check for sure that newclient.com is 301 redirected to www.newclient.com?
- You want to check for sure that ALL URLs which have been individually 301'd are redirecting to www.newclient.com/filename?
- You want to understand why the non-www version of the PDF files works and the other doesn't?
Right off the top, the definitive way to check whether there is a properly functioning redirect in place is to type the URL into a browser and see whether it resolves to the redirect target :). You can also run Screaming Frog and see what status the pages return, but be aware that this does not always reflect the real situation in the browser (pages can return a status that does not match what you see).
On the other questions, I think perhaps what you really want is to first determine what is happening and then, WHY?
So, first things first:
- Do you have access to the .htaccess file?
- Can you provide the URL (and .htaccess if you have it)? You can PM this info if you don't want to share it publicly.
Sha
-
Not quite sure I understand what you want to check, but as long as one 301s to the other, it does not really matter. It may take some time for the search engines to catch up.
Are you saying you want to check if it is resolving correctly?
In IE, press F12, then select Network and start capturing; you will see if it's using a 301 or a useless 302. If you want to prove to your client that the developer is not on the ball, do a scan with the SEO Toolkit and show the results. If you don't have Windows to install it on, I will do one for you.