Should I be running my crawl on our www address or our non-www address?
-
I currently run our crawl on oursitename.com, but am wondering if it should be run on www.oursitename.com instead.
-
It does make sense. Based on what you wrote, ours is not set up correctly. But now I have the language to better work with our provider.
You are very helpful!! Thank you!
-
There's really no way to avoid the existence of the two different addresses, THMCC - they're a byproduct of the way web servers and domain names work. Both exist as soon as a website is created, and visitors tend to use them interchangeably.
They aren't "actually" two different sites - they're just two different addresses that refer to the same pages, which causes the search engines to see them as duplicate content. Kind of like how your house's location can be described by the lot number on the city plan, or by the postal address. Same house - different ways of showing where it's located.
If the 301 redirect is done correctly, the search engines will understand that everything should be considered to be at the one primary address, and they'll pass along the other version's authority to the primary version automatically - so there's no second version left to compete with.
You can easily tell if the redirect is working properly. Let's assume, for example, that you decide the www.thmedicalcallcenter.com version is the primary and redirect the non-www version to it. When you type the non-www address into the browser's address bar and hit enter, you should actually see the URL in the address bar change to the www version of the address.
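If you'd rather check it from the command line, here's a quick sketch (this assumes you have curl available; the domain is just the example from above):

    curl -sI http://thmedicalcallcenter.com/
    # A correctly configured redirect should answer with something like:
    #   HTTP/1.1 301 Moved Permanently
    #   Location: http://www.thmedicalcallcenter.com/

The -I flag requests the headers only, so you can see the 301 status code and the Location header that tells browsers (and search engines) where the primary version lives.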
And yes, you absolutely must have both addresses taken care of. You have no way of controlling whether someone types the address with or without the "www," and you want either one forwarded on to the primary address.
Does that all make sense?
Paul
-
Thanks so much, Paul!!
It was the SEOmoz helpdesk that told me I needed to have oursitename.com in addition to www.oursitename.com. Is this true? Creating oursitename.com and the redirect seems to have caused a lot of errors - mostly duplicates, probably from the redirect not being done correctly. But overall, was/is it in our best interest to have both www.oursitename.com and oursitename.com?
-
Thanks again!! One more question if you don't mind. If the 301 redirect is done correctly, will this fix the competition issue in all the search engines? If not, should I get rid of oursitename.com?
-
As far as the search engines are concerned, thmedicalcallcenter.com and www.thmedicalcallcenter.com are two separate sites that will compete against each other and dilute each other's authority and ranking unless one is 301-redirected to the other.
As SEO5 indicates, it's best to assess which version already has the most incoming links and use that version as the primary, redirecting the other one to it.
This is best done using a 301 redirect written into your site's .htaccess file. In addition, there is a spot in Google Webmaster Tools (GWT, sometimes called WMT) where you can also hint to Google which version of the site you want to be the primary.
It's not enough to only use the hint in GWT, though - that hint applies only to Google and will do nothing to correct the problem in the other search engines.
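For reference, a 301 redirect in .htaccess typically looks something like this - a minimal sketch assuming an Apache server with mod_rewrite enabled, with your own domain swapped in:

    RewriteEngine On
    # Match requests that arrive at the bare (non-www) hostname...
    RewriteCond %{HTTP_HOST} ^thmedicalcallcenter\.com$ [NC]
    # ...and permanently (301) redirect them to the www version, preserving the path
    RewriteRule ^(.*)$ http://www.thmedicalcallcenter.com/$1 [R=301,L]

If you choose the non-www version as the primary instead, the same pattern works with the two hostnames swapped.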
Paul
-
I thought they were the same... with a 301 redirect from oursitename.com to www.oursitename.com, so I am confused as to why one would have more inbound links than the other. Also, what is WMT?
Thanks!!!
-
Pick the one that has more inbound links associated with it and already has pages ranking well in the index. Make sure you set the preference in WMT to either the www or the non-www version.