Don't understand this ... :-(
-
Hello,
I'm going nuts because I don't understand what's going on with a client's domain.
We have the classic .htaccess 301 redirect
from http://domain.com to http://www.domain.com.
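Roughly this rule, for context (a sketch of the usual mod_rewrite setup with the anonymized domain — the real file may differ slightly):

```apache
# Sketch: redirect every non-www request to the www host with a permanent 301
# (assumes Apache with mod_rewrite enabled)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```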
But I'm getting Page Authority for both versions, and the non-www, which shouldn't be crawled, is getting the higher PA:
http://www.myanmar-rundreisen.de - PA 34
http://myanmar-rundreisen.de - PA 36
I've attached a file; you can see there that Googlebot recognizes the 301 redirect from non-www to www.
But the site isn't doing well at all in Google; it seems the home page has a penalty. Duplicate content due to the non-www and www home pages?
So it would be great if somebody had a hint for me ... my client is losing trust in me.
Thx!
-
Thanks!
-
Matt Cutts talked about this a few years back ... let me find it.
Basically, where your server is located (country-specific targeting aside) doesn't matter to Google.
Google understands that people share servers, and it's not that important in the scheme of things. What does matter is server uptime.
-
Thanks for your support! I think the latest tool reports show a little improvement.
But one more piece of information, or possible problem: another of the client's sites is hosted on the same server, in another directory, and it has had a very good Google standing for 6 or 7 years.
The HTML structure is similar, and it runs on the same CMS with similar CSS.
Could this be a problem for Google? Should the site be moved to another provider?
Once again thx
Guenter
-
Yes, agreed. I guess it's a waiting game now to see how effective the change has been.
But in my experience, rel=canonical has always solved duplicate-content problems.
Thanks Darin
-
Yes, both can get indexed, especially if preferences and 301s weren't in place the last time Google crawled. I've noticed it takes time for Google to use the canonical on a page; I've seen it take 4 or 5 crawls to take effect correctly. But don't forget it's just a suggestion and not a directive. I think Google wants to make sure it's in the best interest of the site before it adheres to it (just a guess).
Don't forget, too, that Google will only crawl a portion of a site at a time (especially for bigger sites) to make sure it doesn't take up too much bandwidth on your server. The home page may not have been crawled since the element was put in.
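If you want to double-check which canonical the page is actually serving in its HTML (rather than what you think was published), here's a quick sketch using only Python's standard library — the helper names are just illustrative:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        # Self-closing <link ... /> tags also route through handle_starttag.
        if tag == "link":
            attr_map = dict(attrs)
            if (attr_map.get("rel") or "").lower() == "canonical":
                self.canonicals.append(attr_map.get("href"))

def find_canonicals(html_text):
    """Return the list of canonical hrefs found in the given HTML."""
    finder = CanonicalFinder()
    finder.feed(html_text)
    return finder.canonicals

page = ('<html><head>'
        '<link rel="canonical" href="http://www.myanmar-rundreisen.de/" />'
        '</head><body></body></html>')
print(find_canonicals(page))  # -> ['http://www.myanmar-rundreisen.de/']
```

A page should list exactly one canonical; an empty list, or more than one entry, is worth investigating.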
-
Yes, thanks.
I forgot to mention: this was set some weeks ago, and in Google's cached version the rel=canonical tag is in the source code, so they should have the newest page.
I just edited the post above a few seconds after your question.
-
Yes, how long ago did you set this?
Has google since indexed your page
-
Thanks, I set this a couple of weeks ago.
<link rel="canonical" href="http://www.myanmar-rundreisen.de/" /> That should be fine?
-
Thanks, yes, the preferred domain is set to www.
-
Darin has a good point. Set your preferences.
Also add rel=canonical.
Darin, if I'm not mistaken (maybe you can shed some light): don't both pages still get indexed even if one is redirected with a 301? I'm sure a rel=canonical will solve the issue!
Best Wishes,
Hampig M
BizDetox
-
Have you set your preferred domain in Google Webmaster Tools?
(Make sure you have verified both versions of your domain)
Configuration > Settings > Preferred domain > select the radio button for the www version