Using SEOmoz for the first time, the initial crawl said we have 9,000 errors, which were basically 4,500 duplicate pages and 4,500 duplicate page titles (i.e. http://domainname/etc.html and http://www.domainname/etc.html).
We altered our website accordingly by changing all internal links to http://www.domainname/etc.html, as Google and all the other engines are listing us using the www. prefix. On the next crawl we now have even more of these duplicate errors. How do we go about removing them, as we only have one file for each on the server?
Google downgraded our website by 35% in April, and as this is a retail site we are losing a lot of business.
I would very much appreciate it if anyone has the time to answer.
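Besides fixing internal links, each page can also declare which URL variant is the canonical one, so crawlers fold the www and non-www copies into a single page. A sketch of the tag, with domainname.com standing in for the real domain and etc.html for a real page, placed in each page's `<head>`:

```html
<link rel="canonical" href="http://www.domainname.com/etc.html">
```

With this in place, both http://domainname/etc.html and http://www.domainname/etc.html point search engines at the same preferred URL.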
Howard
-
Thx Alsvik,
Yes, I have just done this. We had originally set this up with Google, but somehow the site owner verification for the URL got removed and Google had reverted to "No preference". Hopefully this will sort the problem.
-
Have you changed the preferred domain in GWT? If you pick one of the two (domain.com or www.domain.com), Google will not register duplicate pages/content for the other. And add a 301 redirect from domain.com to www.domain.com. That should fix the issue.
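If the site runs on Apache, the site-wide 301 redirect suggested above can be added in an .htaccess file. A sketch, assuming mod_rewrite is enabled and with domainname.com standing in for the real domain:

```apache
RewriteEngine On
# Redirect every non-www request to the www host with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^domainname\.com$ [NC]
RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]
```

After this, http://domainname/etc.html returns a 301 to http://www.domainname/etc.html, so crawlers stop seeing two copies of each page.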
Related Questions
-
PDF Optimization Question: Does URL Structure Matter?
Hi Mozzers: I am optimizing a bunch of PDF brochures within a client's website. Besides the typical optimization tactics I'm applying (like these), I have a question regarding the file/URL structure of the PDFs themselves. By default, the client is locating PDFs in an 'uploads' folder of their WordPress site. So, a typical PDF might have a URL such as: https://www.Xyzinsurance.com/xyz-content/uploads/2015/06/Brochure-XYZ-Connect.pdf My question: is there any advantage in eliminating all these sub-directories and moving the files into a main folder, simply titled '/brochures'? Any insights or conjecture would be welcome!
Technical SEO | | Daaveey0 -
SEO Question - Are 503/504 errors an issue?
Lately I've noticed more and more 503/504 errors being flagged in my Moz reports. One week I had over 1,300 errors show up. I checked Google Webmaster Tools and Bing Webmaster Tools and noticed they were showing up there too, although not nearly as many (50 or fewer per day). I contacted my hosting company about it and they said these were normal and that it was due to one nameserver reaching capacity, but that there was a backup nameserver that kicks in. I've seen one or two of these errors show up before, but never more than one or two a week. Is this something I should be concerned about?
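One way to get a feel for whether a spike like this matters is to classify the status codes in a crawl log and measure what fraction are transient server errors (503/504 usually signal capacity problems rather than broken pages). A minimal sketch; the function names and the choice of which codes count as transient are illustrative, not any tool's official behavior:

```python
# Classify HTTP status codes from a crawl log to gauge whether 503/504
# spikes look like a temporary capacity problem or a broken site.
TRANSIENT = {502, 503, 504}  # gateway/capacity errors, usually temporary


def classify(status: int) -> str:
    """Bucket a single HTTP status code."""
    if status in TRANSIENT:
        return "transient"
    if 500 <= status < 600:
        return "server-error"
    if 400 <= status < 500:
        return "client-error"
    return "ok"


def transient_rate(status_codes) -> float:
    """Fraction of crawled responses that were transient server errors."""
    if not status_codes:
        return 0.0
    hits = sum(1 for s in status_codes if classify(s) == "transient")
    return hits / len(status_codes)
```

A consistently high transient rate across crawls would support pushing back on the host, whereas a brief spike during one crawl is more likely the nameserver issue they described.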
Technical SEO | | Kyle Eaves0 -
Subdomain question for law firm in Indiana, Michigan, and New Mexico.
Hi Gang, Our law firm has offices in the states of Indiana, Michigan, and New Mexico. Each state is governed by unique laws, and each state has its own "flavor," etc. We currently are set up with the main site as http://www.2keller.com (Indiana), with subdomains at http://michigan.2keller.com (Michigan) and http://newmexico.2keller.com (New Mexico).

My client questions this strategy from time to time, and I want to see if anyone can offer some reassurance I haven't thought of. Our reason for setting up the sites in this manner is to ensure that each site speaks to state-specific practice areas (for instance, New Mexico does nursing home abuse, whereas the other states don't) and state-specific ethics law (for instance, in some states you can advertise your dollar amount recoveries, and in others you can't). There are so many differences between each state that the content would seem to warrant it.

Local citations and listings are another reason these sites are set up in such a fashion. The firm is a member of several local state directories and memberships, and by having these links go directly to the subdomain they reference, I can see this being another advantage.

Also, inside each state there are separate pages set up for specific cities. We geo-target major cities in each state, and trying to do all of this under one domain for 3 different states would seemingly get very confusing, very quickly. I had thought of setting up the various state pages through folders on the main domain, but again, there is too much state-specific info to make this seem like a logical approach. Granted, the linking and content creation would be easier for one site, but I don't think we can accomplish this in a clean way with the offices being in such different locales.

I guess I'm wondering if there are some things I'm overlooking here? Thanks guys/gals!
Technical SEO | | puck991 -
Question About Using Disqus
I'm thinking about implementing Disqus on my blog. I'd like to know if the Disqus comments are indexed by search engines? It looks like they are displayed using Ajax or jQuery.
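A quick way to answer this for any specific page is to compare the raw HTML (what a non-rendering crawler receives) against a comment you know is on the page: content injected client-side by JavaScript will not appear in the static source, though Google's crawler does also render JavaScript these days. A rough sketch using only the standard library; the function names are illustrative:

```python
import re
import urllib.request


def fetch_static_html(url: str) -> str:
    """Fetch raw HTML as a non-rendering crawler would see it
    (no JavaScript executed)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")


def appears_in_static_html(html: str, text: str) -> bool:
    """True if `text` occurs in the raw markup (case- and
    whitespace-insensitive). JavaScript-injected content, such as a
    Disqus embed, normally will NOT appear here."""
    norm = lambda s: re.sub(r"\s+", " ", s).strip().lower()
    return norm(text) in norm(html)
```

An empty `<div id="disqus_thread"></div>` in the static source, with the comment text absent, indicates the comments are injected client-side and may not be visible to every crawler.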
Technical SEO | | sbrault740 -
A few misc Webmaster tools questions & Robots.txt etc
Hi, I have a few general misc questions re robots.txt & GWT:

1) In the robots.txt file, what do the below lines block? Internal search?

Disallow: /?
Disallow: /*?

2) Also, the site's feeds are blocked in robots.txt. Why would you want to block a site's feeds?

3) What's the best way to deal with the below:
- an old removed page that's returning a 500 response code?
- a soft 404 for an old removed page that has no current replacement
- old removed pages returning a 404

The old pages didn't have any authority or inbound links, hence is it best/ok to simply create a URL removal request in GWT? Cheers Dan
Technical SEO | | Dan-Lawrence0 -
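The two directives above behave differently under Googlebot-style pattern matching: `Disallow: /?` is a plain prefix match, blocking only paths that begin with `/?`, while `Disallow: /*?` blocks any path containing a `?`, which covers internal search URLs like `/search?q=term`. Note that Python's stdlib `urllib.robotparser` does not implement these wildcard extensions, so a minimal sketch of the matching logic (the function name is illustrative):

```python
import re


def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Check a URL path against a Googlebot-style robots.txt pattern.

    '*' matches any run of characters; a trailing '$' anchors the match
    to the end of the path. All other characters, including '?', are
    matched literally as a prefix.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.match("^" + regex + ("$" if anchored else ""), path) is not None
```

So `/*?` matches `/search?q=term` but `/?` does not; `/?` only matches paths such as `/?s=term`.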
New Website and Domain Question
Hi all, I am launching a new website around the end of October and I have purchased a great domain to use for it. My question is: should I put some kind of holding page up to try and start building up some domain authority in preparation for launch? Or maybe a blog at www.domain.com/blog, keeping all the blog content at the same location when the full site goes up? Or is it best to wait and just launch the site when the first version is complete? Thanks, Ben
Technical SEO | | BenInder0 -
Question Concerning Pages With Too Many Links:
I have run SEOmoz software for a client's site. It's showing that virtually every single page has too many links. For instance, this URL: http://www.golfthere.com/AboutUs Am I missing something? I do not see 157 links on this page.
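Crawlers count every `<a href>` in the delivered HTML, including navigation menus, footers, and visually hidden dropdowns, so a page can easily carry far more links than are visible on screen. A quick way to count them yourself from a saved page source, using only the standard library (names are illustrative):

```python
from html.parser import HTMLParser


class LinkCounter(HTMLParser):
    """Count every <a> tag carrying an href attribute, the way a crawler
    tallies links: nav, footer, and hidden menus all count."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1


def count_links(html: str) -> int:
    parser = LinkCounter()
    parser.feed(html)
    return parser.count
```

Running this over the page's "view source" output will usually explain the gap between the reported figure and what is visible in the browser.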
Technical SEO | | ToddKing0 -
Site Hosting Question
We are UK based web designers who have recently been asked to build a website for an Australian Charity. Normally we would host the website in the UK with our current hosting company, but as this is an Australian website with an .au domain I was wondering if it would be better to host it in Australia. If it is better to host it in Australia, I would appreciate if someone could give me the name of a reasonably priced hosting company. Thanks Fraser
Technical SEO | | fraserhannah0