
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Moz Q&A is closed.

After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • One of our sites is very large - over 500M pages. Google has indexed 1/8th of the site, and they tend to crawl between 800k and 1M pages per day. A few times a year, Google will significantly increase their crawl rate - overnight hitting 2M pages per day or more. This creates big problems for us, because at 1M pages per day Google is consuming 70% of our API capacity, and the API overall is at 90% capacity. At 2M pages per day, 20% of our page requests are 500 errors. I've lobbied for an investment in / overhaul of the API configuration to allow for more Googlebot bandwidth without compromising user experience. My tech team counters that it's a wasted investment, as Google will crawl to our capacity whatever that capacity is. Questions for Enterprise SEOs:
    * Is there any validity to the tech team's claim? I thought Google's crawl rate was based on a combination of PageRank and the frequency of page updates. That implies some upper limit - which we perhaps haven't reached - but which would stabilize once reached.
    * We've asked Google to rate-limit our crawl rate in the past. Is that harmful? I've always looked at a robust crawl rate as a good problem to have. Is 1.5M Googlebot API calls a day desirable, or something any reasonable Enterprise SEO would seek to throttle back?
    * What about setting a longer refresh rate in the sitemaps? Would that reduce the daily crawl demand? We could increase it to a month, but at 500M pages Google could still have a ball at the 2M pages/day rate. Thanks

    Intermediate & Advanced SEO | | lzhao
    0
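For the crawl-capacity question above: Google documents that sustained 429/503 responses cause Googlebot to slow its crawl rate, so one stopgap (short of a Search Console rate-limit request) is to shed crawler load when the API nears capacity. A minimal sketch, assuming a hypothetical `api_utilization` metric and naive user-agent matching - production code should verify real Googlebot via reverse DNS, since the UA string can be spoofed:

```python
CRAWLER_UAS = ("Googlebot",)  # naive match; verify via reverse DNS in production

def should_shed_load(user_agent: str, api_utilization: float,
                     threshold: float = 0.8) -> int:
    """Return the HTTP status to serve: 503 to known crawlers when the
    backing API is near capacity, 200 otherwise. Sustained 503s signal
    Googlebot to reduce its crawl rate without affecting real users."""
    is_crawler = any(ua in user_agent for ua in CRAWLER_UAS)
    if is_crawler and api_utilization >= threshold:
        return 503  # in practice, also send a Retry-After header
    return 200
```

In practice this check belongs at the load balancer or edge, in front of the API, so real users are never turned away.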

  • Does it matter if the description is different on Google local and all the citation websites? Some websites allow a long description, some don't. So my question is: is it only the company name + address that needs to be the same across the board, or does the description have to be the same too?

    Local Listings | | surfsup
    0

  • Hi guys, my father is currently using a programmer to build his new site. Knowing a little about SEO etc., I was a little suspicious of the work carried out. **Anyone with good programming and SEO knowledge, please offer your advice!** This page http://www.thewoodgalleries.co.uk/gallery-range-wood-flooring/ (which is soon to be http://www.thewoodgalleries.co.uk/engineered-wood/) you'll see has a number of different products. The products on this particular page have been built into colour categories like this: http://www.thewoodgalleries.co.uk/engineered-wood/lights-greys http://www.thewoodgalleries.co.uk/engineered-wood/beiges http://www.thewoodgalleries.co.uk/engineered-wood/browns http://www.thewoodgalleries.co.uk/engineered-wood/darks-blacks This is fine. Eventually, when we add to our selection of woods, we'll easily segment each product into "colour categories" for users to easily navigate to. My question is: why do I have 2 different URLs for the same page - is this good practice? Please see below... Visible URL: http://www.thewoodgalleries.co.uk/engineered-wood/browns/cipressa/ Below is the permalink seen in WordPress for this page also. Permalink: http://www.thewoodgalleries.co.uk/engineered-wood/browns-engineered-wood/cipressa/ The WordPress snippet shows the same permalink URL: Cipressa | Engineered Brown Wood | The Wood Galleries www.thewoodgalleries.co.uk/engineered-wood/browns-engineered-wood/cipressa/ Buy Cipressa Engineered Brown Wood, available at The Wood Galleries, London. Provides an Exceptional Foundation for Elegant Décor, Extravagant .. If this is completely OK and has no negative search impact, then I'm happy. If not, what should I advise my programmer to do? Your help would be very much appreciated. Regards, Faye

    On-Page Optimization | | Faye234
    0
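For the duplicate-URL question above: a common safeguard when two paths serve the same product page is a `rel=canonical` in the head of both versions, pointing at the single preferred URL (the URL below is taken from the question; the better fix is still to make WordPress emit one path only):

```html
<!-- In the <head> of BOTH URL variants, pointing at the one preferred
     address, so the duplicates consolidate their ranking signals -->
<link rel="canonical" href="http://www.thewoodgalleries.co.uk/engineered-wood/browns/cipressa/" />
```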

  • I work for a Theater show listings and ticketing website. In our show listings pages (e.g. http://www.theatermania.com/broadway/this-is-our-youth_302998/) we split our content into separate tabs (overview, pricing and show dates, cast, and video). Are we shooting ourselves in the foot by separating the content? Are we better served with keeping it all in a single page? Thanks so much!

    Intermediate & Advanced SEO | | TheaterMania
    0

  • Is it legit to show different content for HTTP requests with different referrers? Case A: a user views one page of the site with plenty of information about one brand, and clicks a link on that page to see a product detail page of that brand; here I don't want to repeat information about the brand itself. Case B: a user views the product detail page directly by clicking on a SERP result; in this case I would like to show him a few paragraphs about the brand. Is it bad? Anyone have experience in doing it? My main concern is Google's crawler. It should not be considered cloaking, because I am not differentiating on user-agent (bot / no bot). But when Google is crawling the site, which referrer will it use? I have no idea - does anyone know? When going from one link to another on the website, does Google's crawler leave the referrer empty?

    Intermediate & Advanced SEO | | max.favilli
    0

  • I am wanting to do analysis of a whole bunch of URLs at once - I know Ahrefs already has a very good tool for this - but I don't really want to have to pay the $79 a month if Moz has something similar. I know it has OSE. Can I do batch analysis in this? Thanks

    Link Explorer | | SWD.Advertising
    1

  • Hello all, we want to split up our sitemap. Currently it's almost 10K pages in one XML sitemap, but we want to make it into smaller chunks, splitting it by category or location or both. Ideally 100 per sitemap is what I read is the best number to help improve indexation and SEO ranking. Any thoughts on this? Does anyone know of any good tools out there which can assist us in doing this? Also, another question I have: should we put all of our products (1,250) in one sitemap, or should this also be split up into, say, products per category etc.? Thanks, Pete

    Intermediate & Advanced SEO | | PeteC12
    0
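On the sitemap-splitting question above: note that the sitemap protocol itself allows up to 50,000 URLs (and 50 MB uncompressed) per file, so 100 per file is far below any hard limit. A small illustrative Python splitter (the file names and `base` URL are hypothetical) that chunks a URL list and builds the matching sitemap index:

```python
def build_sitemaps(urls, chunk_size=1000, base="https://example.com/sitemaps/"):
    """Split a URL list into numbered sitemap files plus a sitemap index.
    Returns ({filename: sitemap_xml}, index_xml)."""
    files = {}
    for i in range(0, len(urls), chunk_size):
        name = f"sitemap-{i // chunk_size + 1}.xml"
        entries = "\n".join(f"  <url><loc>{u}</loc></url>"
                            for u in urls[i:i + chunk_size])
        files[name] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>"
        )
    # The index file lists every chunk, so search engines discover them all
    index_entries = "\n".join(f"  <sitemap><loc>{base}{name}</loc></sitemap>"
                              for name in files)
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index_entries}\n</sitemapindex>"
    )
    return files, index
```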

  • Here are a couple of scenarios I'm encountering where Google will crawl different content than my users see on an initial visit to the site - and which I think should be OK. Of course, it is normally NOT OK; I'm here to find out if Google is flexible enough to allow these situations: 1. My mobile-friendly site has users select a city, and then it displays the location-options div, which includes an explanation of why they may want to have the program use their GPS location. The user must choose GPS, the entire city, a zip code, or a suburb of the city, which then goes to the link chosen. OTOH, it is programmed so that a Googlebot doesn't get a meaningless 'choose further' page; rather, the crawler sees the page of results for the entire city (as you would expect from the URL). So basically the program defaults to the entire-city results for Googlebot, but the user first gets the ability to choose GPS. 2. A user comes to mysite.com/gps-loc/city/results. The site, seeing the literal words 'gps-loc' in the URL, goes out and fetches the GPS for his location and returns results dependent on his location. If Googlebot comes to that URL, there is no way the program will return the same results, because the program wouldn't be able to get the same longitude/latitude as that user. So, what do you think? Are these scenarios a concern for getting penalized by Google? Thanks, Ted

    Intermediate & Advanced SEO | | friendoffood
    0

  • Hey guys! I have a WordPress website and also the Yoast SEO plugin. I've set up a meta title which is: TV Online | Assistir Filmes | Notícias | Futebol | GogsTV. (I checked on some free tools, and they also show this.) But... Google is showing this: GogsTV: TV Online | Assistir Filmes | Notícias | Futebol. It seems they are trying to show my brand name first instead of my main keyword. I'm not sure why it doesn't index as I want... Does anybody know how I can fix this? Thanks

    On-Page Optimization | | tiagosimk
    0

  • I have a domain (no subdomains) that serves up different dynamic content for mobile/desktop pages - each having the exact same page URL, kind of a semi-responsive design - and will be using "Vary: User-Agent" to give Google a heads up on this setup. However, some of the pages are only valid for mobile or only valid for desktop. In the case when a page is valid only for mobile (call it mysite.com/mobile-page-only), Google Webmaster Tools is giving me a soft 404 error under Desktop, saying that the page does not exist. Apparently it is doing that because my program is actually redirecting the user/crawler to the home page. It appears from the info about soft 404 errors that Google is saying since it "doesn't exist" I should give the user a 404 page - which I can make customized, giving the user an option to go to the home page, or choose links from a menu, etc. My concern is that if I tell the desktop bot that mysite.com/mobile-page-only basically is a 404 error (i.e. doesn't exist), it could mess up the mobile bot indexing for that page - since it definitely DOES exist for mobile users. Does anyone here know for sure that Google will index a page for mobile that is a 404 not found for desktop, and vice versa? Obviously it is important to not remove something from an index in which it belongs, so whether Google is careful to differentiate the two is a very important issue. Has anybody here dealt with this or seen anything from Google that addresses it? Might one be better off leaving it as a soft 404 error? EDIT: also, what about Bing and Yahoo? Can we assume they will handle it the same way? EDIT: closely related question - in a case like mine, does Google need a separate sitemap for the valid mobile pages and valid desktop pages, even though most links will be in both? I can't tell from reading several Q&As on this. Thanks, Ted

    Intermediate & Advanced SEO | | friendoffood
    0
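On the dynamic-serving question above: the `Vary: User-Agent` response header the poster mentions can be emitted at the web-server layer. A sketch assuming nginx (the directive is real; the location block is illustrative):

```nginx
# nginx: advertise that responses at this location differ by user agent,
# so caches and crawlers fetch desktop and mobile variants separately
location / {
    add_header Vary "User-Agent";
    # ... existing proxy/fastcgi configuration ...
}
```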

  • Hi all, a query has recently been raised internally with regard to the use of canonical links. Due to CMS limitations with a client whose CMS is managed by a third-party agency, canonical links are currently output with the port number appended, e.g. example.com/page:80 ...as opposed to the correct absolute URL: example.com/page Note that port numbers are not appended to the actual page URLs. We have been advised that this canonical link functionality cannot be amended at present. My personal interpretation of canonical link requirements is that such a link should exactly match the absolute URL of the intended destination page; my query is, does this extend to the appending of port numbers to URLs? Is the likely impact of including such potentially incorrect URLs the same as that of outright incorrect canonical links? Thanks

    Intermediate & Advanced SEO | | 26ryan
    0
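On the canonical-port question above: a well-formed URL carries the port in the authority (example.com:80/page), so a port after the path, as in /page:80, is malformed outright. Even a well-formed default port is redundant, though, and worth stripping before the URL is emitted in a canonical tag. A sketch using Python's standard urllib:

```python
from urllib.parse import urlsplit, urlunsplit

DEFAULT_PORTS = {"http": 80, "https": 443}

def strip_default_port(url: str) -> str:
    """Drop a redundant default port from the authority, e.g.
    http://example.com:80/page -> http://example.com/page.
    Non-default ports are left in place."""
    parts = urlsplit(url)
    if parts.port is not None and parts.port == DEFAULT_PORTS.get(parts.scheme):
        netloc = parts.hostname
        if parts.username:  # preserve any credentials in the authority
            cred = parts.username + (f":{parts.password}" if parts.password else "")
            netloc = f"{cred}@{netloc}"
        parts = parts._replace(netloc=netloc)
    return urlunsplit(parts)
```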

  • Hi, anyone else having problems with Google's PageSpeed tool? I am trying to benchmark a couple of my sites but, according to Google, my sites are not loading. They will work when I run them through the test at one point, but if I try again, say 15 mins later, they will present the following error message: "An error has occurred. DNS error while resolving DOMAIN. Check the spelling of the host, and ensure that the page is accessible from the public Internet. You may refresh to try again. If the problem persists, please visit the PageSpeed Insights mailing list for support." This isn't too much of an issue for testing page speed, but I am concerned that if Google is getting this error on the speed test, it will also get the error when trying to crawl and index the pages. I can confirm the sites are up and running. The sites are pointed at the server via A records and haven't been changed for many weeks, so it cannot be a DNS propagation issue. I am at a loss to explain it. Any advice would be most welcome. Thanks.

    Technical SEO | | daedriccarl
    0

  • I was surprised that I couldn't find much info on this topic, considering that Googlebot must crawl a backlink url in order to process a disavow request (ie Penguin recovery and reconsideration requests). My trouble is that we recently received a great backlink from a buried page on a .gov domain and the page has yet to be crawled after 4 months. What is the best way to nudge Googlebot into crawling the url and discovering our link?

    White Hat / Black Hat SEO | | Choice
    0

  • We had a discussion about the importance of 404 errors resulting from products which are out of stock. Of course this is not good, but how severe is it in terms of importance: low, medium, or high?

    Technical SEO | | Digital-DMG
    0

  • We have a medium/large ecommerce site that imports manufacturer products every year (or when new products come in/out). We are trying to decide what to do with the discontinued product pages. As we are using Shopify, we do not have an option of custom 404 error pages, so we cannot use this. We also cannot do a 301 redirect with a custom message as to why they are being redirected, so we don't like that idea. What we were thinking of doing was leaving the page with its content and adding a message that the item has been discontinued, with a few similar products listed below and an option of clicking a link to go up a level to the category/subcategory of that product's brand. My question is: should we noindex/follow these pages when they go out of stock, so search engines don't continue to index them? Should we add the tag: (we do not have advance warning, so it would be at the time that we update the listing to say the item is no longer available) My concern with doing the above and leaving it to be indexed is that Google may regard these pages as soft 404s if the bounce rate gets very high - as users will not be staying very long on the page. Any advice would be very much appreciated!

    Algorithm Updates | | henya
    0
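On the discontinued-products question above: the tag the poster appears to be describing - keep the page live for users while letting it drop out of the index over time - is the robots meta tag:

```html
<!-- On a discontinued-product page: stay live for users, but ask
     search engines to drop it from the index while still following links -->
<meta name="robots" content="noindex, follow" />
```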

  • For reasons I won't get into here, I need to move most of my site to a new domain (DOMAIN B) while keeping every single current detail on the old domain (DOMAIN A) as it is. Meaning, there will be 2 live websites that have mostly the same content, but I want the content to appear to search engines as though it now belongs to DOMAIN B. Weird situation. I know. I've run around in circles trying to figure out the best course of action. What do you think is the best way of going about this? Do I simply point DOMAIN A's canonical tags to the copied content on DOMAIN B and call it good? Should I ask sites that link to DOMAIN A to change their links to DOMAIN B, or start fresh and cut my losses? Should I still file a change of address with GWT, even though I'm not going to 301 redirect anything?

    Intermediate & Advanced SEO | | kdaniels
    0

  • Hi, just finished upgrading my site to the SSL version (like so many other webmasters, now that it may be a ranking factor). Fixed all links, CDN links are now secure, etc., and 301-redirected all pages from HTTP to HTTPS. Changed the property in Google Analytics from http to https and added the https version in Webmaster Tools. So far, so good. Now the question is: should I add the https version of the sitemap for the new HTTPS site in Webmasters, or retain the existing http one? Ideally, switching over completely to the https version by adding a new sitemap would make more sense, as the http version of the sitemap would anyway now be redirected to HTTPS. But the last thing I want is to get penalized for duplicate content. Could you please advise, as I am still a rookie in this department? If I should add the https sitemap version for the new site, should I delete the old http one, or is there no harm in retaining it?

    Technical SEO | | ashishb01
    0
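On the HTTPS-migration question above: the site-wide 301 from HTTP to HTTPS is typically one server-level rule rather than thousands of per-page redirects. A sketch assuming nginx (the hostnames are placeholders):

```nginx
# nginx: permanently redirect every plain-HTTP request to its HTTPS twin
server {
    listen 80;
    server_name example.com www.example.com;   # hypothetical hostnames
    return 301 https://$host$request_uri;
}
```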

  • My company has a global online community of over 350,000+ members.  We allow users to submit articles revolving around the scrum and agile framework.  Sometimes the submissions become overwhelming and I am curious if there is anyone out there who may have experience with this same scenario. I don't want to place meta data on every single article that comes in, because I don't have the resources nor time to optimize each and every article.  Does anyone have techniques or suggestions in regards to either leaving the meta data untouched or customizing each individual piece? Any thoughts or ideas will be extremely helpful. Thanks

    SEO Learn Center | | ScrumAlliance
    0

  • I am currently doing a site audit. The total number of pages on the website are around 400... 187 of them are image pages and coming up as 'zero' word count in Screaming Frog report. I needed to know if they will be considered 'thin' content by search engines? Should I include them as an issue? An answer would be most appreciated.

    Technical SEO | | MTalhaImtiaz
    0
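On the zero-word-count question above: a quick server-side check can confirm whether a page's first HTML response actually contains visible text - a zero count often means the copy is injected by JavaScript, or the page really is just an image. A small illustrative counter using Python's standard html.parser:

```python
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collect visible text, ignoring script/style contents."""
    def __init__(self):
        super().__init__()
        self.words = []
        self._skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.words.extend(data.split())

def word_count(html: str) -> int:
    """Count whitespace-separated words in the rendered text of a page."""
    parser = _TextExtractor()
    parser.feed(html)
    return len(parser.words)
```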

  • Hi, I cannot seem to find good documentation about the use of hreflang on paginated pages when using rel=next / rel=prev.
    Does anyone know where to find decent documentation? I could only find documentation about pagination and hreflang when using canonicals on the paginated pages. I have doubts about what the best option is. The way TripAdvisor does it:
    http://www.tripadvisor.nl/Hotels-g187139-oa390-Corsica-Hotels.html
    Each paginated page refers to its own paginated hreflang counterpart. So should the hreflang refer to the specific paginated page, or should it refer to the "1st" page? In this case:
    http://www.tripadvisor.nl/Hotels-g187139-Corsica-Hotels.html Looking forward to your suggestions.

    Intermediate & Advanced SEO | | TjeerdvZ
    0
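On the hreflang-and-pagination question above: the TripAdvisor pattern described - each paginated page pointing hreflang at its equivalent paginated page, not at page 1 - would look roughly like this on a middle page of the Dutch listing (the .com URL and the oa360/oa420 neighbours are illustrative, not taken from TripAdvisor's actual markup):

```html
<!-- rel prev/next describes the series; hreflang points at the
     *equivalent* paginated page in each language, not at page 1 -->
<link rel="prev" href="http://www.tripadvisor.nl/Hotels-g187139-oa360-Corsica-Hotels.html" />
<link rel="next" href="http://www.tripadvisor.nl/Hotels-g187139-oa420-Corsica-Hotels.html" />
<link rel="alternate" hreflang="nl" href="http://www.tripadvisor.nl/Hotels-g187139-oa390-Corsica-Hotels.html" />
<link rel="alternate" hreflang="en" href="http://www.tripadvisor.com/Hotels-g187139-oa390-Corsica-Hotels.html" />
```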

  • Hi, how are you?
    I have my website with Google Tag Manager implemented. Google Analytics is one of the tags I have configured. The problem is that Referral exclusions aren't working for me. I excluded 3 domains, and I keep seeing them in my Referral Traffic report.
    Any idea why this could be happening? Best wishes,
    Ariel

    Reporting & Analytics | | arielbortz
    0

  • We have a medium-size site that lost more than 50% of its traffic in July 2013, just before the Panda rollout. After working with an SEO agency, we were advised to clean up various items, one of them being that the 10k+ URLs were all mixed case (i.e. www.example.com/Blue-Widget). A 301 redirect was set up thereafter, forcing all these URLs to go to a lowercase version (i.e. www.example.com/blue-widget). In addition, a canonical tag was placed on all of these pages in case any parameters or other characters were incorporated into a URL. I thought this was a good setup, but when running an SEO audit through a third-party tool, it shows me the massive number of 301 redirects. And now I wonder if there should only be a canonical without the redirect, or if it's okay to have tens of thousands of 301 redirects on the site. We have not recovered from the traffic loss yet, and we are wondering if it's really more of a technical problem than a Google penalty. Guidance and advice from those experienced in the industry is appreciated.

    Intermediate & Advanced SEO | | ABK717
    0
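On the mixed-case-redirect question above: tens of thousands of 301s need not be individually configured pages; implemented as one server-level rule, every uppercase request maps to its lowercase twin in a single hop. The rule's logic, sketched as a hypothetical helper:

```python
from typing import Optional

def lowercase_redirect(path: str) -> Optional[str]:
    """If the request path contains uppercase characters, return the
    lowercase target for a single 301; otherwise None (serve the page).
    Applied once at the edge, this avoids redirect chains like
    /Blue-Widget -> /blue-widget -> ... for every mixed-case URL."""
    lowered = path.lower()
    return lowered if lowered != path else None
```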

  • Hi Folks, Apologies for the newbie question here, but hope any of the experts here can help. Our blog resides on a subdomain, and our blog increasingly plays an important role in driving traffic to our site. However the subdomain does not seem to show up in our keyword rankings report on Moz pro. Does this report only include URLs on the main site? Is there a way to monitor how our blog ranks on our keywords somewhere else? Thanks in advance all! Regards Patrick

    Getting Started | | patrickqureshi
    0

  • I know this is a weird question, I think I have confused myself with different keyword tools. So if you get a score of 10 for your keyword, should you aim to be closer to 1 or 50?

    Moz Bar | | ejunxion
    0


