
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Moz Q&A is closed.

After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hello, do these meta tags have any current usage? <meta name="author" content="Author Name"> <meta name="publisher" content="Publisher Name"> I have also seen this usage linking to a company's Google+ Page: Thank you

    Jul 12, 2017, 2:16 PM | srbello
    0

  • Hi everyone, I have a problem with a website wherein all URLs (homepage, inner pages) are 302 redirected. This is based on a Screaming Frog crawl. But the weird thing is that they are 302 redirected to themselves, which doesn't make any sense. Example:
    https://www.example.com.au/ is 302 redirected to https://www.example.com.au/
    https://www.example.com.au/shop is 302 redirected to https://www.example.com.au/shop
    https://www.example.com.au/shop/dresses is 302 redirected to https://www.example.com.au/shop/dresses
    Have you encountered this issue? What did you do to fix it? Would be very glad to hear your responses. Cheers!

    Jul 10, 2017, 4:03 PM | alex_goldman
    0
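
    A quick way to see what is actually being returned is to request one of the affected URLs without following redirects and look at the Location header and cookies; self-redirects like this are often a trailing-slash, geo/IP, or session-cookie rule on the server or CDN. A minimal Python sketch, assuming the requests library is installed and reusing the example URL from the question:

        import requests

        # One of the affected URLs from the question above.
        url = "https://www.example.com.au/shop"

        # Don't follow the redirect, so we see the first raw response.
        response = requests.get(url, allow_redirects=False)

        print(response.status_code)                # e.g. 302
        print(response.headers.get("Location"))    # where it points (here, reportedly itself)
        print(response.headers.get("Set-Cookie"))  # session/geo cookies often explain self-redirects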

  • Good morning Moz peeps, I am new to this but intending on starting off right! I have heard a wealth of advice that the "post name" permalink structure is the best one to go with, however... I am wondering about a "custom structure" combining the "post name", following the example structure below: www.professionalwarrior.com/bodybuilding/%postname/ where "professional" and "bodybuilding" are the focus/theme/keywords of my blog that I want ranked. Thanks a mill, RO

    Jun 30, 2017, 10:11 AM | RawkingOut
    0

  • Hi, If the "content" is the same, but is written in different languages, will Google see the articles as duplicate content?
    If Google won't see it as duplicate content, what is the benefit of implementing the alternate lang tag? Kind regards, Jeroen

    Jun 29, 2017, 12:12 PM | chalet
    0
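
    For reference, the alternate-language (hreflang) annotations being asked about look roughly like the output of this small Python sketch; the URLs and language codes are hypothetical, and each language version would carry the full set of alternates in its <head> (or in the sitemap):

        # Hypothetical translated versions of the same article.
        alternates = {
            "en": "https://www.example.com/en/article/",
            "nl": "https://www.example.com/nl/artikel/",
            "de": "https://www.example.com/de/artikel/",
        }

        for lang, url in alternates.items():
            print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')

        # An x-default entry is commonly added for users outside the listed languages.
        print(f'<link rel="alternate" hreflang="x-default" href="{alternates["en"]}" />')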

  • Hello members. I have a question that I am seeking to confirm whether or not I am on the right track. I am interested in purchasing a .ly domain, which is the ccTLD for Libya. The purpose of the .ly domain would be for branding purposes; however, at the same time I do not want to kill the website's ability to rank in Google.com (United States searches) because of this domain. Google does not consider .ly to be one of those generic ccTLDs like .io, .cc, .co, etc. that can rank, and Bitly has also moved away from the .ly extension to a .com extension. Back in 2011 when there was unrest in Libya, a few well-known sites that utilized the .ly extension had their domains confiscated, such as Letter.ly and Advers.ly, and I think Bitly may have been on that list too; however, with the unrest behind us it is possible to purchase a .ly, so being able to obtain one is not an issue. From what I can tell, I should be able to specify in Google Search Console that the website utilizing the .ly extension is a US-based website. I can also do this with Google My Business, and I will keep the Whois info public so the Whois data can be seen as a US-based website. Based on everything I just said, do any of you think I will be OK if I were to register and use the .ly domain extension and still be able to rank in Google.com (US searches)? Confirmation would help me sleep better. Thanks in advance everyone and have a great day!!

    Jun 28, 2017, 5:34 PM | joemaresca
    0

  • Hi, just wondering: I'm using the same image across 20 pages which are optimized for SEO purposes. Are there issues with this from an SEO standpoint? Will Google devalue the page because the same image is being used? Cheers.

    Jun 27, 2017, 8:01 PM | seowork214
    0

  • So, I have major concerns with this plan. My company has hundreds of facilities located all over the country. Each facility has its own website. We have a third-party company working to build a content strategy for us. What they came up with is to create a bank of content specific to each service line. If/when any facility offers that service, they then upload the content for that service line to that facility's website. So in theory, you might have 10-12 websites all in different cities, with the same content for a service. They claim "Google is smart, it knows it's content all from the same company, and because it's in different local markets, it will still rank." My contention is that duplicate content is duplicate content, and unless you "localize" it, Google is going to prioritize one page of it and the rest will get very little exposure in the rankings no matter where you are. I could be wrong, but I want to be sure we aren't shooting ourselves in the foot with this strategy, because it is a major undertaking and too important to go off in the wrong direction. SEO experts, your help is genuinely appreciated!

    Jun 27, 2017, 7:08 PM | MJTrevens
    1

  • Hi, I'm working with a Shopify site that has about 10x more URLs in Google's index than it really ought to. This equals thousands of urls bloating the index. Shopify makes it super easy to make endless new collections of products, where none of the new collections has any new content... just a new mix of products. Over time, this makes for a ton of duplicate content. My response, aside from making other new/unique content, is to select some choice collections with KW/topic opportunities in organic and add unique content to those pages. At the same time, noindexing the other 90% of excess collections pages. The thing is there's evidently no method that I could find of just uploading a list of urls to Shopify to tag noindex. And, it's too time consuming to do this one url at a time, so I wrote a little script to add a noindex tag (not nofollow) to pages that share various identical title tags, since many of them do. This saves some time, but I have to be careful to not inadvertently noindex a page I want to keep. Here are my questions: Is this what you would do? To me it seems a little crazy that I have to do this by title tag, although faster than one at a time. Would you follow it up with a deindex request (one url at a time) with Google or just let Google figure it out over time? Are there any potential negative side effects from noindexing 90% of what Google is already aware of? Any additional ideas? Thanks! Best... Mike

    Jun 27, 2017, 2:16 PM | 94501
    0
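
    A rough sketch of the title-tag matching approach described above, in Python with the requests library; the store URL, collection handles, and title strings are hypothetical, and the output is only a candidate list to review before any theme/template change actually adds the noindex tag:

        import re
        import requests

        # Hypothetical collection URLs (e.g. exported from a crawl) and the exact
        # title tags that identify thin/duplicate collections to be noindexed.
        collection_urls = [
            "https://example-store.com/collections/blue-widgets",
            "https://example-store.com/collections/widgets-on-sale",
        ]
        titles_to_noindex = {"Widgets on Sale | Example Store"}

        for url in collection_urls:
            html = requests.get(url, timeout=10).text
            match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
            title = match.group(1).strip() if match else ""
            if title in titles_to_noindex:
                print("noindex candidate:", url)  # review manually before applying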

  • Hi guys, Have a site whose URLs all end with ?v=6cc98ba2045f. Example: https://domain.com/products/cashmere/robes/?v=6cc98ba2045f Just wondering, does Google ignore what is after the ?. Also, any ideas what that is? Cheers.

    Jun 23, 2017, 2:30 AM | CarolynSC
    0
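
    That ?v= value looks like a cache-busting/version parameter appended by a theme or CDN (a guess from the format, not something confirmed in the question). Google generally treats URLs that differ only by query string as separate URLs unless told otherwise, so a common safeguard is a canonical tag pointing at the parameter-free URL. A minimal Python sketch using the URL from the question:

        from urllib.parse import urlsplit, urlunsplit

        url = "https://domain.com/products/cashmere/robes/?v=6cc98ba2045f"

        # Strip the query string to get the clean form of the URL.
        clean = urlunsplit(urlsplit(url)._replace(query=""))

        print(f'<link rel="canonical" href="{clean}" />')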

  • I'm thinking of using rel=canonical for similar products on my site. Say I'm selling pens and they are all very similar: a big pen in blue, a pack of 5 blue bic pens, a pack of 10, 50, 100 etc. Should I rel=canonical them all to the best seller, as it's almost impossible to make the pages unique? (I realise these should be attributes and not products, but I'm sure you get my point.) It seems sensible to have one master canonical page for bic pens on the site that has a great description, video content and good images, plus linked articles etc., rather than loads of duplicate-looking pages. Love to hear thoughts from the Moz community.

    Jun 20, 2017, 1:50 PM | mark_baird
    0

  • I was wondering how old the 404 data from Google Search Console actually is. Does anyone know over what kind of timespan their site's 404 data is compiled? How long do the 404s tend to take to disappear from Google Search Console once they are fixed?

    Jun 19, 2017, 3:00 PM | McTaggart
    0

  • Hi, We seem to have a slightly odd issue. We noticed that a number of our location category pages were slipping off page 1 and onto page 2 in our niche. On inspection, we noticed that our Arizona page had started ranking in place of a number of other location pages - Cali, Idaho, NJ etc. Weirdly, the pages they had replaced were no longer indexed, and would remain so despite being fetched, tweeted etc. One test was to see when the dropped-out pages had last been crawled, or at least cached. When running a 'cache:domain.com/category/location' search on these pages, we were getting 301 redirected to, you guessed it, the Arizona page. Very odd. However, the dropped-out pages were serving 200 OK when run through header checker tools, Screaming Frog etc. On the face of it, it would seem Googlebot is getting redirected when it hits a number of our key location pages, but users are not. Has anyone experienced anything like this? The theming of the pages is quite different in terms of content, meta etc. Thanks.

    Jun 13, 2017, 5:55 AM | Sayers
    0
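
    One way to test the "Googlebot gets redirected, users don't" theory is to request a dropped-out page with a Googlebot user-agent and with a normal browser user-agent and compare the raw responses. A minimal Python sketch using the requests library (the URL is hypothetical); note that some servers verify Googlebot by IP, so a matching user-agent alone may not reproduce the behaviour:

        import requests

        # Hypothetical dropped-out location page.
        url = "https://www.example.com/category/idaho"

        user_agents = {
            "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
            "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        }

        for label, ua in user_agents.items():
            r = requests.get(url, headers={"User-Agent": ua}, allow_redirects=False)
            # A 301/302 with a Location header for one UA but not the other suggests UA-based redirects.
            print(label, r.status_code, r.headers.get("Location"))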

  • I was doing some KW research for a client and noticed something interesting with regard to Yelp and Justia. For a search on DWI Attorneys, they each had over 300-character meta descriptions showing on the SERP without truncating. Everyone else was either truncated or within the limit of roughly 160 characters. Obviously, if there is a way to get something other than a list to show that way, you can own some real estate. Would love to hear from some of you Mozzers on this. Here are two images that should assist. Best. Edit: I found one that was not a directory site and it appears it is Google doing it. The site has no meta description for the home page and this is what is being pulled by Google. There are 327 characters here! The truncation marks are showing it being pulled from different parts of the page. The image is for a Killeen DWI Attorney search. NOTE: None of these are clients, etc. I also changed the cities so this is a general search.

    Jun 12, 2017, 4:02 PM | RobertFisher
    1

  • We have a lot of subdomains that we are switching to subfolders and need to 301 redirect all the pages from those subdomains to the new URLs. We have over 1000 that need to be implemented. So, will 301 redirects slow the page speed regardless of which URL the user comes through? Or, as the old URLs are dropped from Google's index and bypassed as the new URLs take over in the SERPs, will those redirects then have no effect on page speed? Trying to find a clear answer to this and have yet to find one.

    Jun 12, 2017, 12:24 PM | MJTrevens
    0

  • This might seem like a silly question, but it's one that I would like to get some responses to from the SEO community. Do heading (h1-h6) tags need to be staggered according to their numbers? For example: a few of our clients have their h1 tag on a mid-way header that is halfway down their page, and there are both h2s and h3s listed before the h1 in the source code. Does this matter? Let me know!
    Thanks!

    Jun 1, 2017, 3:23 PM | TaylorRHawkins
    2

  • My search visibility on here went from 3.5% to 3.7% to 0% to 0.03% and now 0.05% in a matter of one month, and I do not know why. I make changes every week to see if I can get higher in Google results. I do well with one website, which is for a medical office that has been open for years. This new one, where the office has only been open a few months, I am having trouble with. We aren't getting calls like I was hoping we would. In fact, the only one we did receive I believe is because we were closest to the caller in proximity on Google Maps. I am also having some trouble with the "links" aspect of SEO. Everywhere I look to get linked, it seems you have to pay. We are a medical office, we aren't selling products, so not many blogs would want to talk about us. Any help that could assist me with getting a higher rank on Google would be greatly appreciated. Also, any help with getting the search visibility up would be great as well.

    May 23, 2017, 2:33 PM | benjaminleemd
    1

  • Hi, So starting about March 9 I started seeing huge losses in rankings for a client. These rankings have continued to drop every week since, and we changed nothing on the site. At first I thought it must be the Fred update, so we have started rewriting and adding product descriptions to our pages (which is a good thing regardless). I also checked our backlink profile using OSE on Moz and still saw the few linking root domains we had. Another odd thing is that Webmaster Tools showed many more domains. So today I bought a subscription to Ahrefs and instantly saw that over the same timeline (starting March 1, 2017) until now, we have literally doubled in inbound links from very spammy-looking sites. BUT the incoming links are not to content; people seem to be ripping off our images. So my question is, do spammy inbound image links count against us the same as if someone linked actual written content or non-image URLs? Is Fred something I should still be looking into? Should I disavow a list of inbound image links? Thanks in advance!

    May 23, 2017, 5:42 AM | plahpoy
    0
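
    If, after reviewing the links, disavowing looks warranted (that is a judgment call, not something settled here), the disavow file format itself is simple: one URL or "domain:" line per entry, with # comments. A minimal Python sketch that writes one from a hypothetical list of referring domains taken from a backlink export:

        # Hypothetical spammy referring domains pulled from an Ahrefs/GSC export.
        spammy_domains = [
            "image-scraper.example.net",
            "spammy-directory.example.org",
        ]

        with open("disavow.txt", "w") as f:
            f.write("# Domains hotlinking our images from spammy pages\n")
            for domain in spammy_domains:
                # "domain:" disavows every link from that domain; single URLs can be listed as-is.
                f.write(f"domain:{domain}\n")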

  • Hi Moz Community, I have a client using relative links for their canonicals (vs. absolute). Google appears to be following this just fine, but Bing etc. are still sending organic traffic to the non-canonical links. It's a Drupal setup. Anyone have advice? Should I recommend that all canonical links be absolute? They are strapped for resources, so this would be a PITA if it won't make a difference. Thanks

    May 22, 2017, 11:52 AM | SimpleSearch
    1
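
    For reference, a relative canonical is resolved against whatever URL the page happens to be served on, which is exactly why protocol/host variants or scraped copies can end up "canonicalising" to the wrong place; an absolute canonical avoids that ambiguity. A tiny Python sketch (hypothetical URLs) showing the resolution and the absolute tag you would output instead:

        from urllib.parse import urljoin

        page_url = "https://www.example.com/blog/post-one/"
        relative_canonical = "/blog/post-one/"

        # A relative href is resolved against the URL the page was served on.
        absolute = urljoin(page_url, relative_canonical)

        print(f'<link rel="canonical" href="{absolute}" />')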

  • Hello Everyone, So this is not really my strong suit, but I'm going to do my best to explain the full scope of the issue and really hope someone has some insight. We have an e-commerce client (can't really share the domain) that uses Shopify; they have a large number of products categorized by Collections. The issue is that when we do a site: search of our Collection pages (site:Domain.com/Collections/) they don't seem to be indexed. Also, not sure if it's relevant, but we also recently did an overhaul of our design. Because we haven't been able to identify the issue, here's everything we know/have done so far:
    We ran a Moz crawl check and the Collection pages came up.
    We checked organic landing page analytics (source/medium: Google) and the pages are getting traffic.
    We submitted the pages to Google Search Console.
    The URLs are listed in the sitemap.xml, but when we tried to submit the Collections sitemap.xml to Google Search Console, 99 were submitted but nothing came back as being indexed (unlike our other pages and products).
    We tested the URL in GSC's robots.txt tester and it came up as being "allowed", but just in case, below is the language used in our robots.txt:
    User-agent: *
    Disallow: /admin
    Disallow: /cart
    Disallow: /orders
    Disallow: /checkout
    Disallow: /9545580/checkouts
    Disallow: /carts
    Disallow: /account
    Disallow: /collections/+
    Disallow: /collections/%2B
    Disallow: /collections/%2b
    Disallow: /blogs/+
    Disallow: /blogs/%2B
    Disallow: /blogs/%2b
    Disallow: /design_theme_id
    Disallow: /preview_theme_id
    Disallow: /preview_script_id
    Disallow: /apple-app-site-association
    Sitemap: https://domain.com/sitemap.xml
    A Google cache: search currently shows a collections/all page we have up that lists all of our products. Please let us know if there are any other details we could provide that might help. Any insight or suggestions would be very much appreciated. Looking forward to hearing all of your thoughts! Thank you in advance. Best,

    May 22, 2017, 6:39 AM | Ben-R
    0
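
    As a sanity check on the robots.txt above, the standard-library robots parser reaches the same conclusion as the GSC tester: ordinary /collections/<handle> URLs are not blocked, because the collection rules only cover paths beginning with a literal plus sign (encoded or not). A small Python sketch using a trimmed, hypothetical copy of the rules:

        from urllib.robotparser import RobotFileParser

        robots_txt = """\
        User-agent: *
        Disallow: /admin
        Disallow: /cart
        Disallow: /collections/+
        Disallow: /collections/%2B
        Disallow: /collections/%2b
        """

        rp = RobotFileParser()
        rp.parse(robots_txt.splitlines())

        print(rp.can_fetch("Googlebot", "https://domain.com/admin"))                      # False - explicitly disallowed
        print(rp.can_fetch("Googlebot", "https://domain.com/collections/summer-dresses")) # True - no rule matches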

  • Hi Mozzers - was just wondering whether matching H1 and Title tags are still OK, or whether there's an over optimization risk if they exact match?

    May 19, 2017, 4:48 PM | McTaggart
    0

  • Hi everyone, I am doing an audit of a site that currently has a lot of 500 errors due to the Russian language. Basically, all the URLs look this way for every page in Russian:
    http://www.exemple.com/ru-kg/pешения-для/food-packaging-machines/
    http://www.exemple.com/ru-kg/pешения-для/wood-flour-solutions/
    http://www.exemple.com/ru-kg/pешения-для/cellulose-solutions/
    I am wondering if this error is really caused by the server or if Google has difficulty reading the Russian language in URLs. Is it better to have the URLs only in English?

    May 18, 2017, 9:30 AM | alexrbrg
    0
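
    Google can crawl and index non-ASCII URLs as long as they are UTF-8 percent-encoded on the wire, so consistent 500s are more likely a server/encoding issue than Google failing to read Russian. A tiny Python sketch (hypothetical path) showing the encoded form the server has to handle:

        from urllib.parse import quote, unquote

        # Hypothetical Russian path segment similar to the ones in the question.
        path = "/ru-kg/решения-для/food-packaging-machines/"

        encoded = quote(path)   # what crawlers/browsers actually send (UTF-8 percent-encoded)
        print(encoded)          # e.g. /ru-kg/%D1%80%D0%B5%D1%88%D0%B5%D0%BD%D0%B8%D1%8F-...
        print(unquote(encoded)) # round-trips back to the readable form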

  • We are trying to deindex a large quantity of pages on our site and want to know what the best practice for doing that is. For reference, the reason we are looking for methods that could help us speed it up is that we have about 500,000 URLs that we want deindexed because of mis-formatted HTML code, and Google indexed them much faster than it is taking to deindex them, unfortunately. We don't want to risk clogging up our limited crawl log/budget by submitting a sitemap of URLs that have "noindex" on them as a hack for deindexing. Although theoretically that should work, we are looking for white-hat methods that are faster than "being patient and waiting it out", since that would likely take months if not years with Google's current crawl rate of our site.

    May 15, 2017, 9:02 AM | teddef
    0

  • Can I have the root domain pointing to one server and other URLs on the domain pointing to another server without redirecting, domain masking or HTML masking? Dealing with an old site that is a mess. I want to avoid migrating the old website to the new environment. I want to work on a page by page and section by section basis, and whatever gets ready to go live I will release on the new server while keeping all other pages untouched and live on the old server. What are your recommendations?

    May 9, 2017, 8:58 AM | Joseph-Green-SEO
    0

  • We created a more keyword-friendly URL with dashes instead of underscores in December. That new URL is in Google's index and has a few links to it naturally. The previous version of the URL (with underscores) continues to rear its ugly head in the SERPs, though when you click on it you are 301'd to the new URL. The 301 is implemented correctly and checked out on sites such as http://www.redirect-checker.org/index.php. Has anyone else experienced such a thing? I understand that Google can use its discretion on pages, title tags, canonicals, etc., but I've never witnessed them continue to show an old URL that has been 301'd to a new one for months after discovery.

    May 3, 2017, 2:11 PM | seoaustin
    0

  • I am working with a website that sells new and multiple grades of refurbished power tools:
    New
    Refurbished Grade A (top-quality refurbished)
    Refurbished Grade C (has a few more scuffs but is in perfect working order)
    Refurbished Grade D (no warranty / as-is condition, typically for parts)
    How would you create the products and URL structure? Since they are all technically different products, they have their own SKUs in Magento. Would you combine them into one URL with different product options? Or would you give each product version its own URL (New, Grade A, Grade C, Grade D)? Thanks! -- Steven

    Apr 26, 2017, 3:26 PM | intown
    0

  • We're taking on a redesign of our corporate site on our main domain. We also have a number of well-established, product-based subdomains. There are a number of content pages that currently live on the corporate site that rank well and bring in a great deal of traffic, and we are considering putting 301 redirects in place to point that traffic to the appropriate pages on the subdomains. If redirected correctly, can we expect the SEO value of the content pages currently living on the corporate site to transfer to the subdomains, or will we be negatively impacting our SEO by transferring this content from one domain to multiple subdomains?

    Apr 26, 2017, 9:12 AM | Chris8198
    0

  • Will Google value a link with a UTM tag the same as a clean link without a UTM tag? I would say that a UTM-tagged link is not a natural link, so the link value is zero. Anyone have any idea how to look at this?

    Apr 25, 2017, 11:32 AM | TT_Vakantiehuizen
    0

  • Hi, I'm integrating with a service that adds third-party images/videos (owned by them, hosted on their server) to my site. For instance, the service might have tons of pictures/videos of cars; and then when I integrate, I can show my users these pictures/videos about cars I might be selling. But I'm wondering how to build out the sitemap - I would like to include references to these images/videos, so Google knows I'm using lots of multimedia. What's the most white-hat way to do that? Can I add external links to my sitemap pointing to these images/videos hosted on a different server, or is that frowned upon? Thanks in advance.

    Apr 21, 2017, 1:27 PM | SEOdub
    0
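
    Image sitemaps are the usual way to declare the images used on a page, and the image URLs in them are generally allowed to live on another host (the same pattern as CDN-hosted images). A minimal Python sketch that prints one entry; the page and image URLs are hypothetical:

        page_url = "https://www.example.com/listings/blue-sedan"
        external_images = [
            "https://media.third-party-service.com/photos/blue-sedan-front.jpg",
            "https://media.third-party-service.com/photos/blue-sedan-interior.jpg",
        ]

        print('<?xml version="1.0" encoding="UTF-8"?>')
        print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"')
        print('        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">')
        print("  <url>")
        print(f"    <loc>{page_url}</loc>")
        for image_url in external_images:
            print("    <image:image>")
            print(f"      <image:loc>{image_url}</image:loc>")
            print("    </image:image>")
        print("  </url>")
        print("</urlset>")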

  • Hi, We are developing new product pages with faceted filters. You can see it here: https://www.viatrading.com/wholesale-products/ We have a feature allowing users to Order By and Group By, which alters the order of all products. There will also be the option to view products as a table, which will contain the same products but with a different design and maybe slightly different content for each product. All this will happen without changing the URL, https://www.viatrading.com/all/. Is this the best practice? Thanks,

    Apr 14, 2017, 5:25 PM | viatrading1
    0

  • Hello, Getting a lot of duplicate title and meta description errors via Google Webmaster Tools. For best SEO practice, do I noindex the /page/2s, /page/3s...? More importantly, I see how Moz did it by adding "page 3" to their titles, such as http://moz.com/blog?page=3. Is that a better way of doing it? If so, how do I do that in Yoast SEO? Thank you so much!

    Apr 3, 2017, 4:41 AM | Shawn124
    0

  • We're a business with 5 separate locations across 5 cities in Upstate NY. While doing some visual ad previews in the AdWords interface, I noticed that Google is altering my title tag and adding the word "Rochester" to the end of it, cutting short my designated title tag. Rochester is the location of our headquarters, so not a big deal for 1/5th of our customers. But to my dismay, the same thing is happening when searching from the geo locations of my other branches. So when searching for my business in Buffalo (we have a physical address in Buffalo), the title tag in the results still says our company name and "Rochester". This of course is likely leading to confusion and actively harming our organic CTR in our branch locations. This is happening in all of the remaining 4 branch locations. I'm at a loss; I tried lengthening the title tag but it still gets cut off. The term Rochester appears (as do the other branch locations) in my meta description for the homepage as well as in the text of the page itself. I haven't gone so far as to remove that yet and hopefully don't have to. Does anyone have any ideas? Thank you in advance!

    Mar 31, 2017, 12:20 AM | Doylejg3
    0

  • I'm planning to set up a subdomain for my Shopify store, but I'm not sure if this is the right approach. Should I purchase a separate domain for it? I'm running WordPress on my website and want to keep it that way. I want to use Shopify for the ecommerce side. I want to link the store from the top nav, and of course I'll use CTAs in a variety of ways to point to merchandise and other things on the store side. Thanks for any help you can offer.

    Mar 28, 2017, 3:50 PM | ims2016
    0

  • Hi, looking to launch in a new market. Currently we have a .com.au domain which is geo-targeted to Australia. We want to launch in New Zealand, which ends with .co.nz. If I duplicate the Australian-based site completely on the new .co.nz domain name, would I face duplicate content issues from an SEO standpoint, even though it's on a completely separate country code? Or is it still advised to set up hreflang tags across both of the domains? Cheers.

    Mar 27, 2017, 10:36 AM | jayoliverwright
    0
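
    If both country sites go live with near-identical English content, the usual safeguard is reciprocal hreflang with region codes so each version is served in its own market. A tiny Python sketch (hypothetical domains) of the tags each homepage would carry:

        # Both sites list both versions; the annotations must be reciprocal.
        versions = {
            "en-au": "https://www.example.com.au/",
            "en-nz": "https://www.example.co.nz/",
        }

        for hreflang, url in versions.items():
            print(f'<link rel="alternate" hreflang="{hreflang}" href="{url}" />')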

  • I have a blog that received a Webmaster Tools message about a guidelines violation because of "unnatural outbound links" back in August. We added a plugin to make all external links 'NoFollow' links and Google removed the penalty fairly quickly. My question, how do we start changing links to 'follow' again? Or at least being able to add 'follow' links in posts going forward? I'm confused by the penalty because the blog has literally never done anything SEO-related, they have done everything via social and email. I only started working with them recently to help with their organic presence. We don't want them to hurt themselves at all, but 'follow' links are more NATURAL than having everything as 'NoFollow' links, and it helps with their own SEO by having clean external 'follow' links. Not sure if there is a perfect answer to this question because it is Google we're dealing with here, but I'm hoping someone else has some tips that I may not have thought about. Thanks!

    Mar 26, 2017, 3:34 AM | HashtagJeff
    0

  • Hello Mozzers - Just wondering what this robots.txt instruction means: Disallow: /french-wines/?* Does it stop Googlebot crawling and indexing URLs in that "French Wines" folder - specifically the URLs that include a question mark? Would it stop the crawling of deeper folders - e.g. /french-wines/rhone-region/ that include a question mark in their URL? I think this has been done to block URLs containing query strings. Thanks, Luke

    Mar 21, 2017, 10:39 AM | McTaggart
    0
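
    Google matches robots.txt rules as path prefixes, with * as a wildcard and $ as an end anchor, so Disallow: /french-wines/?* only covers URLs whose path-plus-query starts with /french-wines/? (the trailing * is redundant); deeper folders like /french-wines/rhone-region/?page=2 are not covered, because the ? does not come directly after /french-wines/. A small Python sketch illustrating that matching behaviour with an equivalent regex (this mimics Google's documented matching rules; it is not a real robots.txt parser):

        import re

        # "Disallow: /french-wines/?*" as a Google-style prefix rule:
        # anything whose path+query begins with "/french-wines/?".
        rule = re.compile(r"^/french-wines/\?")

        tests = [
            "/french-wines/?sort=price",            # blocked
            "/french-wines/",                       # allowed - no "?" after the folder
            "/french-wines/rhone-region/?page=2",   # allowed - "?" is not directly after /french-wines/
        ]

        for path in tests:
            print(path, "->", "blocked" if rule.match(path) else "allowed")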

  • Is it advisable to use only one H2 tag? The template design has for some reason ended up with multiple H2 tags. I realise that, if anything, it means each one is treated as less important and it is all relative. Just trying to assess if it's worth the time and effort to rehash the template. Has anyone done any testing or got any experience? Thanks

    Mar 21, 2017, 8:56 AM | seoman10
    1

  • Hi Mozzers, I'm thinking of republishing content from my own website's blog on platforms like LinkedIn and Medium. These sites are able to reach a far bigger (relevant) audience than I can through my own website, so there's strategic reasoning for doing this. However, with SEO being a key activity on my own website, I don't want to be at risk of any penalties for duplicate content. I've just read this on Search Engine Journal: "there is confirmation from Google... Gary Illyes has stated that republishing articles won't cause a penalty, and that it's simply a filter they use when evaluating sites. Most sites are only penalized for duplicate content if the site is 100% copied content." So, what do people think - is republishing blog content on LinkedIn and Medium safe? And is it a sound tactic to increase reach?

    Mar 15, 2017, 11:34 PM | Zoope
    0

  • Hello Mozzers, Would you use rel=canonical, robots.txt, or Google Webmaster Tools to stop the search engines indexing URLs that include query strings/parameters? Or perhaps a combination? I guess it would be a good idea to stop the search engines crawling these URLs because the content they display will tend to be duplicate content and of low value to users. I would be tempted to use a combination of canonicalization and robots.txt for every page I do not want crawled or indexed, yet perhaps Google Webmaster Tools is the best way to go / just as effective? And I suppose some use meta robots tags too. Does Google take a position on being blocked from web pages? Thanks in advance, Luke

    Mar 15, 2017, 9:04 PM | McTaggart
    0

  • When creating a redirect map for a site rebuild or domain change, is it necessary to include PDFs or any other non-HTML URLs? Do PDFs even carry "SEO juice" over? When switching CMS, does it even matter to include them? Thanks!

    Mar 13, 2017, 4:29 PM | emilydavidson
    0

  • I'm not sure, but my website looks like it is not getting the juice it's supposed to. As we already know, Google prefers https sites, and this is what happened to mine: it was being crawled as https, but when the time came to move my domain to a new domain, I used a 301 / domain forwarding service. Unfortunately, they didn't have a way to forward from https to the new https; they only had regular http to https, so when users clicked my old domain from Google search, my site returned "site does not exist". I used hreflang so that at least Google would detect my new domain being forwarded, and yes, it worked. But now I'm wondering: for how much time should I keep forwarding the old domain to the new one? My site doesn't look like it is going up. I have changed all the external links. Any help would be appreciated. Thanks!

    Mar 13, 2017, 4:00 PM | Fulanito
    1

  • Hi All, I noticed that our website has lot of 403 errors across different pages using the tool http://www.deadlinkchecker.com/. Do these errors hurt website rankings? Thanks

    Mar 8, 2017, 2:33 PM | vtmoz
    0

  • I see that Gap uses gap.com, oldnavy.gap.com and bananarepublic.gap.com. Wouldn't a better approach for SEO be to have oldnavy.com, bananarepublic.com and gap.com all separate? Is there any benefit to using the approach of store1.parentcompany.com, store2.parentcompany.com etc.? What are the pros and cons of each?

    Mar 7, 2017, 1:20 AM | kcb8178
    0

  • Our website CMS is WordPress. Due to the Genesis Framework, the four phrases below have turned into h2 tags: Skip links, Header Right, Main navigation and Footer. How do we remove these?

    Mar 2, 2017, 3:39 AM | vtmoz
    0

  • Can we have Organization markup schema for a subdomain? For example, if my main domain is xyz.com and the subdomain is sub.xyz.com, and I plan to have Organization markup schema for the subdomain, how should it look? Should the markup schema have the main domain URL or the subdomain URL? Should it be like this?

    Mar 1, 2017, 4:16 PM | NortonSupportSEO
    0
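
    For reference, Organization markup is just JSON-LD embedded in the page, and it can sit on a subdomain page as easily as on the main domain; which URL goes in the url property depends on which one you treat as the organization's canonical home (the main-domain-vs-subdomain question itself is a judgment call, not settled here). A minimal Python sketch with hypothetical values:

        import json

        # Hypothetical organization details; adjust url/logo to the canonical brand home.
        organization = {
            "@context": "https://schema.org",
            "@type": "Organization",
            "name": "XYZ Inc.",
            "url": "https://www.xyz.com/",
            "logo": "https://www.xyz.com/images/logo.png",
        }

        print('<script type="application/ld+json">')
        print(json.dumps(organization, indent=2))
        print("</script>")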

  • Hi all, We always mention "brand & keyword" in every page title along with the topic across the website, like "Topic | vertigo tiles". Let's say there is a sub-directory with hundreds of pages... what will be the best page title practice for mentioning "brand & keyword" across all pages of the sub-directory to benefit in terms of SEO? Can we add "vertigo tiles" to all pages of the sub-directory, or must we not use the same phrase? Thanks,

    Mar 1, 2017, 11:58 AM | vtmoz
    0

  • I have a Magento site and just realized we have about 800 review pages indexed. The /review directory is disallowed in robots.txt, but the pages are still indexed. From my understanding, robots.txt means Google will not crawl the pages, BUT the pages can still be indexed if they are linked from somewhere else. I can add the noindex tag to the review pages, but then they won't be crawled (so the tag won't be seen). https://www.seroundtable.com/google-do-not-use-noindex-in-robots-txt-20873.html Should I remove the robots.txt rule and add the noindex? Or just add the noindex to what I already have?

    Feb 28, 2017, 12:24 PM | Tylerj
    0
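
    The key interaction here is that a Disallow rule stops crawling, so a noindex tag sitting behind it would never be seen; removing the Disallow (at least temporarily) lets the noindex be picked up. Once a template change is live, a quick Python check with the requests library can confirm the tag is actually being served (the URL is hypothetical):

        import re
        import requests

        # Hypothetical Magento review URL; crawlable by Googlebot only once
        # the "Disallow: /review" rule is removed from robots.txt.
        url = "https://www.example.com/review/product/list/id/123/"

        html = requests.get(url, timeout=10).text
        match = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.IGNORECASE)

        print(match.group(0) if match else "no meta robots tag found")
        # Expected once the change is live: <meta name="robots" content="noindex, follow">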

  • I have a site that has hundreds of pages indexed on Google. But there is a page that I put in the footer section that Google seems not to like and is not indexing. I've tried submitting it to the index through Google Webmaster Tools, and it will appear in Google's index, but then after a few days it's gone again. Previously that page had a canonical meta tag pointing to another page, but that has been removed now.

    Feb 28, 2017, 3:58 AM | odihost
    0

  • Hi there! We are currently evaluating data visualization / charting tools for rich content. Are there any open-source solutions that work best in your opinion? Why? Some specific questions: Are static image / SVG rendered images better than a JavaScript dynamic chart (canvas/HTML5)? Which gets indexed better? Is there any proven or perceived benefit to using the Google Charts API that gives you an SEO boost? Are there tools for progressively enhancing HTML raw data tables to generate charts? Looking at a couple of solutions: Google Charts API, C3.js, Chart.js. Thanks for your feedback!

    Feb 27, 2017, 5:40 PM | insurifyusa
    0

  • Hi all, I'm working on a dentist's website and want some advice on the best way to lay out the navigation. I would like to know which structure will help the site work naturally. I feel the second example would be better as it would focus the 'power' around the type of treatment and get that to rank better.
    .com/assessment/whitening
    .com/assessment/straightening
    .com/treatment/whitening
    .com/treatment/straightening
    or
    .com/whitening/assessment
    .com/straightening/assessment
    .com/whitening/treatment
    .com/straightening/treatment
    Please advise, thanks.

    Feb 26, 2017, 9:25 PM | Bee159
    0

  • I'm getting a "Your page is not mobile-friendly." notice in the SERPs for all of our PDFs. I checked the PDFs on a phone and they appear just fine.

    Feb 24, 2017, 1:31 PM | johnnybgunn
    0
