
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • So this is a bit of a strange one. My latest website was built on a different domain and then transferred over (as opposed to being built on a subdomain). I was told that the domain my site was built on wasn't indexed by Google, but looking at Google Search Console I can see that the old domain is showing up as the most-linked-to domain of my current site, meaning it was indexed. The old domain (and all of its pages) does have a 301 redirect to the new website's home page (rather than to the individual pages), but could this be causing me a problem with SEO? Additionally, my website has a sister site (UK and US versions); the two link to each other in the footer, which appears on every page. Could this be pulling my SEO efforts down if it is a do-follow link?

    Intermediate & Advanced SEO | | moon-boots
    0

  • My client has an ecommerce website with approx. 300,000 URLs (a lot of these are parameter URLs blocked from spiders via the meta robots tag). There are 9,000 "true" URLs submitted to Google Search Console, and Google says it is indexing 8,000 of them. Here's the weird part: when I do a "site:" search for the website in Google, it says Google is indexing 2.2 million pages for the domain, but I am unable to view past page 14 of the SERPs. It just stops showing results, and I don't even get a "the next results are duplicate results" message. What is happening? Why does Google say it is indexing 2.2 million URLs, but then won't show me more than 140 of the pages it is indexing? Thank you so much for your help. I tried looking for the answer and I know this is the best place to ask!

    Intermediate & Advanced SEO | | accpar
    0

  • I have a client who has a lot of domain variations, which have all been set up in Google Search Console. I requested that the client use the Change of Address (CoA) feature in GSC for the domains that are now redirecting to other domains that they own (which are also set up in GSC). The problem is that we're not redirecting the homepages to the homepages of the destination domains, so GSC is giving us this error message: "fails redirection test: The old site redirects to www.domain.com/blog, which does not correspond to the new site you chose." Is the only way to use the GSC CoA feature for these domains to change the homepage redirect to go to the homepage of the destination domain? We don't really want that, since the domain we're redirecting is a "blog.domain1.com" subdomain and we want to redirect it to "domain2.com/blog". Any help appreciated! Thanks,
    Dan

    Intermediate & Advanced SEO | | kernmedia
    0

  • I'm working with a site that needs a whole bunch of old, deleted pages 301'd to new pages. My main goal is to capture any external links that currently go to a 404 page and to clean up the index. In dealing with this, I may end up 301ing pages that didn't have incoming links, or that may never have really existed in the first place. These links are a mix of http and https. Is there any potential downside to just 301ing a list of several hundred possible old URLs that currently trigger the 404 page? Thanks! Best... Mike

    Intermediate & Advanced SEO | | 94501
    1
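
A bulk mapping like the one described above is usually handled in the server config. A minimal Apache .htaccess sketch, with hypothetical paths (the real list would come from the 404 log and backlink data):

```apache
# Hypothetical examples - each retired URL gets a permanent redirect to its
# closest live equivalent. Pointing at relevant pages, not just the homepage,
# preserves the most link equity.
Redirect 301 /old-page-with-links.html /new-equivalent-page/
Redirect 301 /2015/discontinued-product /products/current-range/
# Redirect matches by path, so http and https requests hit the same rule
# (assuming both protocols are served by a vhost that reads this .htaccess).
```

This is a sketch, not a definitive rule set; URLs that never held links or traffic can reasonably be left to 404.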

  • It is telling me "we were unable to validate the following listings with Google or Facebook", but the information matches my Google My Business listing exactly (as far as I can tell), so why won't it let me validate? I know I have a Google My Business listing, as I am the manager of it and I synced it to Moz Local via Google My Business. So why is it not validating when I know for a fact it is listed in Google? Sharper Impressions Painting Inc, 6292 Valleyview Dr, Fishers, IN 46038, (317) 423-0230. My other four Sharper Impressions listings were validated (even the Kansas City one that is still waiting to be verified on Google My Business), so I am at a loss as to why this one will not validate. Please advise.

    Moz Local | | LiL_C-bus
    0

  • What's the correct method for tagging duplicate content between country-based subdomains? We have:
    mydomain.com // default, en-us
    www.mydomain.com // en-us
    uk.mydomain.com // UK, en-gb
    au.mydomain.com // Australia, en-au
    eu.mydomain.com // Europe, en-eu
    In the header of each we currently have rel="alternate" tags, but we're still getting duplicate content warnings in Moz for the "www" subdomain. Question 1) Are we headed in the right direction with using alternate? Or would it be better to use canonical, since the languages are technically all English, just different regions? The content is pretty much the same minus currency and localization differences. Question 2) How can we solve the duplicate content between www and the base domain, since the above isn't working? Thanks so much

    Technical SEO | | lvdh1
    1
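
For a setup like the one above, a common pattern is reciprocal hreflang annotations plus a redirect or canonical that resolves the www/bare-domain duplication. A minimal sketch, assuming the www host is chosen as the default (hostnames mirror the hypothetical ones in the question):

```html
<!-- On every regional version of a page, list ALL alternates, including the
     page itself, so the annotations are reciprocal -->
<link rel="alternate" hreflang="en-US" href="https://www.mydomain.com/page/" />
<link rel="alternate" hreflang="en-GB" href="https://uk.mydomain.com/page/" />
<link rel="alternate" hreflang="en-AU" href="https://au.mydomain.com/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.mydomain.com/page/" />
<!-- For the bare-domain duplicate: 301 mydomain.com to the www host (or give
     both the same canonical) so only one default version is crawlable -->
```

Note that hreflang and canonical are not interchangeable here: canonicalizing all regions to one URL would tell Google to drop the regional pages entirely.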

  • Hey guys, Our site targets multiple territories. We use subfolders and hreflang tags on the site (built in WordPress) at a page level. We've added our hreflang tags manually in the head section of each page. We're just redoing the blog and we want to know if we need to add these tags to each individual blog post and, if we do, how we would do it. Our developers have put them in at blog landing page level and told us that this will be fine. E.g.: /de/blog/ /gb/blog/ /uc/blog/ They have a slight tendency to push back on things though, and we just want to be sure we're doing this right. Hreflang tags are sooooo complicated, so hoping you fine people can shed some light on the issue. Cheers!

    Intermediate & Advanced SEO | | Twetman
    0
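
Generally, hreflang annotations are per-URL: tags on the blog landing pages alone do not cover the individual posts. A minimal sketch of what each translated post's head would carry, using a hypothetical domain and slug:

```html
<!-- In the <head> of the /gb/ version of a post (hypothetical URLs) -->
<link rel="alternate" hreflang="en-GB" href="https://www.example.com/gb/blog/example-post/" />
<link rel="alternate" hreflang="de-DE" href="https://www.example.com/de/blog/example-post/" />
<!-- The /de/ version of the same post must carry the identical set of tags
     (return links); without them the annotations are ignored -->
```

In WordPress this is usually generated by a plugin or the theme's head template rather than maintained by hand.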

  • I'm sure most of you have heard about this startup, RankScience, that has big ambitions to disrupt the SEO industry with their automated (I know I know...the word 'automated' and 'SEO' in the same sentence!!!) optimization software. Their claim is that by running thousands of congruent A/B tests on your site, they can maximize rankings and organic traffic. Initially my thoughts were "oh crap, there goes my (and a lot of other people's) career". But then I started thinking about it a bit more and realized a couple things. First, software can't replace a face-to-face client meeting. Being in an agency world as most of us are, client interactions are vital to a sustained partnership. Second, someone is going to have to understand what this software does, configure it, and monitor it, and I'm ok with that being part of my job if that's how the industry shifts. Third, and most importantly, in theory this software has the capability to reverse engineer search algorithms. If they had the data of 10,000 websites using their platform and are collecting data on what works and what doesn't, it's only a matter of time before they can pick apart the algorithm piece by piece to figure out exactly how it works. Google is obviously not going to like that very much and will almost certainly right the ship. That's my 2 cents, looking forward to what your thoughts are on RankScience and the future of our industry.

    Algorithm Updates | | LoganRay
    2

  • Hi again, We recently had a technical search audit done by a specialist agency and they discovered a number of internal links that caused redirects. The agency has recommended we update all of these links to point directly to the destination so we don't lose out on link equity. We'd just like to know if you think this would be a worthwhile use of our time. Our web team seems to think that returning a 301 to a crawler means the crawler will stop indexing the original URL and instead index the redirect destination. Thanks all. Clair

    Intermediate & Advanced SEO | | iescape
    2

  • How does Moz collect its data? And does it also collect from Bing, Ask, and other search engines?

    Moz Local | | ACRC
    0

  • Hi, We need a bit of help from someone to fix the following issues causing speed problems on our website. Does anyone know of someone who can help?
    Reduce server response time
    Optimize images
    Eliminate render-blocking JavaScript and CSS in above-the-fold content
    Avoid landing page redirects
    Leverage browser caching
    Minify CSS
    Minify JavaScript
    Minify HTML

    Technical SEO | | Bev.Aquaspresso
    0
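
Several items in a PageSpeed list like the one above (browser caching, compression) are typically addressed in the server config rather than the site code. A minimal Apache sketch, assuming an Apache host with mod_expires and mod_deflate available:

```apache
# Leverage browser caching: serve static assets with a long freshness lifetime
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>

# Compress text responses to reduce transfer size
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

Minification and image optimization, by contrast, are usually build-step or CDN concerns.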

  • Hi guys, I'm wondering if I can get some best practice advice in preparation for launching our new e-commerce website. For the new website we are creating location pages with a description and things to do, which will lead the user to hotels in that location. Each hotel page related to a location will have the same 'Things to do' content. This is what the content will look like on each page:
    Location page: Location title (1-3 words), Location description (150-200 words), Things to do (200-250 words), Reasons to visit location (15 words)
    Hotel page: Hotel name and address (10 words), Short description (25 words), Reasons to book hotel (15 words), Hotel description (100-200 words), Friendly message why to visit (15 words), Hotel reviews feed from Trustpilot, Types of break and information (100-200 words), Things to do (200-250 words)
    My question is: how much will we be penalised for having the same 'Things to do' content on up to 10 hotel pages plus one location page? In an ideal world we want to develop a piece of code which tells search engines that the original content lies on the location page, but this will not be possible before we go live. I'm unsure whether we should just go live and take the potential loss in traffic, or remove the 'Things to do' section on hotel pages until we develop that piece of code.

    Technical SEO | | CHGLTD
    1

  • When searching "Condor Voucher" on Google US, where our business operates, the first page of the SERP shows results that do not contain the keyword "Voucher" at all, but instead "Condor Coupons" and "Condor Promo Codes". Is this due to the relevance of the other sites (higher-search-volume keywords), their domain authority, page authority, or better authoritative content? Or does Google recognize that "voucher" is not often used in the US and substitute more common keywords such as "Coupon" and "Promo Code", so we can't rank for a term with such a low search volume?

    Keyword Research | | MyVoucherCodes
    0

  • Hello 🙂 I'm trying to figure out the category/taxonomy structure for my website, which will be selling "Colored Contact Lenses". I'm a bit confused because there are several search queries which sort of mean the same thing. For example, "Halloween Contacts" are sort of the same thing as "Colored Contacts": people searching for Colored Contacts may potentially be looking for "natural" styles, but many are looking for crazy styles, aka "Halloween Contacts" or "Crazy Contacts". Crazy contacts and Halloween contacts are the exact same thing, just a different choice of words from the searcher. So I'm trying to think of what to do for categories/link structure. I believe I should start with a primary category .com/colored-contacts/ then .com/colored-contacts/halloween-contacts/ But what about crazy contacts? Should I keep going with .com/colored-contacts/crazy-contacts/, which will have the exact same products listed? I'm kind of going crazy thinking about this lol, any thoughts and advice would be highly appreciated. Thank you!

    Technical SEO | | abuntlysupport
    0

  • We are a finance brokerage in Australia. We operate in a specialist niche and in regional areas with low competition, and we have identified keywords that are very profitable to us but seem to need different strategic approaches. We specialise in agribusiness lending. We have been pretty scrappy in the past with our SEO, as it has always been done by me, and as a startup, as everyone knows, the jack of all trades can help and hinder! To date we have done a lot of AdWords (and keyword research), so I have a fair idea of what keywords I am after. Some are low competition and extremely profitable to us, but they differ in who our competitors are, how difficult they would be to rank for, and which strategy to use. For example, "agribusiness" is used by all the major banks, but they only provide agribusiness finance via their own products; as brokers we can compare all products, and since agribusiness can be quite complex this is a major point of difference for us and brings in a lot of new leads. So my strategy to rank for this keyword would be a national approach, as we provide advice in this space on a national scale, which has worked well via AdWords leads, but I would like to move away from my sole reliance on AdWords. Then there are keywords where we have also had some success on a national scale via AdWords, but the metrics suggest a local perspective (a local regional town) works better, i.e. hobby farm loan, rural finance, even home loans (when there is no other local competitor in a small town). As we have brokers in other regional towns, this also opens up an opportunity to have internal pages with lots of local signals (i.e. NAP, authority outbound links, local keywords, social signals from local FB groups, etc.).
    But can an internal page compete against a competitor's homepage? For example, I was going to set up an internal Toowoomba page on my site with info about that broker and lots of local points. Or am I best to create another site, i.e. brandname-toowoomba.com.au (still linked from my contact us page for Toowoomba), and focus solely on local for that site (including internal pages to rank locally, i.e. Toowoomba home loans)? The extra benefit is that I would then create another asset if I were to sell the region as a franchise (another discussion). So, my question is: can I mix my strategies without any issues, or should I create separate sites?

    Local SEO | | AgLend
    0

  • Hi, Recently I removed some disallowed parameters from my robots.txt and added the "No URLs" setting in my Search Console URL Parameters tab (as can be seen in the image http://prntscr.com/e997o5). Today I saw the orderby parameter indexed, even though the setting says not to crawl those URLs. Anyone any idea why this is happening? Thank God all those URLs with parameters are canonicalised to their original URLs.

    Technical SEO | | dos0659
    0
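
Worth noting for cases like this: both the URL Parameters setting and robots.txt control crawling, not indexing, so a parameter URL discovered via links can still be indexed, and a URL that is blocked from crawling can't have its canonical tag seen at all. A minimal sketch of the kind of robots.txt rule that was removed, assuming a hypothetical orderby parameter:

```
User-agent: *
# Block crawling of any URL carrying the orderby parameter
Disallow: /*?orderby=
Disallow: /*&orderby=
```

Leaving such URLs crawlable with a canonical back to the clean URL, as described in the question, is generally the more reliable consolidation signal.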

  • Where do I put the rel=canonical when the search.cfm page (using URL rewriting) is the one and only page, with URL parameters controlling sort, filter, view, etc.? Do I just put the rel=canonical at the top of the search.cfm page? The duplicate content issues I am getting are: https://www.domain.com/tx/austin/ https://www.domain.com/tx/austin/?d=25&h=&s=r&t=&v=l&a= Just want to be clear, since Moz Pro is picking up both URLs but it's really only one file, search.cfm. Thanks in advance for your help.

    Technical SEO | | ErnieB
    0
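
The usual pattern for a single template serving many parameter variations is one canonical tag in the template's head output, always pointing at the clean rewritten URL for the location being rendered. A minimal sketch using the Austin URLs from the question:

```html
<!-- Emitted in the <head> by search.cfm on every request, built from the
     rewritten path and stripped of all query parameters -->
<link rel="canonical" href="https://www.domain.com/tx/austin/" />
<!-- The same tag is rendered for /tx/austin/?d=25&h=&s=r&t=&v=l&a=, so both
     URL variants declare the single canonical target -->
```

The physical file count doesn't matter to crawlers; each distinct URL is a distinct page until the canonical says otherwise.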

  • http://www.dickhead.hotel-online.com http://www.french.hotel-online.com http://www.sampsucks.hotel-online.com http://www.ru.hotel-online.com edit** adding http://www.travelocity.hotel-online.com and http://www.costar.hotel-online.com We also have the normal http://www.hotel-online.com backlink, which I'll probably keep. It seems you can post press releases about hospitality topics on this site, and I found some backlinks from abcdefg.hotel-online.com-style subdomains, so I was wondering if I should disavow them. I tried a Google search for these, and none of them have a bad spam score according to Moz, and SEMrush gives them a low toxic score too. They came up in an SEMrush backlink crawl, but not a Moz backlink crawl. Would it be okay to disavow these?

    Link Building | | donnieath
    0

  • Hello, I'm trying to rank my page for this keyword: "Rehab Royal Oak Michigan". But when you use that in a sentence, it just doesn't work. To make it grammatically correct I need to use "Rehab in Royal Oak Michigan". But now I'm using a stop word, and I'm not sure if Google counts this as a synonym or not. Can someone help? Thanks!!

    Technical SEO | | Michael_Myles_Himself
    0

  • Site is too slow. I am seeing this new code more than 1,000 times on my home page. What should I do now? My site: http://a1stlucie.com/

    On-Page Optimization | | Beachflower
    0

  • Hey community, This is our first topic on Moz, great to see the community growing and growing! We have a question and can't figure out the reason(s); we hope you can help us. We are watching the "compare link metrics" for the domain https://www.voetbalwedden.net and see strange details. See our report on: https://moz.com/researchtools/ose/comparisons?site=https%3A%2F%2Fwww.voetbalwedden.net
    Page Specific Metrics: Internal Equity-Passing Links = 0, Total Internal Links = 0
    Subdomain Metrics: Internal Equity-Passing Links = 0, Total Internal Links = 0
    We have more than 30K pages indexed by Google, a great site structure, and well-structured internal linking. We think these stats are not correct. Are we right? And what could we do to improve them?

    Getting Started | | bettingfans.com
    0

  • Hi All, For my ecommerce site's mobile category page I have redesigned the page in two different ways, so there is one original page and two new designs. Now I want to do A/B testing with Google Experiments. I want to measure the performance of the page via pageviews, bounce rate, exit rate, conversion rate, add to basket, etc. But as the objective for the experiment I can select only one thing: either pageviews, or bounces, or transactions, or a goal. So my queries are: 1) Can I not select all objectives together? 2) Or do I have to create many experiments for the same page, selecting each objective in turn? 3) Exit rate and add-to-basket objectives are not in the experiment list, so what then? Thanks!

    Reporting & Analytics | | dsouzac
    0

  • Hi There, I have a client with EKM powershop. I am trying to figure out a way to either setup goal tracking (however I don't know what the final destination page is) or event tracking. I know I can do the ecommerce tag thing, but I am always worried about breaking the tracking code. Any ideas?

    Reporting & Analytics | | nezona
    0

  • Hi Moz community, Let's say a website has multiple subdomains with hundreds or thousands of pages. Generally we will mention the "primary keyword" and "brand name" on every page of the website. Can we do the same on all pages of the subdomains to increase the authority of the website for this primary keyword in Google? Or will it end up having a negative impact if Google treats it as duplicate content, with the same keyword and brand name mentioned on every page of the website and on all pages of the subdomains? Thanks

    Intermediate & Advanced SEO | | vtmoz
    0

  • Morning all, I have a client who is planning to expand their product range (online dictionary sites) to new markets and are considering the acquisition of data sets from low ranked competitors to supplement their own original data. They are quite large content sets and would mean a very high percentage of the site (hosted on a new sub domain) would be made up of duplicate content. Just to clarify, the competitor's content would stay online as well. I need to lay out the pros and cons of taking this approach so that they can move forward knowing the full facts. As I see it, this approach would mean forgoing ranking for most of the site and would need a heavy dose of original content as well as supplementing the data on page to build around the data. My main concern would be that launching with this level of duplicate data would end up damaging the authority of the site and subsequently the overall domain. I'd love to hear your thoughts!

    Technical SEO | | BackPack85
    1

  • I monitor our positions for a few keywords but they are very unstable. One day they will be at position 1, 2 or 3 and then all of a sudden not even on Google. We use an IP redirect to guide users to the correct site and I suspect this may be a cause. Could anybody shed some light? Thanks

    Intermediate & Advanced SEO | | JordanRowlands
    0

  • Hi, SEO wizards! My company has a company blog on Medium (https://blog.scratchmm.com/). Recently, we decided to move it to our own site to drive more traffic to our domain (https://scratchmm.com/blog/). We re-published all the Medium posts to our own website. If we keep the Medium blog posts, will this be considered duplicate content, and will our website rankings be affected in any way? Thank you!

    White Hat / Black Hat SEO | | Scratch_MM
    0

  • Picture this - if you have a spirit for adventure!
    1. Client builds AlphaDomain.com
    2. Then builds a number of backlinks to AlphaDomain.com
    3. Client also creates a number of 301 redirects from several older domains to AlphaDomain.com
    4. Client then changes AlphaDomain.com to BetaDomain.com
    5. They create 301 redirects from AlphaDomain.com to BetaDomain.com
    6. But then... they 'park' AlphaDomain.com (i.e. no longer accessible)!
    7. About one year later, client changes a whole bunch of URLs on BetaDomain.com without keeping track of the changes. Thankfully, the hosting service (Shopify) automatically creates some redirects, but it's more by accident than design!
    Questions: After step 6 above, are the 301 redirects created in steps 3 and 5 now totally redundant and broken? If AlphaDomain.com no longer exists, surely all redirects to and from this domain are broken? Or can they be recovered? What happens to all the backlinks originally created in step 2? Finally, can anything be done to recover the lost URLs in step 7? Yes. What a mess!

    Intermediate & Advanced SEO | | muzzmoz
    0

  • Hi, I have a website where all the original URLs have a rel canonical back to themselves. This is kind of a fail-safe mode: if a parameter occurs, the URL with the parameter will have a canonical back to the original URL. For example, this URL https://www.example.com/something/page/1/ has this canonical https://www.example.com/something/page/1/ which is the same, since it's an original URL. This URL https://www.example.com/something/page/1/?parameter has this canonical https://www.example.com/something/page/1/ because, like I said before, parameter URLs get a rel canonical back to their original URLs. So https://www.example.com/something/page/1/?parameter and https://www.example.com/something/page/1/ both have the same canonical, which is https://www.example.com/something/page/1/. I'm telling you all this because when Rogerbot tried to crawl my website, it reported duplicates. This happened because it read the canonical of the original URL and the canonical of the URL with the parameter and saw that both were pointing to the same canonical. So, I would like to know if Googlebot treats canonicals the same way, because if it does then I'm full of duplicates 😄 Thanks.

    Technical SEO | | dos0659
    0
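
The grouping behaviour described above can be sketched programmatically. This is a hypothetical illustration (not Googlebot's actual logic; the function name is invented) of how a crawler might cluster URLs by their declared canonical: a self-referencing canonical plus a parameter URL pointing at the same target collapse into one canonical group, i.e. one URL chosen for the index rather than two competing duplicates.

```python
# Hypothetical sketch of canonical grouping, not Googlebot's real implementation.

def canonical_groups(pages):
    """Map each declared canonical target to the list of URLs declaring it."""
    groups = {}
    for url, canonical in pages.items():
        groups.setdefault(canonical, []).append(url)
    return groups

# The two URLs from the question: the original (self-canonical) and its parameter twin.
crawled = {
    "https://www.example.com/something/page/1/": "https://www.example.com/something/page/1/",
    "https://www.example.com/something/page/1/?parameter": "https://www.example.com/something/page/1/",
}

groups = canonical_groups(crawled)
print(len(groups))  # → 1: one canonical group, so one indexable URL, not two duplicates
```

In this model, "duplicates" reported by a crawler are often just members of the same canonical group; whether a search engine honors the hint is its own decision.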

  • Hi guys, I am new to SEO and I have a question for you guys. We created a sitemap for our website. I was thinking of creating a sitemap link on our homepage. Do you think it's a good idea? Would this help us in terms of ranking improvements? Or would help with anything at all? Thanks

    Algorithm Updates | | ahmetkul
    0

  • Currently, the way our site is set up, our clients generally visit our homepage and then log in through a separate page on a subdomain, or they read our blog/support articles that are also on separate subdomains. From my understanding, this can be counted as a bounce. I know this sort of site structure isn't ideal, but with our current dev resources and dependencies, fixing it isn't going to happen overnight. Regardless, what would be the easiest way to implement this fix within the Google Analytics code? E.g. if someone visits our site at X.com and then logs in at portal.X.com, I don't want to count that as a bounce. Any insight is appreciated! Thanks

    Reporting & Analytics | | KathleenDC
    0
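
A minimal sketch of the analytics.js configuration this usually involves, assuming Universal Analytics and the hypothetical hosts from the question (the property ID is a placeholder):

```javascript
// In the analytics.js snippet on BOTH x.com and portal.x.com, send hits to the
// same property. 'auto' writes the _ga cookie at the root domain (x.com), so
// the session continues when the visitor moves to the subdomain.
ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');
// With hits from both hosts in one property, homepage -> portal.x.com becomes a
// two-hit session rather than a single-page bounce. Also add x.com to the
// property's Referral Exclusion List so the subdomain hop doesn't start a new session.
```

This is a sketch of the common setup, not a drop-in fix; if the subdomain currently uses a different property, that's the first thing to reconcile.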

  • (Note, the site links are from a sandbox site and has very low DA or PA) If you look at this page, you will see at the bottom a lengthy article detailing all of the properties of the product categories in the links above. http://www.aspensecurityfasteners.com/Screws-s/432.htm My question is, is there more SEO value in having the one long article in the general product category page, or in breaking up the content and moving the sub-topics as content to the more specific sub-category pages? e.g. http://www.aspensecurityfasteners.com/Screws-Button-Head-Socket-s/1579.htm
    http://www.aspensecurityfasteners.com/Screws-Cap-Screws-s/331.htm
    http://www.aspensecurityfasteners.com/Screws-Captive-Panel-Scre-s/1559.htm

    Moz Pro | | AspenFasteners
    0

  • In the case of an eCommerce store with a large catalogue of branded goods, the inventory is constantly being adjusted as products become discontinued. Each year most fashion brands have 2 or 3 launches, and at the same time they will delete some (not all) of the previous years' collections. Once we have sold through the remaining inventory of last season's products, the question is how to proceed: a) delete products, to avoid customers landing on a page only to be disappointed that the product is no longer available to purchase; or b) keep products, but mark them as discontinued / no longer available and show a link to a similar product if applicable. I am coming around to the opinion that b) provides a better user experience. But will this growing catalogue of old products (pushed to the bottom of the category page) help keep the site's content full and have SEO advantages? If so, does that help confirm b) as the best choice?

    Web Design | | seanmccauley
    0

  • I'm a home based business. I know Google has cracked down on PO boxes and UPS locations. What is one to do if they work from home but don't conduct business with clients from their home office? How can I get an address that's Google-approved?

    Reviews and Ratings | | FathomMarketing
    1

  • I have about 20,000 items on my Magento Site. I used a plugin to add structured data. If I use the Webmaster Tools Structured Data Tester everything shows up perfect. There are no errors on any page that I have spot checked. My total items increased to about 2500, but has now started dropping. The numbers have dropped to about 725 over the last few weeks. What can I check?

    Intermediate & Advanced SEO | | Tylerj
    0

  • Hi Mozzers, I'm seeing more and more companies improving their quality score by including information about their competitors on their website, when driving traffic from competitor brand terms. For example, for 'Yahoo Mail' related terms, Zoho drive traffic via an ad to this page: https://www.zoho.com/mail/yahoo-mail-alternative.html I'm planning a new campaign targeting competitor keywords and wondered what people think about this approach, and the legalities around talking about and comparing yourself to competitors on your own website?

    Paid Search Marketing | | Zoope
    0

  • We are trying to rank this domain: https://citychurchbloomington.org/ for the phrase "churches in bloomington in". We recently updated our domain name from citychurchfamily.org to citychurchbloomington.org because it made sense for the organization and the end-user searching, and to help our rankings. Currently we are at position #4 on page 1, with three sites ahead of us: churchfinder - in the last year this site came out of nowhere and slowly made its way up to the top spot; high rock - this church had held spot #1 for many years, and we've struggled to challenge its place in the search results and are somewhat unclear why; sherwood oaks - this site had been in spot #2 for many years, and at times we've been able to challenge its position, but it has held fairly tightly at spot #2 in the past, and at #3 since churchfinder rose up in the last year. We've done competitive research and made some changes to our meta title, description, and h1 tag, but we're looking to make our next move to try and break into this top tier of results. I'm asking the community here for any insight/suggestions into what kind of move we should be exploring or making at this stage to move up. Sincerely, Andrew

    Local Website Optimization | | a_toohill
    0

  • A few months ago my client's website was hacked which created over 20,000+ spammy links on the site. I dealt with removing the malware and got google to remove the malware warning shortly within a week of the hacking. Then started the long process to do 301 redirects and disavowing links under Webmaster tools over these few months. The hacking only caused a slight drop in rankings at the time. Now just as of last week the site had a dramatic drop in rankings. When doing a keyword search I noticed the homepage doesn't even get listed on Google Maps and for Google Search instead the inner pages like the Contact Us page show up instead of the homepage. Does anyone have any insight to the sudden drop happening now and why the inner pages are ranking higher than the homepage now?

    Algorithm Updates | | FPK
    0

  • Hey, folks! I used a 301 redirect to try to increase my rankings, but when I applied this method my website dropped 55 positions. Can anyone suggest how to recover?

    Technical SEO | | SumitJiGupta
    0

  • Hi everyone, Here's a duplicate content challenge I'm facing: let's assume that we sell brown, blue, white and black 'Nike Shoes model 2017'. For technical reasons, we really need four URLs to properly show these variations on our website. We find substantial search volume for 'Nike Shoes model 2017', but none for any of the color variants. Would it be theoretically possible to show pages A, B, C and D on the website and: give each page a canonical to page X, the 'default' page that we want to rank in Google (a product page with a color selector) but which is not directly linked from the site; and mention page X in the sitemap.xml (and not A, B, C or D), so the 'clean' URL gets indexed and the color variations do not? In other words: is it possible to rank a page that is only discovered via the sitemap and canonicals?

    Intermediate & Advanced SEO | | Adriaan.Multiply
    1
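
Mechanically, a page referenced only by canonical tags and the sitemap can be discovered and indexed, though ranking a page with no internal links is a separate question. A minimal sitemap.xml sketch for the setup described, using a hypothetical URL for page X:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only the default page X is listed; color variants A-D are omitted here,
       and each of them carries a canonical pointing at this URL -->
  <url>
    <loc>https://www.example.com/nike-shoes-model-2017/</loc>
  </url>
</urlset>
```

One caveat worth flagging: canonicals are hints, and since A-D aren't exact duplicates of X, a search engine may choose to ignore them and index a variant instead.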

  • We have a glossary section on our website with hundreds of terms. At the moment we have it split by letter, e.g. one page with all the terms starting with A, another for B, etc. I am conscious that this is not the best way to do things, as not all of these pages are being indexed and the traffic we get to them is very low. Any suggestions on what would be the best way to improve this? The two ideas I have at the moment are: 1) have every term on a separate page, while ensuring there is enough copy for each term; 2) leave it as is, but have the URL change as the user scrolls down the page, e.g. the first section would be www.website.com/glossary/a/term-1, then once the user scrolls past this term and onto the next one the URL would change to www.website.com/glossary/a/term-2

    Intermediate & Advanced SEO | | brian-madden
    0

  • I work with a client who is about to launch a local landing page for one of their locations. They're worried that the new local landing page will cannibalize some of the keyword rankings for the homepage. Any advice on how to have a local presence but still drive people to the more valuable homepage?

    Local Website Optimization | | jrridley
    0

  • Hi all, We have a lot of FAQ sections on our website, split across different places depending on products, technologies, etc. If we want to optimize our content for Google's Featured Snippets, Voice Search, etc., what is the best option: to combine them all into one FAQ section? Or does it not matter to Google that this type of content is not all in one place? Thank you!

    Algorithm Updates | | lgrozeva
    0

  • Hi all, I was wondering if it is possible for Google to disregard the canonical tag if, for example, they decide it is wrongly applied based on behavioural data. On the NativeScript blog's individual blog posts there is a canonical tag pointing to www.nativescript.org/blog/details (screenshot - http://prntscr.com/e8kz5k). In my opinion it should not be there, and I put in a request to our engineering team for removal some time ago. Interestingly, all blog posts are indexed and get a decent amount of organic traffic despite the tag. What do you think? Could it be that Google disregards the tag based on usage data from, let's say, GA? Thanks, Lily

    Intermediate & Advanced SEO | | lgrozeva
    0

  • Hi Moz community, Our website's homepage has dropped almost 50 positions for our main keyword while holding steady for other keywords, and other pages are performing consistently for their keywords. We haven't made any changes to the website. What could be the reason for the homepage dropping for the main keyword alone? And could the recent unconfirmed algo update have anything to do with this? Thanks

    Algorithm Updates | | vtmoz
    0
