
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you don't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Our site is secure, but when I ask Moz to crawl it by giving the root domain including https, Moz insists on crawling the non-secure version. How do I force it to crawl the secure version?

    Link Explorer | | media1234
    0
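
A common reason a crawler ends up on the non-secure version is that the http:// URLs still return 200 instead of redirecting. A minimal sketch of a site-wide redirect, assuming an Apache server with mod_rewrite (nothing here is specific to this site):

    # .htaccess: 301 every http:// request to its https:// equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

With a redirect like that in place, and the https root domain entered in the campaign settings, the crawler should follow the 301 to the secure version.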

  • I just did SEO audits of approximately 50 websites in the tourism sector. Nearly all had poor Google PageSpeed ratings, partly down to slideshows, large images and videos in headers, among other factors. I also feel those elements are poor for usability; I say get the content people need to engage with in front of them as soon as possible. Are there any stats or studies that can provide insight on this? I've been telling those with these designs to keep an eye on bounce rates and let that guide them.

    Web Design | | anndonnelly
    0

  • We have a number of historical domain names that we are thinking of 301 redirecting to industry-relevant domains.
    Currently the domains we wish to redirect are not active and have been down since March 2018.
    As far as we know there is no bad reputation on these domains, but we think there are still links out there in the wild on possibly relevant blog posts. Would there be any negative effect on the target domain? Thanks

    Technical SEO | | barry.oneil
    0

  • Hi, all. Wondering if someone could give me a pointer or two, please. I cannot seem to get Google or Bing to crawl my sitemap. If I submit the sitemap in WMT and test it, I get a report saying 44,322 URLs found. However, if I then submit that same sitemap it either says Pending (in old WMT) or Couldn't fetch in the new version. This 'Couldn't fetch' is very puzzling, as it had no issue fetching the map to test it. My other domains on the same server are fine; the problem is limited to this one site. I have tried several pages on the site using the Fetch as Google tool and they load without issue; however, try as I may, it will not fetch my sitemap. The sitemapindex.xml file won't even submit. I can confirm my sitemaps, although large, work fine; please see the following as an example (minus the spaces, of course, didn't want to submit and make it look like I was just trying to get a link): https:// digitalcatwalk .co.uk/sitemap.xml https:// digitalcatwalk .co.uk/sitemapindex.xml I would welcome any feedback anyone could offer on this, please. It's driving me mad trying to work out what is up. Many thanks, Jeff

    Intermediate & Advanced SEO | | wonkydogadmin
    0

  • I want to increase DA/PA for our company website, Tandem NZ. How can I increase it?

    White Hat / Black Hat SEO | | Tandem-Digital
    0

  • Dear all, my niche site was attacked by malware on 1st March 2018. The hacker injected a PHP file on my blog page, with injected links like: mydomain.com/blog/dmy4xa.php? I scanned my site with Wordfence, identified all the malware code, and manually cleaned the whole site, including the database. My site was completely free of malware; I removed all the malware links from webmaster tools and even blocked my blog page with robots.txt. But new malware links get indexed every week, so I have to remove those links every week. Because of this issue I decided to rebuild my site. I rebuilt it on another server, then flashed my current server and migrated the site from that server on 10th January 2019. I waited a month for the malware links to deindex, but new links are still being indexed every week. I discouraged search engines from indexing the site for over a week and even deleted the site from Google webmaster tools, with all properties, as well as the verification file from the server. Over a week later, the links are still showing. I am tired of deleting malware links every week; I need a permanent solution. Please give me a perfect solution for this malware link indexing. Google indexed about 100 URLs. After that I cleaned my site with some tools and it was free of malware. But ne…

    Technical SEO | | Gfound123
    0

  • Client has 3 locations in NYC: 1 is on the east side,
    2 of them are a block apart on the west side (52nd & 51st Streets). When you search the business name, you only see 2 of the 3 listings: 1 on the east side and 1 of the 2 on the west side. On the west side, only the one with more reviews shows up. The semi-hidden location still exists; you can find it if you type in the exact full name of the branch (it has the brand name + Midtown West vs its neighbor a block away that's brand name + Hell's Kitchen). Otherwise, it's invisible. The Hell's Kitchen location that appears has 3000+ reviews. The hidden one (Midtown West) has only 250+ reviews. In the past, all 3 would show up. How do we get all 3 to show up again, at the initial, zoomed-out view?

    Reviews and Ratings | | jaimeurteaga
    0

  • Hi there, We recently changed our domain from .COM to .NET so that all our subdomains from external pages matched. Right now in Google Search Console we have our new .NET website being tracked, but in GA we are still tracking .COM. It is also causing issues with Moz crawling our site because of the .COM/.NET discrepancy. My question is: what is the best way to change our Google Analytics from .COM to .NET without losing historical data, and what considerations do we need to address before implementing this? Our team was concerned that just downloading the old data would be too vast and we wouldn't be able to continue manipulating it dynamically in GA. Thanks!!

    Reporting & Analytics | | cPanel-LLC.
    0

  • So I have been at this for years! I cannot get Google to improve the rank of my travel site. The site has great, relevant content that is constantly updated; it's optimized, has good page speed, and is active on social media. I have added backlinks where I could. I changed the domain name about 4 years ago, which probably impacted my rankings at the time. Moz just did a walk-through with me and couldn't really suggest any improvements. I remain with a low domain authority and consistently place under my competitor on Google. Last resort is to buy backlinks through Fiverr. Is that a big mistake? https://tamarindobeachinfo.com

    Branding | | artsp
    0

  • Hi, Moz is alerting us about URLs being too long, titles being too long, and duplicate content. These are mostly for pages with categories of content, or the author bio pages, i.e. pages that are included by default with the template we're using in HubSpot. So there isn't a quick way to make these changes, and anyway the pages are low-value. So my question is: how much are these issues impacting our SEO and domain authority? If these are pages no one visits (like the category and author pages), is there any reason this would be harming our SEO efforts? Thanks!

    Moz Pro | | ASCI-Marketing
    0

  • Hi everyone, I found this anomaly in my Google Search Console data. As you can see from the picture, the impressions skyrocketed in a few days and then sort of plateaued. The clicks remained almost the same because the keyword is highly informational (Wikipedia takes all the traffic). The keyword has around 650,000 searches per month. By the way, we are not doing off-site SEO. I would like to hear your opinion on what could have caused this. A Google update, maybe? (Screenshot attached: 5lERKef)

    Reporting & Analytics | | filip-balun
    1

  • Hi, My website used to receive a lot of traffic, and then I asked an SEO company who contacted me to do some work on the site. Since then, the site has dropped in the rankings and our traffic has dropped like a lead balloon. Instead of receiving thousands of visitors, today we have received 10. I am finding that I have articles no longer in Google, but they are in Yahoo. Here is an example of an article that was once popular: https://www.in2town.co.uk/travel-advice/how-to-save-money-booking-a-holiday-through-a-travel-agent/ I have tried everything and now I do not know where to turn. I am not sure what they have done, but everything has now failed. We have not been penalized, and our hosting company has said they cannot find anything wrong. Due to this problem we have stopped writing articles and have spent all our time trying to work out what has gone wrong. If anyone can give me advice and point me in the right direction, I would be in your debt.

    Content Development | | travelmagazine
    0

  • Hi guys, I was wondering if responding to negative Yelp reviews would in turn boost Yelp's SEO ranking for my business name. I'm trying to direct attention away from Yelp, given that we've had little success having real clients get through the review filter. We unfortunately have nasty competition that seems to be leaving fake negative reviews incessantly.

    Local Listings | | Dsv
    2

  • So I have a site that I want to shut down, http://vowrenewalsmaui.com, and redirect to a dedicated Vow Renewals page I am making on this site here: https://simplemauiwedding.net. My main question is: I don't want to lose all the authority of the pages, and if I just redirect the site using my domain registrar's 301 redirect, it will only redirect the main URL, not all of the supporting pages, to my knowledge. How do I avoid losing all the authority of the supporting pages and still shut down the site and close down my site builder? I know if I leave the site up I can redirect all of the individual pages to corresponding pages on the other site, but I want to be done with it. Just trying to figure out if there is a better way than I know of. The domain is hosted through GoDaddy.

    Intermediate & Advanced SEO | | photoseo1
    0
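
A domain-level redirect can forward every old URL, not just the homepage, as long as the old domain keeps resolving to a server you control. A minimal sketch, assuming Apache on the old domain; the /vow-renewals/ path is a placeholder for whatever the new dedicated page ends up being:

    # .htaccess on vowrenewalsmaui.com: 301 every request to the new page
    # (/vow-renewals/ is a hypothetical path, not the confirmed URL)
    RewriteEngine On
    RewriteRule ^ https://simplemauiwedding.net/vow-renewals/ [R=301,L]

If the registrar's forwarding really only covers the root URL, a server-level rule like this (or per-URL mappings to the closest matching pages) is what carries the supporting pages across.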

  • Domain migration nightmare - what is wrong?

    Intermediate & Advanced SEO | | PSOM101
    0

  • Moz is throwing an error for our office pages (10 office pages in the format /office/location-1; /office/location-2 etc) but the content is different. How should we handle the canonical tag? Thanks

    On-Page Optimization | | AztekMedia
    0
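
If the ten office pages genuinely have distinct content, one common approach is a self-referencing canonical on each of them rather than pointing them all at a single page. A minimal sketch, using a hypothetical domain since the real one isn't given:

    <!-- on /office/location-1 -->
    <link rel="canonical" href="https://www.example.com/office/location-1">
    <!-- on /office/location-2 -->
    <link rel="canonical" href="https://www.example.com/office/location-2">

Pointing every location page's canonical at one "main" office page would ask search engines to ignore the others, which is usually not what you want when the content differs.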

  • Hi. First of all, I should say that I have this error in the old Webmaster Tools, not the new one. I have two WordPress installs, one in the root and one in a subfolder. Today I checked Webmaster Tools and saw that I had 100 (404) errors found a few days ago. My root WordPress is OK, but the subfolder WordPress has errors. Let me show you by example: http://example.com/subfolder/article15245. I had an error for this page: http://example.com/article15245, as if the subfolder had been deleted. I checked my links, but all of them were OK and linked to the right URL. Unfortunately these errors don't have a "linked from" section.

    On-Page Optimization | | meysamzare711236541
    0
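
If the 404s really are the subfolder URLs with the /subfolder/ segment missing, one option is a pattern redirect that puts the segment back while the source of the bad URLs is tracked down. A hedged sketch, assuming Apache and assuming the article URLs follow the pattern in the example:

    # .htaccess in the site root: send /article<ID> to /subfolder/article<ID>
    # (the article\d+ pattern is an assumption based on the example URL)
    RedirectMatch 301 ^/(article\d+)$ /subfolder/$1

This hides the symptom rather than explaining where the wrong URLs come from, but it stops them returning 404 in the meantime.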

  • Hi friends, Our website pages without a trailing slash are not redirecting to the versions with a slash, and vice versa. Both versions return a 200 response code. Both versions point to the with-slash URLs via rel=canonical tags. Is this the right setup, or do we need to redirect one version to the other (with slash or without)? Thanks

    Algorithm Updates | | vtmoz
    0
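
Canonical tags alone leave both versions live and returning 200; the usual setup also 301s one form to the other so only the canonical form answers. A minimal sketch that forces a trailing slash, assuming Apache with mod_rewrite (real files are left untouched):

    # .htaccess: append a trailing slash to URLs that are not physical files
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_URI} !/$
    RewriteRule ^(.*)$ /$1/ [R=301,L]

Keeping the rel=canonical tags pointed at the with-slash URLs is consistent with a redirect like this.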

  • Hi, I would like to redirect a small website that contains info about a specific project temporarily to a specific url about this project on my main website. Reason for this is that the small website doesn't contain accurate info anymore. We will adapt the content in the next few weeks and then remove the redirect again. Should I set up a 301 or a 302? Thanks

    Intermediate & Advanced SEO | | Mat_C
    1

  • We are a kids' website with fun learning content dedicated to kids aged 6-14 years. Now we want to start a blog page for parents with parenting tips and information useful to parents. For this, should we choose a subdomain or a subdirectory?

    SEO Learn Center | | Mocomi
    0

  • Are these questions people have about something? For example: for the keyword "title tag", would a related topic be the length, because people look for the length of a title tag? Thank you,

    Intermediate & Advanced SEO | | seoanalytics
    0

  • Hello, For an informational query, can the answer people are looking for have multiple intents, or will it always have 1 intent? For example, for "New York" the intent is probably "where?". On a longer query such as "Provence bike tour", what is the intent: where, what, why, how to, or when? Thank you,

    Intermediate & Advanced SEO | | seoanalytics
    0

  • For security reasons we are now routing traffic through an external firewall cum CDN. Our server and domain IPs remain the same, but any request is routed through an external IP, which then forwards the traffic. Would our rankings be affected because of IP changes? Thanks Sam

    Technical SEO | | samgold
    0

  • Best practice?? I have a client that wishes to get found for services in several towns across the UK, but they only have 1 physical location. I have so far created a blog (I use EasyBlog) and put up a list of these towns, then added tags with the town names (this means each tag gets a URL too). I also need to then monitor this in Moz Pro somehow. Alternatively I could create web pages with additional information and give each URL the town name; however, I think the tags will help. Any advice welcome.

    Local SEO | | CORSOLUTIONS
    1

  • Hi, I have a site where the robots.txt has: Disallow: /*? The problem is we need the following indexed: ?utm_source=google_shopping. What would the best solution be? I have read:
    User-agent: *
    Allow: ?utm_source=google_shopping
    Disallow: /*?
    Any ideas?

    Intermediate & Advanced SEO | | vetofunk
    0
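
One detail worth noting: robots.txt patterns are applied from the start of the URL path, so an Allow line that begins with "?" is unlikely to match the URLs you care about. A hedged sketch of the more usual form, relying on Google's documented rule that the most specific (longest) matching directive wins:

    User-agent: *
    # allow parameterised URLs whose query string starts with utm_source=google_shopping
    Allow: /*?utm_source=google_shopping
    # block all other URLs containing a query string
    Disallow: /*?

If utm_source isn't always the first parameter, a broader pattern such as Allow: /*utm_source=google_shopping would catch it anywhere in the URL.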

  • At Magnet.me we track the items people are viewing in order to optimize our recommendations. As such, we fire POST requests back to our backends every few seconds, when enough user-initiated actions have happened (think of scrolling, for example). In order to stop bots from distorting statistics, we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests during its JavaScript crawling. In a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyhow, and it is quite a number, we considered reducing this for bots. However, we have several questions about this:
    1. Do these requests count towards crawl budget?
    2. If they do, and we'd want to prevent this from happening: what would be the preferred option? Either preventing the request in the frontend code, or blocking the request using a robots.txt line? The latter question arises from the fact that an in-app block for the request could lead to different behaviour for users and bots, and maybe Google could penalize that as cloaking. The latter is slightly less convenient from a development perspective, as all logic is spread throughout the application. I'm aware one should not cloak, or make pages appear differently to search engine crawlers. However these requests do not change anything in the page's behaviour; they purely send some anonymous data so we can improve future recommendations.

    Technical SEO | | rogier_slag
    0
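
On the robots.txt option: disallow rules apply to the URL being fetched, so blocking the tracking endpoint's path should stop Googlebot's background POSTs without changing what the pages show. A sketch with a hypothetical endpoint path, since the real one isn't in the question:

    User-agent: *
    # hypothetical path of the recommendations-tracking endpoint
    Disallow: /api/track-view

Blocking a background analytics call this way is not cloaking; the rendered page content stays identical for users and bots.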

  • We have an ecommerce site, we'll say at https://example.com. We have created a series of brand-new landing pages, mainly for PPC and Social, at https://sub.example.com, but we would also like these to get indexed. These are built on Unbounce, so there is an easy option to simply uncheck the box that says "block page from search engines"; however, I am trying to speed up this process and also do it the best/correct way. I've read a lot about how we should build landing pages in a sub-directory, but one of the main issues we are dealing with is long page load time on https://example.com, so I wanted a kind of fresh start. I was thinking a potential solution to index these quickly/correctly was to make a redirect such as https://example.com/forward-1 -> https://sub.example.com/forward-1, then submit https://example.com/forward-1 to Search Console, but I am not sure if that will even work. Another possible solution was to put some of the subdomain links on the root domain, say right on the pages or in the navigation. Also, will I definitely be hurt by "starting over" with a new website, even though my MozBar on my subdomain https://sub.example.com shows the same domain authority (DA) as the root domain https://example.com? Recommendations and steps to be taken are welcome!

    Intermediate & Advanced SEO | | Markbwc
    0
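
Two low-risk ways to speed up discovery, rather than redirect tricks: link to the subdomain pages from already-crawled pages on the root domain, and give the subdomain its own sitemap, referenced from its robots.txt and submitted in Search Console (the subdomain can be added there as its own property). A sketch, with file names assumed:

    # https://sub.example.com/robots.txt (hypothetical file)
    User-agent: *
    Allow: /
    Sitemap: https://sub.example.com/sitemap.xml

Whether submitting a redirecting URL such as https://example.com/forward-1 helps is harder to predict; the sitemap and internal links are the more conventional signals.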

  • I want to launch a 2-tier campaign. I have been building high-quality backlinks to my site, meaning sites relevant to my niche, quality content (more than 700 words), high domain authority and so on. But after doing in-depth research I realized half of my backlinks are almost invisible to Google, and the other ones have just a few links and little traffic. So I decided to take some actions: build tier-2 links to my backlinks in order to make them visible, and boost their social signals with the same goal in mind (social shares, likes, comments and so on). So my questions are: Does anyone have experience with this? What kind of results did you get in the past? Is this useful? Thanks and regards

    Social Media | | Roman-Delcarmen
    0

  • Hello Moz friends, I am new to the tool and I was wondering if anybody has a best practice for international markets. I used to work with a different tool before and handling international markets has definitely been a challenge for it. What is the best way to set up campaigns/ keyword lists? By country? By topic? How helpful is the keyword explorer and reporting for international markets? I really appreciate your help.

    International SEO | | LisaGerecht
    0

  • Hello, I have a question about meta robots="index, follow" and rel=canonical on category page pagination. Should the sorted page be <meta name="robots" content="index,follow"> since the rel="canonical" is pointing to a separate page that is different from the URL? Any thoughts on this topic would be awesome. Thanks.
    Main Category Page
    https://www.site.com/category/
    <meta name="robots" content="index,follow">
    <link rel="canonical" href="https://www.site.com/category/">
    Sorted Page
    https://www.site.com/category/?p=2&dir=asc&order=name
    <meta name="robots" content="index,follow">
    <link rel="canonical" href="https://www.site.com/category/?p=2">
    As you can see, the meta robots is telling Google to index https://www.site.com/category/?p=2&dir=asc&order=name, yet saying the canonical page is https://www.site.com/category/?p=2.

    Intermediate & Advanced SEO | | Choice
    0

  • Hi everyone, So I have one project where I'm planning to move the current content to a new domain, for two reasons: 1. It seems the current domain has some Google penalty (backlink-related, not manual). 2. The client wants rebranding and already has a domain with the new brand name. So, as the content is high quality and there is no content-related penalty from Google, what would be the best way to migrate the existing content without passing on any penalty AND without Google treating it as duplicate content? If I do a 301 I suspect any penalty there is might follow; if I just copy the existing content it won't be original content. What is the best solution here? Thanks

    Intermediate & Advanced SEO | | joelsemy
    0

  • I have a website for my small business, and hope to improve the search results position for 5 landing pages. I recently modified my website to make it responsive (mobile friendly). I was not able to use Bootstrap; the layout of the pages is a bit unusual and doesn't lend itself to the options Bootstrap provides. Each landing page has 3 main divs: one for desktop, one for tablet, one for phone.
    The text content displayed in each div is the same. Only one of the 3 divs is visible; the user's screen width determines which div is visible. When I wrote the HTML for the page, I didn't want each div to have identical text. I worried that when Google indexed the page it would see the same text 3 times and would conclude that keyword spamming was occurring. So I put the text in just one div, and when the page loads jQuery copies the text from the first div to the other two divs. But now I've learned that when Google indexes a page it looks at both the page that is served AND the page that is rendered. And in my case the page that is rendered, after it loads and the jQuery code is executed, contains duplicate text content in three divs. So perhaps my approach of having the served page contain just one div with text content fails to help, because Google examines the rendered page, which has duplicate text content in three divs. Here is the layout of one landing page, as served by the server: [desktop div] 1000 words of text goes here. [tablet div] No text; jQuery will copy the text from div id="desktop" into here. [phone div] No text; jQuery will copy the text from div id="desktop" into here. My question is: Will Google conclude that keyword spamming is occurring because of the duplicate content the rendered page contains, or will it realize that only one of the divs is visible at a time, and the duplicate content is there only to achieve a responsive design? Thank you!

    Web Design | | CurtisB
    0
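
For reference, a stripped-down version of the served markup described above, with jQuery filling the empty containers after load (only the "desktop" ID appears in the question; the other IDs and the exact copy call are assumptions, and jQuery is assumed to already be on the page):

    <div id="desktop">1000 words of text goes here.</div>
    <div id="tablet"></div>
    <div id="phone"></div>
    <script>
      // once the DOM is ready, copy the desktop text into the other two containers
      $(function () {
        var copy = $('#desktop').html();
        $('#tablet').html(copy);
        $('#phone').html(copy);
      });
    </script>

An alternative that sidesteps the question entirely is a single text container whose layout changes via CSS media queries, so the copy only ever exists once in both the served and the rendered DOM.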

  • I have a client with multiple business addresses: 3 across 3 states. From an SEO perspective, what would be the best approach for displaying NAP on the website? So far I've read that it's best to set up 3 GMB accounts pointing to 3 location pages, to use a local phone number as opposed to a 1300 number, and to display all 3 locations in the footer, run of site.

    Local Website Optimization | | jasongmcmahon
    1
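
Alongside the visible NAP, each location page can also carry LocalBusiness structured data so the three addresses stay unambiguous to search engines. A minimal sketch for one of the three location pages; every name, address and number here is a placeholder:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Brand - Sydney",
      "url": "https://www.example.com/locations/sydney/",
      "telephone": "+61 2 5550 0000",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Sydney",
        "addressRegion": "NSW",
        "postalCode": "2000",
        "addressCountry": "AU"
      }
    }
    </script>

One block per location, placed on that location's own page, mirrors the "one GMB listing per location page" advice in the question.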

  • I have a client that just migrated to salesforce commerce cloud. They have a large amount of 410 pages showing up in site crawls. They are all of the /on/demandware.store/ variety. Is there any reason to redirect thousands of these pages? Any value? Or should they be allowed to drop off?

    On-Page Optimization | | bdcseo
    1

  • Hi everyone, Here is my problem. This site: https://247ride.com/town-cars/ ranks for a bunch of really good keywords, such as "lax car service", "car service to lax", etc. The keyword does not appear in the title tag, and only partially in the meta description. The site's DA is 23 and PA is 22, with fewer than 29 links overall and 8 linking domains. Why, in your opinion, is Google ranking this page #1 and #2 for many competitive keywords? I know it's a hard question to ask, but any input would be greatly appreciated. I am working really hard to rank for the same keywords and so far I am at position #11. Thanks in advance, Davit

    Intermediate & Advanced SEO | | Davit1985
    1

  • Analytics is showing a substantial decrease in referring traffic from Google's regional domains like .ca, .co.uk, .de, etc. versus an uptick from .com, starting as of March 2018. Did anyone note when Google stopped directing traffic to its regional domains? Was there any press about it (I couldn't find any)? Using a VPN for different countries, I compared region-specific domain SERPs vs .com and they're pretty much identical. Thanks!

    Algorithm Updates | | Bragg
    1

  • I have 5 URLs that are "missing titles"; however, all 5 are landing pages that were created in Pardot. How would I go about adding the missing titles? Would I need to add them on our website platform or in Pardot?

    Technical SEO | | cbriggs
    0

  • Calling all those with ecommerce SEO chops! What are the high-impact tasks that you would always start your new campaigns with? I've got this far in my thinking: identifying query classes, identifying the intent of each query class and setting up ranking indices; ensuring templated meta page titles match the most valuable query syntax; identifying any issues with crawl budget that might prevent deep product page indexing;
    ???
    ??? Hoping this can be one of those discussion threads that is so useful people bookmark it for future reference! Many thanks!

    Intermediate & Advanced SEO | | QubaSEO
    1

  • There are 2 or 3 URLs and one image file on our website that dozens of toxic domains are linking to. Some of these pages have hundreds of links from 4-5 domains. Rather than disavowing these links, would it make sense to simply break them: change the URLs that they link to and not create a redirect? It seems like this would be a surefire way to get rid of these links. Any downside to this approach? Thanks,
    Alan

    Intermediate & Advanced SEO | | Kingalan1
    1

  • The vast majority of the 140 domains that link to our website are very low-quality directories and other toxic links. Only about 20-30 domains are not toxic (according to Link Research Tools, confirmed by our manual inspection of these links). Would removing some of these links improve our Moz Domain Authority? What if we cannot remove them: can Moz detect a disavow file? In general, would improving the ratio between good-quality and poor-quality links improve domain authority? Thanks,
    Alan

    Moz Bar | | Kingalan1
    2

  • I have an ecommerce store and I am using Moz to get it into the best SEO situation. My question is this: I want to know how important it is to have the targeted keyword actually in the product page URL. I am working on meta titles and descriptions, which is good, but if I start changing all my product URLs, it has a major impact on the work I have to do, since I would have to redo all my product links in ads, all my product URLs in emails, etc. So how much of a part do the URLs play in SEO?

    Intermediate & Advanced SEO | | Bkhoward2001
    0

  • (I searched but didn't find anything similar here.) When you start a campaign, what is your method for finding the best keywords to optimize on your site and to track? Please share the methodologies and tools on Moz that you use. I of course use Keyword Explorer and choose the higher-volume keywords, but I am not sure I am looking in the most effective places. Any discussion on this would be greatly appreciated.

    Moz Pro | | bizmarquee
    1

  • There must be an easy answer to this, but I can't seem to find it. All I want to do is create a segment in Google Analytics that shows all pages and search strings with "orthopaedics" in the title, with pageviews, uniques, etc. If I simply navigate to "All Pages" in Google Analytics and then click Advanced Filters and do an Include Page Contains "orthopaedics", it works just fine (see attached screenshot). But when I try to recreate this as a segment, it pulls in all the other pages the users visited before arriving on the orthopaedics page I want to include, which I don't want. I can manually exclude each URL I don't want, but this is tedious and I feel there must be a simpler method I'm just missing. At the end of the day, I'm trying to create a list of every page and dynamically created query string that includes the word "orthopaedics", to say: doctor X, your orthopaedics section generated X views, and here's a list of the pages. (Screenshot attached: Mm6YTKa)

    Reporting & Analytics | | Patrick_at_Nebraska_Medicine
    0
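
The difference comes from scope: Google Analytics segments are session- or user-scoped, so a segment built on "Page contains orthopaedics" returns every page from sessions that touched an orthopaedics page, while the All Pages filter works row by row. If the same page-level list is needed outside the UI, it can be pulled with a Core Reporting API (v3) query; a sketch in Query Explorer style, with the view ID as a placeholder:

    ids=ga:XXXXXXXX                      # placeholder view (profile) ID
    start-date=30daysAgo
    end-date=today
    metrics=ga:pageviews,ga:uniquePageviews
    dimensions=ga:pagePath,ga:pageTitle
    filters=ga:pagePath=@orthopaedics    # =@ means "contains"
    sort=-ga:pageviews

Inside the UI, a custom report with a Page filter on "orthopaedics" should get to the same list without involving segments at all.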

  • When updating meta titles and descriptions, I'm taking note of whether Google is displaying the set tag or changing it to copy from the page. Does this affect the ranking position if Google is having to change the tag? How much should I worry if Google is choosing to change every other page? Thanks!

    On-Page Optimization | | Omar_aw
    0

  • Hi, I am looking for a bit of advice if possible. In October 2018 we did a site redesign for a website that we had acquired (www.drainageonline.co.uk), but we have lost SO much organic traffic since October (-40/50% each month). I have fixed the crawl errors, including broken redirects, ensuring all the 301s that we put in place are working correctly; ensured the 404 pages are working OK; and implemented canonicals on all pages, along with other bits, but we don't really seem to be seeing any improvement in our organic traffic. Can anyone offer any advice on the above? Thanks, Lea

    Intermediate & Advanced SEO | | GAP_Digital_Marketing
    0

  • Hello, I have a question about a webshop that is closed* on Sundays for religious reasons. Does it affect the ranking results in Google? *closed = a full-screen pop-up, which you cannot click away

    On-Page Optimization | | AdenaSEO
    0

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.



