
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if you can't find one, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Searched but didn't see this exact issue (sorry if I missed it). In my report I'm seeing that we have old URLs in the wild using http. Those requests come to our site and are redirected to the https version. However, if the URL no longer exists, there's a second 301 redirect to the actual page. This obviously causes a redirect chain. I've searched but don't see a clear answer to this. How can I avoid the chain? How can I go from the http URL to the new https URL without incurring the double redirect? "Look for any recurring chains that could be rewritten as a single rule." FWIW, I don't see any http-to-https RewriteConds in my .htaccess and cannot trace where that redirect is generated from.

    Link Building | | qozmiq
    1
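The double hop described above can usually be collapsed by putting a specific old-URL rule ahead of the generic http-to-https rule, so the old http URL jumps straight to its final https destination. A minimal Apache .htaccess sketch, assuming mod_rewrite and using placeholder domain and paths, not the asker's real ones:

```apache
# Sketch: collapse "http old URL -> https old URL -> https new URL" into one hop.
# example.com and the paths are illustrative placeholders.
RewriteEngine On

# Specific rule first: retired path goes straight to its final https URL
RewriteRule ^old-page/?$ https://www.example.com/new-page/ [R=301,L]

# Generic catch-all: everything else on http -> same path on https
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Order matters here: because the specific rule carries the `L` flag and runs first, the catch-all never fires for that path, so only one 301 is issued.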

  • Hey everyone! Thanks in advance for any help on this. I work for a SaaS company that has all of our customer apps and assets on our company domain. This has resulted in a lot of backlinks pointing to our domain, and a lot of pages indexed as well. I'm working with product to migrate all customers onto a separate domain, but a concern is that we still need to move the customer content to the new domains somehow without passing any of this backlink info. Am I correct in my assumption that if we 301 all of the apps and assets, all of that backlink info stays the same? What would be the best way to do this? Could we 302 everything and then wait, like, 30 days and delete the 302? Would that still fix the problem, or does all of that backlink data "stick" after the 302 is deleted? Any additional thoughts would be extremely helpful!!

    Technical SEO | | rachelmeyer
    0

  • I am working with a business that has set up their Google My Business listings as follows: one main GMB for the store location with its physical address and unique phone, i.e. business name "ABC Lumber"; one GMB account for the business name followed by the city name, i.e. "ABC Lumber Santa Monica", with an address in Santa Monica and a different phone than the store-location GMB account, but pointing to the same website; and another GMB account called "ABC Lumber Redondo Beach" with an address and the same phone as the Santa Monica GMB account, also pointing to the same website. So basically there is a main account, which is the store, and then the sales reps want to create multiple GMB accounts of their own that are service-area-only accounts. Do you see a problem with this?

    Local Listings | | lkbackus
    0

  • Hi, While prospecting for backlink targets, do you filter out links below a specific DA or PA? For instance: if PA < 5, then remove.
    My VA accidentally selected a PA 3 site whose owner eventually gave us a backlink.
    Thanks to that link, traffic jumped 14x!
    Had my filtering process been applied, we would never have gotten that backlink and traffic. What do you guys suggest?

    Link Building | | Janki99
    0

  • Hey everyone, I'm working with a local catering company that does not have a physical address available for use.  Because of privacy concerns, the company is not open to using their home address.  The local competition in the targeted area is fairly strong and established already.  Does anyone have ideas on how I can work around this?

    Local Listings | | a_toohill
    0

  • To cut a long story short, our old web developers, who built us a bespoke site, decided they could no longer offer us support, so we moved our back end to the latest Magento 2 software and moved over to https with a new company. The new setup has been live for 3 weeks. I have checked in Webmaster Tools and it says we have 4 pages indexed, yet if I type in site:https://www.mydomain.com/ we have 6,560 pages indexed. Our robots.txt file looks like this:
    Sitemap: https://www.mydomain.com/sitemap.xml
    Sitemap: https://www.mydomain.com/sitemaps/sitemap_default.xml
    I use Website Auditor and Screaming Frog; Website Auditor returns a 302 for my domain and Screaming Frog returns a 403, which means I cannot scan with either. If I check my domain using an https checking tool, some sites return an error but some return a 200.
    I have spoken to my new developer and he says everything is fine. In Webmaster Tools I can see some redirects from his domain to mine from when the site was in testing mode. I am concerned that something is not right, as I always check my pages on a regular basis. Can anyone shed any light on this? Is everything fine, or am I right to be concerned? Thank you in advance

    Reporting & Analytics | | Palmbourne
    0

  • Let's say for my keyword, if I take the first 20 results on Google, they have an average of 3,000 words per page. My page only has 1,500 words. Does that mean the frequency of my related keywords should be half of what the others have? And if I am over their frequency, is that a problem too? Thank you,

    Intermediate & Advanced SEO | | seoanalytics
    0
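The arithmetic behind the question above is easier to see in code. The usual framing is keyword *density* (occurrences divided by total words), so a page half as long needs roughly half the raw count to keep the same density. A small sketch with made-up numbers:

```python
# Sketch: compare keyword *density* rather than raw counts, so pages of
# different lengths can be compared. All numbers here are illustrative.
def keyword_density(keyword_count: int, total_words: int) -> float:
    """Keyword occurrences as a fraction of total words."""
    return keyword_count / total_words

# Competitors: ~3000 words with the phrase appearing ~30 times (1% density)
competitor_density = keyword_density(30, 3000)

# Matching that density on a 1500-word page needs ~15 occurrences:
# half the raw count, but the same density.
my_target_count = round(competitor_density * 1500)

print(competitor_density, my_target_count)  # 0.01 15
```

So "half the frequency" is right for raw counts but not for density; going far over the competitors' density is what keyword-stuffing warnings are about.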

  • Hi, I monitor and track a few top sites that are industry leaders. I saw a drop of 1 to 3 DA points across this month. Has anyone else experienced this? The fact of the matter is there has been rare or no change to the sites, but the Moz backlink data experienced a major drop: more than 100K backlinks were dropped for them in yesterday's report as compared to a 20-day-old report. Has anyone at Moz or anyone in the community experienced the same?

    Link Explorer | | mr.ankit.manocha
    0

  • I understand how to do basic keyword research and how to review a competitor's code (meta tags, etc.). But what is the best method for finding out what a competitor's target keyword is for a particular web page? For example, for the URL www.example.com/about-us, what is the best method for finding out what their target keyword is? I have been using a keyword density tool to run the URLs and find the keyword or phrase that appears most often, but I would think there is a better way?

    Keyword Research | | rx300
    0
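The density-tool approach described above is easy to reproduce with a few lines of standard-library Python; the stopword list and sample text below are made up for illustration, and in practice you would feed in the stripped text of the competitor's page:

```python
from collections import Counter
import re

# Minimal stopword list for illustration; a real one would be much longer.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "for", "on", "we", "our"}

def top_terms(text: str, n: int = 5):
    """Return the n most frequent non-stopword tokens in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return counts.most_common(n)

sample = "We install vinyl siding. Our vinyl siding crews install siding fast."
print(top_terms(sample, 3))  # [('siding', 3), ('install', 2), ('vinyl', 2)]
```

Raw frequency is only a proxy, though; the title tag, H1, and URL slug usually reveal the intended target more reliably than body-copy counts.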

  • Hi there: We have a client whose website we built in WP, using Yoast Pro as our SEO plugin. I was reading some reports (actually coming out of SEMrush, but we use Moz as well) and I am getting really varying results in the description area of the SERPs. Even though I'm seeing the copy we wrote in Yoast in the description tag code, the SERP is showing an excerpt from the copywriting on the site. What's even weirder is that SEMrush is pulling an entirely DIFFERENT description. I'm obviously missing out on the finer points of description tags, as Google clearly does not always choose to feature what is actually written in the description tag itself. Can someone explain to me what might be going on here? Thanks in advance,

    Intermediate & Advanced SEO | | Daaveey
    1

  • As I'm looking through my Moz Pro reports on Pages with Duplicate Content, almost all the results are from the automatically created "tag" pages from my blog, i.e., takeflyte.com/flyte/tags/kickapps. Should I worry about this? Does it have a negative impact on my search visibility? Should I be using canonical tags on these pages (and if so, pointing them where, if there are multiple pages that use the same tags)? How would you recommend handling this issue?

    Product Support | | flyte
    0

  • Hi, I want to shorten some URLs that Moz is reporting as too long, if possible. They are all the same page but different categories: the page advertises jobs, but the client requires various links to types of jobs on the menu. So the menu will have: Job type 1
    Job type 2
    Job type 3
    I'm getting the links by going to the page, clicking a dropdown to filter the job type, then copying the resulting URL from the address bar. But these are really long and cumbersome. I presume if I used a URL shortener, this would count as redirects and also not be good for SEO. Any thoughts? Thanks
    Ann

    Intermediate & Advanced SEO | | Ann64
    0

  • Hello, Is there a correlation between keyword difficulty and the time it takes to rank? In other words, let's say I try to rank for the keyword "seo" and it is going to take 2 years to rank first, whereas if I go for "best seo tools in 2018" it takes just 2 weeks? Thank you,

    Intermediate & Advanced SEO | | seoanalytics
    0

  • We have a website http://www.example.co.uk/ that leads to another domain (https://online.example.co.uk/) when a user clicks what, in this case, let us assume to be an Apply Now button on my website page. We are getting metadata issues in crawler errors from the https://online.example.co.uk/ domain, as we are not targeting any meta content on that particular domain. So we are looking to block this domain from getting indexed to clear these errors. Does it affect the SERPs of this domain (https://online.example.co.uk/) if we use a noindex tag on it?

    Technical SEO | | Prasadgotteti
    0
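For reference, if the subdomain above serves from its own vhost, one way to apply noindex site-wide without editing every page is an HTTP header. A hedged Apache sketch (assuming mod_headers is available; scope it to the online subdomain's vhost only):

```apache
# Sketch for the online.example.co.uk vhost only: send a noindex header
# on every response so crawlers drop these pages from the index.
# Note: a noindexed page will eventually disappear from that domain's SERPs.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

The `X-Robots-Tag` header is treated by Google the same way as a `<meta name="robots">` tag, which is why it works for whole-host rules.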

  • Hello, Does AMP have ranking benefits? Should I AMP just my posts, or all the pages of my website (product pages, homepage, etc.)? Thank you,

    Intermediate & Advanced SEO | | seoanalytics
    0

  • Hi Guys, Am I going slightly mad, or why would you want to have a redirect and a canonical pointing back to the same page? For instance, https://handletrade.co.uk/pull-handles/pull-handles-zcs-range/d'-pull-handle-19mm-dia.-19-x-150mm-ss/?tag=Dia.&page=2 has in its source code: <link href="https://handletrade.co.uk/d'-pull-handle-19mm-dia.-19-x-150mm-ss/" rel="canonical" /> Perfect! Exactly what it is intended to do. But then that canonical URL is 301 redirected to https://handletrade.co.uk/pull-handles/pull-handles-zcs-range/d'-pull-handle-19mm-dia.-19-x-150mm-ss/. The site is built in OpenCart and I think it's the SEO plugin that needs tweaking. Could this cause poor SERP visibility? This is happening across the whole site. Surely the canonical should just point to the proper page, and then there is no need for an additional bounce.

    Technical SEO | | nezona
    1

  • Hi everyone, I've been having an issue with a severe drop in rankings (#2 to #36ish). All of my technicals seem to be OK; however, I seem to be getting my images hotlinked (which I have killed in nginx) from these spam-like pages that pull and link to an image on my site, then link again with a "." for the anchor. Even more strange is that these pages are titled and marked up with the same titles and target keywords as my site. For example, I just got a link yesterday from a site leadoptimiser - d o tt- me which is IMO a junk site. The title of the page is the same as one of my pages, and the page is pulling in images relevant to my page; the image sources are repos EXCEPT for 2 images from my site, which are hotlinked to my page's images, and then an additional <a>.</a> link is placed to my website. I have gotten over 1500 of these links in the past few months from all different domains, but the website (layout etc.) is always the same. I have been slowly disavowing some of them, but do not want to screw anything up in case these links are already being discounted by G as spam and not affecting my rank. The community seems to be really split on the necessity of disavowing links like these. Because of these links, according to Ahrefs, my backlink profile is 38% anchor text of ".". Everything else checks out in my own review as well as in Moz tools and Ahrefs, with very high quality scores etc. Webmasters is fine, indexing is fine, PageSpeed Insights is in the 90s, SSL is A+. I've never had to deal with what seems to be an attack of this size. Thanks.

    White Hat / Black Hat SEO | | plahpoy
    1
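The nginx hotlink blocking mentioned above typically looks something like the following sketch (example.com stands in for the real domain; extensions and referrer list would need adjusting):

```nginx
# Serve images only when the referrer is empty, blocked, or our own site;
# everything else (e.g. spam pages hotlinking images) gets a 403.
location ~* \.(jpe?g|png|gif|webp)$ {
    valid_referers none blocked example.com *.example.com;
    if ($invalid_referer) {
        return 403;
    }
}
```

`none` keeps direct visits and image-search previews working; dropping it would also block requests that send no Referer header at all.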

  • Hi Guys, On a WordPress site we are working with, there are currently multiple different versions of each URL per page. See screenshot: https://d.pr/i/ZC8bZt Data example: https://tinyurl.com/y8suzh6c Right now the non-https versions redirect to the equivalent https versions, while some of the https versions don't redirect and are status code 200. We want all of them to redirect to the highlighted blue version (row a). Is this easily doable in WordPress, and how would one go about it? Cheers.

    Intermediate & Advanced SEO | | wickstar
    1

  • Hi all, We are planning to de-index and redirect sub domain A to sub domain B. Consequently we now need to de-index sub domain B as well. What happens now to the link juice or PageRank they gained from hundreds of thousands of backlinks? Will there be any ranking impact on the main domain? The backlinks of these sub domains are not very relevant to the main domain's content. Thanks

    Algorithm Updates | | vtmoz
    1

  • This question has come up a few times with some of our clients and I've spent some time researching it, but I can't find an answer online, so hopefully someone at Moz has this data available to them with all the data they collect. The data points that would be needed to answer this question, off the top of my head: the increase in the number of Google searches in the US YoY, and the decrease in CTR for the organic "10 blue links" that take a searcher off of Google YoY, as Google continues to keep more searchers on Google.com with rich snippets, increased AdWords prominence, AdWords extensions, etc. I'm sure this varies greatly per industry, but an average for all industries is all that is needed to answer this client question. Many thanks in advance, and I've included a video which hopefully helps to better explain the search "plus/minus" that we can expect to see as SEOs in 2018.

    Search Behavior | | WebpageFX
    1

  • I work for a company that owns and manages apartments. I would like to know which of two website design decisions is better from an SEO perspective: (1) one single website that contains pages for all of our apartments (example: http://www.equityapartments.com), or (2) separate websites for each apartment plus one main corporate website that allows users to search through our apartments (example: https://www.greystar.com). I have spoken to three marketing companies, and all have recommended option 2. The best reason I have heard is that the separate apartment sites are then all more likely to rank; they say Google doesn't want to rank multiple pages from the same website. But Google would still know that I have an administrative relationship between the sites (source: https://moz.com/blog/how-google-knows-what-sites-you-control-and-why-it-matters-whiteboard-friday), so I don't know why they would treat multiple sites differently than one site. For what it's worth, it seems the majority of apartment management companies use a different website for each property. So should we have a separate website for each of our properties?

    Local Listings | | mikleing
    1

  • Hi, You said the trial is free for a month, but 1 USD was deducted from my credit card. Please clarify. Thanks & Regards, Keshav

    Getting Started | | DilSeDeshi
    0

  • Hello all! I'm working on a site that features a service, marketed to community leaders, that allows the citizens of a community to log 311-type issues such as potholes, broken streetlights, etc. The "marketing" front of the site is 10-12 pages of content to be optimized for the community-leader searchers. However, as you can imagine, there are thousands and thousands of pages of one- or two-line complaints such as, "There is a pothole on Main St. and 3rd." These complaint pages are not about the service, and I'm thinking not helpful to my end goal of gaining awareness of the service through search among community leaders. Community leaders are searching for "311 request service", not "potholes on main street". Should all of these "complaint" pages be NOINDEX'd? What if there are a number of quality links pointing to the complaint pages? Do I have to worry about losing Domain Authority if I do NOINDEX them? Thanks for any input. Ken

    Intermediate & Advanced SEO | | KenSchaefer
    0

  • Hi folks, My homepage has 3 identical H1 tags due to the fact that I have had to create individual hero images (with headings) for desktop, tablet and mobile. I couldn't get my theme to display the layout in exactly the way I wanted on each device without doing a specific hero image and tag for each device type. Does this have a major impact on my SEO? Thanks,
    Mike.

    On-Page Optimization | | Veevlimike
    0

  • Hi Mozers, How come Moz reports just six 404 errors, whereas Google Search Console reports 250 and Screaming Frog only reports a dozen? It seems to me that these results are all over the place. Shouldn't these reports be more consistent? I do understand that Search Console includes historical data and that URLs or issues need to be "marked as fixed" in order for them to go away; however, even if I do this, Google ends up reporting far more errors than anything else. Do 404s reported by Moz and Screaming Frog NOT include external links? It seems to me that this could be partially responsible for the discrepancy. Also, is there a way to efficiently track the source of the 404s besides clicking on "Linked From" within Search Console 250 times? I was looking for something like this in Moz or SF but no luck. Any help is appreciated. Thanks a bunch!

    Moz Pro | | EricFish
    0

  • Newbie, so please forgive!! OK, so I'm doing my 1st site optimization. It is reporting errors from pages that were deleted a couple of days ago. And I JUST signed up today. Where is this info coming from? Thanks, Billy

    Link Explorer | | NewSEOguy
    0

  • Does Moz have a citation report?

    Getting Started | | Vbleuwire
    0

  • It looks like ExpressUpdate.com is down, and while I know nobody actually uses Citysearch anymore, I was wondering if anyone else has a solution for updating NAP data there.

    Local Listings | | r1200gsa
    0

  • Hello, During the site crawl in the week of Jan 29th, my site experienced a somewhat significant dip in search visibility. While all my dashboards for the brand experienced a dip, it was most noticeable in a dashboard focused on a specific region. Overall, I didn't make any major changes to the site, nor did the competitors I'm tracking. FWIW, the competitors I'm tracking stayed steady and didn't move up or down in visibility during my dip. I stay fairly up to date with SEO best practices, and I don't believe there were any algorithm changes during this time. Now, there is one possible reason for this dip. Sometime before the week of Jan 29th, we began a new web promotion. To track this promotion, I added some minor JavaScript in the header. This additional JavaScript did slow the site slightly, but only to a rating of "Average" based on PageSpeed Insights. Does anyone have any insights as to why my site may have suffered a dip during the week of Jan 29th? Thanks.

    Local Listings | | Dions
    0

  • For the past few months my weekly site crawl has been inconsistent. One week works fine: it crawls all of my 500 or so pages. The following week it only crawls 1 page (http://mydomain.com) and nothing else. A few weekly scans go by and the crawl is back up to the 500 or so pages. I went ahead and created several campaigns with duplicate settings and crawled the site. Most times, but not all, the new campaign's crawl works fine, crawling all pages. But within a week or two the weekly crawl will fail again (crawling 1 page). Currently I have four campaigns, all with the same settings, running weekly crawls. Two campaigns crawled the 500 pages and two crawled only the single page. Any help will be greatly appreciated.

    Moz Bar | | dmaude
    0

  • I know what information I need to pull... I know I need APIs to do it... I just don't know how or where to pull it. I have tools like Screaming Frog, Scrapebox, SEMrush, Moz, Majestic, etc. I need to find out how to type in a query and pull the top 10 rankings' specs like DA, PA, Root Domains, Word Count, Trust Flow, etc. Here is a screenshot of info I manually pulled: https://screencast.com/t/H1q5XccR8 (I can't hyperlink it... it's giving me an error). How do I auto-pull this info?

    API | | LindsayE
    0
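The workflow asked about above has two halves: fetching metrics from each tool's API (e.g. the Moz Links API, Majestic API) and merging them into one table per query. The fetching part is tool-specific and needs API keys, so the sketch below stubs it with made-up data and shows only the merge step; `build_report` and all URLs/metrics here are hypothetical:

```python
# Sketch of the aggregation step only: given the top-10 URLs for a query
# and per-URL metrics already fetched from some API, build one table.
# In a real script, the serp list and metrics dict would come from your
# tools' APIs; here they are stubbed with illustrative values.
def build_report(serp_urls, metrics_by_url):
    rows = []
    for rank, url in enumerate(serp_urls, start=1):
        m = metrics_by_url.get(url, {})
        rows.append({"rank": rank, "url": url,
                     "da": m.get("da"), "pa": m.get("pa"),
                     "word_count": m.get("word_count")})
    return rows

serp = ["https://a.example/", "https://b.example/"]
metrics = {"https://a.example/": {"da": 55, "pa": 48, "word_count": 2100},
           "https://b.example/": {"da": 31, "pa": 40, "word_count": 900}}

report = build_report(serp, metrics)
print(report[0]["rank"], report[0]["da"])  # 1 55
```

From there, writing the rows out with `csv.DictWriter` reproduces the spreadsheet in the screenshot.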

  • Hey Moz Family! I'm looking for a screen-share platform, like GoToMeeting, for our agents. However, we need one that does not require the other party to download anything. I know I came across one once... they sent me a link that prompted for a code and that was it. We work with seniors; half our problem is getting them to the right place on their computer. Having them download something to see their screen will be too hard. Any recommendations? Thanks!

    SEO Learn Center | | LindsayE
    1

  • Please check the following 2 cases: https://www.screencast.com/t/fqOJlrfHuNto https://www.screencast.com/t/G6FCaKlH In the above cases, why is the original content being outranked? Can someone tell me how to stop this copying of my website's content by another site?

    Search Behavior | | Janki99
    0

  • Hello, Is it still devalued by Google? It seems that on mobile it isn't anymore, but what about desktop? Thank you,

    Web Design | | seoanalytics
    0

  • Hi, I currently have my website built with internal links to boost my category pages (the ones that are the most important, with the most traffic). Is that the way to go nowadays, or doesn't it matter which pages you boost? I believe what I did is some sort of "PageRank sculpting", but is it still useful? Or should I link naturally across my website, even though some pages will get a boost and some won't? The links I have are at the bottom of my content (not in the footer) but at the bottom of the page, next to each other. Do they count, or is it really bad to have them that way? I have a blog, and I was wondering: when I link from my blog page about, for example, "Barolo" to my page about "Piedmont", should I also link from my page about "Piedmont" back to my Barolo page, or should the link only go one way, from my Barolo blog page to my Piedmont page, to boost Piedmont? Thank you,

    Link Building | | seoanalytics
    0

  • Hello, With the new mobile-first indexing, do search engines (Google) give as much value to content in tabs, which is not visible at first, as to content which is visible on the page? Thank you,

    Intermediate & Advanced SEO | | seoanalytics
    0

  • My designer and I have been having an argument: we have a blog with short, 400-word posts. They have an H1 with nice keywords and a catchy title, and then a few subheadings. I don't like making the subheadings H2, because the font looks way too large in WordPress, so my designer wants to make them all H4s so the font is a nicer size. Here's my problem with that, and why I usually just bold the subheadings: is it really bad to put a bunch of H4s right under an H1, with no H2s or H3s in between? I'm reading different arguments on the internet about this and gladly welcome more debate and/or case studies. Thank you!

    Intermediate & Advanced SEO | | genevieveagar
    0
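One middle ground for the H2-vs-H4 argument above: keep the semantic H2s so the outline stays H1 → H2, and shrink them with CSS so they render at whatever size the design needs. A sketch (the class name is illustrative and would depend on the theme):

```css
/* Keep H2 semantics for the document outline;
   just render post subheadings at a smaller size. */
.post-content h2 {
    font-size: 1.1rem;   /* tune to taste */
    font-weight: 700;
    margin: 1em 0 0.4em;
}
```

This separates structure (heading levels) from presentation (font size), which is what CSS is for; skipping from H1 to H4 changes the structure to dodge a styling issue.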

  • Hi all, Wondering the difference between the Inbound Links stats for https://www.example.com versus only using example.com. When I type in the https:// version, I see 3 Spam Flags - for the example.com version, there are none. There are also spammy links pointing to the https version as opposed to the non-https version when I use this tool: https://moz.com/researchtools/ose/ Anyone know the difference and which should be regarded when fixing spam issues?

    Link Explorer | | netamorphosis
    0

  • Hey everyone! I wanted to get the other Mozzers opinions on this. With Google announcing a new Speed Update that will affect mobile rankings, I wanted to ask: How will AMP pages play into this? Let me know what you think! 
    Thanks!

    Web Design | | TaylorRHawkins
    2

  • I know a lot of the Moz warnings can be ignored; however, I'm trying to figure out if this one should be added to that list. My store has URLs set up like this for categories: https://www.mysite.com/sweaters https://www.mysite.com/sweaters/page/2 The meta title is "Sweaters" for both pages. Is that bad practice? I don't think I can automatically change the meta title to "Sweaters Page 2", or even want to. Or should I do that? Or just ignore these types of warnings?

    Moz Pro | | IcarusSEO
    0

  • As far as I know, there is no way to truly FORCE a URL to be removed from Google's index. We have a page that is being stubborn. Even after it was 301 redirected to an internal secure page months ago and a noindex tag was placed on it in the backend, it still remains in the Google index. I also submitted a request through the remove outdated content tool https://www.google.com/webmasters/tools/removals and it said the content has been removed. My understanding though is that this only updates the cache to be consistent with the current index. So if it's still in the index, this will not remove it. Just asking for confirmation - is there truly any way to force a URL out of the index? Or to even suggest more strongly that it be removed? It's the first listing in this search https://www.google.com/search?q=hcahranswers&rlz=1C1GGRV_enUS753US755&oq=hcahr&aqs=chrome.0.69i59j69i57j69i60j0l3.1700j0j8&sourceid=chrome&ie=UTF-8

    Intermediate & Advanced SEO | | MJTrevens
    0

  • Hello good people of the Moz community, I am looking to do a mass edit of URLs on content pages within our sites. The way these were initially set up, a few years ago, was to be made unique by having the date in the URL, which can make evergreen content now seem dated. The new URLs would follow a better folder-path-style naming convention and would be way better URLs overall. Some examples of the old URLs:
    https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html
    https://www.inlineskates.com/Buying-Guide-for-Kids-Inline-Skates/buying-guide-11-13-2012,default,pg.html
    https://www.inlineskates.com/Buying-Guide-for-Inline-Hockey-Skates/buying-guide-9-3-2012,default,pg.html
    https://www.inlineskates.com/Buying-Guide-for-Aggressive-Skates/buying-guide-7-19-2012,default,pg.html
    The new URLs would look like this, which would be a great improvement:
    https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
    https://www.inlineskates.com/Learn/Buying-Guide-for-Kids-Inline-Skates,default,pg.html
    https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Hockey-Skates,default,pg.html
    https://www.inlineskates.com/Learn/Buying-Guide-for-Aggressive-Skates,default,pg.html
    My worry is that we rank fairly well organically for some of the content and don't want to anger the Google machine. The process would be to edit the URLs to the new layout, then set up the redirects and push live. Is there a great SEO risk in doing this?
    Is there a way to do a mass "Fetch as Googlebot" to reindex these if I do, say, 50 a day? I only see the ability to do 1 URL at a time in the Webmaster backend.
    Is there anything else I am missing? I believe this change would be good in the long run, but I do not want to take a huge hit initially by doing something incorrectly. This would be done on 5 to a couple hundred links across the various sites I manage. Thanks in advance,
    Chris Gorski

    Intermediate & Advanced SEO | | kirin44355
    0
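The redirect step described above can be scripted rather than hand-written. A sketch that turns old/new URL pairs into one-hop 301 rules (the pair shown is taken from the question; the output assumes Apache's mod_alias `Redirect` directive and is illustrative, and in practice the pairs would be loaded from a CSV):

```python
from urllib.parse import urlparse

# (old, new) URL pairs; one pair from the question as an example.
pairs = [
    ("https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html",
     "https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html"),
]

def redirect_line(old: str, new: str) -> str:
    """Emit a one-hop 301 from the old path to the full new URL."""
    return f"Redirect 301 {urlparse(old).path} {new}"

for old, new in pairs:
    print(redirect_line(old, new))
```

Generating the rules from one mapping file also gives you the exact URL list to resubmit in a sitemap, which is the practical alternative to fetching 50 URLs a day by hand.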

  • Hi, I am working on a site that is experiencing indexation problems. To give you an idea, the website should be www.example.com; however, Google seems to index www.example.co.uk as well. It doesn't seem to honour the 301 redirect that is on the .co.uk site. This is causing quite a few reporting and tracking issues. This happened the first time in November 2016, and an issue was identified in the DDoS protection which meant we had to point www.example.co.uk to the same DNS as www.example.com. This was implemented and made no difference. I cleaned up the htaccess file, and this made no difference either. In June 2017, Google finally indexed the correct URL, but I can't be sure what changed it. I have now migrated the site onto https, and www.example.co.uk has been reindexed in Google alongside www.example.com. I have been advised that the http version needs to be removed from the DDoS protection, which is in motion. I have also redirected http://www.example.co.uk straight to https://www.example.com to prevent chained redirects. I can't block the site via robots.txt unless I take the redirects off, which could mean that I lose my rankings. I should also mention that I haven't actually lost any rankings; Google has just replaced some URLs with .co.uk while others have remained the same. Could you please advise what further steps I should take to ensure the correct URLs are indexed in Google?

    Technical SEO | | Niki_1
    0

  • Hi everyone! I have a website (ponturipariuri.pro), a few years old. The PA and DA used to be around 25-35, but now it shows 1 for both PA and DA. I just discovered this and I do not understand why. Take a look at the image I attached here. There was no spam on my website. Google still shows my website on the first page for many keywords.

    Link Explorer | | adi2305
    0


