Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Moz Q&A is closed.

After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Over the past few months we have been chipping away at duplicate content issues. We know this is our biggest issue and is working against us. However, it is due to this client also owning the competitor site: product merchandise and top-level categories are highly similar, and the sites share a server. Our rankings are suffering majorly for this, which we understand. However, as we make changes, and I track and perform test searches, the pages that Google ranks for keywords never seem to match or make sense at all. For example, I search for "solid scrub tops" and it ranks the "print scrub tops" category. Or the "Men Clearance" page is ranking for the keyword "Women Scrub Pants". Or I will search for a specific brand, and it ranks a completely different brand. Has anyone else seen this behavior with duplicate content issues? Or is it an issue with some other penalty? At this point, our only option is to test something and see what impact it has, but it is difficult to do when keywords do not align with content.

    Intermediate & Advanced SEO | | lunavista-comm
    0

  • I have a question for the Moz pros. I have a blog about tattoos and I am using extremely low-competition keywords in all my posts. Some of my keywords are ranked in the top 10 results in Google, but they are stuck; e.g. I've targeted the keyword "lion king tattoos" and it is currently ranking at no. 6. It's been stuck at no. 6 for a long time, and I am not getting any traffic from this keyword either. I am very new to SEO and want some suggestions so that I can improve my rankings and traffic. My blog link is http://tattoosalbum.com - please have a look and help me 🙂 Thanks

    Link Building | | SyedNumanshah
    0

  • For example, I'd like to type in a zipcode and get the highest-ranking websites by DA (or whatever metric the software uses) within a 25-mile radius. Does that type of service exist? I'm looking to build up our local links, but most of the websites have extremely low authority. I'm trying to find some good ones without having to manually check each one. Thanks, Ruben

    Local SEO | | KempRugeLawGroup
    1

  • Hello Moz users:
    I have a question about page and domain authority. Is it OK that sometimes PA is greater than DA? I think that's not right, but it happens on some domains, for example in the attached image.

    Link Explorer | | NachoRetta
    1

  • Howdy folks, Every time we do an index update here at Moz, we get a tremendous number of questions about Domain Authority (DA) and Page Authority (PA) scores fluctuating. Typically, in each index (which we release approximately monthly), many billions of sites will see their scores go up, while others will go down. If your score has gone up or down, there are several potential influencing factors:

    1. You've earned relatively more or fewer links over the course of the last 30-90 days. Remember that, because Mozscape indices take 3-4 weeks to process, the data collected in an index is between ~21-90 days old. Even on the day of release, the newest link data you'll see was crawled ~21 days ago, and it can go as far back as 90 days (the oldest crawlsets we include in processing). If you've seen very recent link growth (or shrinkage), that won't be reflected until we've crawled and processed the next index.

    2. You've earned more links, but the highest-authority sites have grown their link profiles even more. Since Domain and Page Authority are on a 100-point scale, the very top of that scale represents the most link-rich sites and pages, and in nearly every index it's harder and harder to earn those high scores; sites that aren't growing their link profiles substantively will, on average, see PA/DA drops. This is because of the scaling process - if Facebook.com (currently with a DA of 100) grows its link profile massively, that becomes the new DA 100, and it will be harder for other sites that aren't growing quality links as fast to get from 99 to 100, or even from 89 to 90. This is true across the scale of DA/PA, and it makes it critical to measure a site's DA and a page's PA against the competition, not just trended against itself. You could earn loads of great links and still see a DA drop due to this scaling; always compare against similar sites and pages to get the best sense of relative performance, since DA/PA are relative, not absolute, scores.

    3. The links you've earned are from places that we haven't seen correlate well with higher Google rankings. PA/DA are created using a machine-learning algorithm whose training set is search results in Google. Over time, as Google gets pickier about which types of links it counts, and as Mozscape picks up on those changes, PA/DA scores will change to reflect that. Thus, lots of low-quality links, or links from domains that don't seem to influence Google's rankings, are likely not to have a positive effect on PA/DA. On the flip side, you could do no link growth whatsoever and see rising PA/DA scores if the links from the sites/pages you already have appear to be growing in importance in influencing Google's rankings.

    4. We've done a better or worse job crawling sites/pages that have links to you (or don't). Moz is constantly working to improve the shape of our index - choosing which pages to crawl and which to ignore. Our goal is to build the most "Google-shaped" index we can, representative of what Google keeps in their main index and counts as valuable/important links that influence rankings. We make tweaks aimed at this goal each index cycle, but not always perfectly (you can see that in 2015 we crawled a ton more domains, but found that many of those were, in fact, low quality and not valuable, so we stopped). Moz's crawlers can crawl the web extremely fast and efficiently, but our processing time prevents us from building as large an index as we'd like, and as large as our competitors' (you will see more links represented in both Ahrefs and Majestic, two competitors to Mozscape that I recommend). Moz calculates valuable metrics that these others do not (like PA/DA, MozRank, MozTrust, Spam Score, etc.), but these metrics require hundreds of hours of processing, and that time scales linearly with the size of the index, which means we have to stay smaller in order to calculate them. Long term, we are building a new indexing system that can process in real time and scale much larger, but this is a massive undertaking and is still a long way off. In the meantime, as our crawl shape changes to imitate Google, we may miss links that point to a site or page, and/or overindex a section of the web that points to sites/pages, causing fluctuations in link metrics. If you'd like to ensure that a URL will be crawled, you can visit that page with the MozBar or search for it in OSE, and during the next index cycle (or possibly two index cycles, depending on where we are in the process), we'll crawl that page and include it. We've found this does not bias our index, since these requests represent tiny fractions of a percent of the overall index (<0.1% in total).

    My strongest suggestion, if you ever have the concern/question "Why did my PA/DA drop?!", is to always compare against a set of competing sites/pages. If most of your competitors fell as well, it's more likely related to relative scaling or crawl-biasing issues, not to anything you've done. Remember that DA/PA are relative metrics, not absolute! That means you can be improving links and rankings and STILL see a falling DA score, but, due to how DA is scaled, the score in aggregate may be better predictive of Google's rankings. You can also pay attention to our coverage of Google metrics, which we report with each index, and to our correlations with rankings metrics. If these fall, it means Mozscape has gotten less Google-shaped and less representative of what influences rankings. If they rise, it means Mozscape has gotten better. Obviously, our goal is to consistently improve, but we can't be sure that every variation we attempt will have universally positive impacts until we measure them. Thanks for reading through, and if you have any questions, please leave them for us below. I'll do my best to follow up quickly.

    Link Explorer | | randfish
    13

  • Is it bad to have the same H1 & H2 tag on one page? I found a similar question here on the Moz forum, but it didn't exactly answer my question. And will adding "about" to the H2 help, or should we avoid duplicate tags completely? Here is a link to the page in question (which will repeat throughout this site.) Thanks in advance!

    Technical SEO | | Mike.Bean
    0

  • Hi, what is the best tool out there for doing a full brand audit for a company? They've created at least 15 websites and possibly have many domain names, which have been ordered by different people over many years. I need to find the quickest and best tool to do this with. Any suggestions would be great.

    Moz Pro | | Cocoonfxmedia
    0

  • Hello, our domain authority dropped significantly overnight from 37 to 29. We have been building good links from high DA sites and producing regular, good quality content. Anyone able to offer any ideas why? Thanks

    Reporting & Analytics | | ProMOZ123
    1

  • Here's the scenario. We're doing SEO for a national franchise business. We have over 60 location pages on the same domain, that we control. Another agency is doing PPC for the same business, except they're leading people to un-indexable landing pages off domain. Apparently they're also using location extensions for the businesses that have been set up improperly, at least according to the Account Strategists at Google that we work with. We're having a real issue with these businesses ranking in the multi-point markets (where they have multiple locations in a city). See, the client wants all their location landing pages to rank organically for geolocated service queries in those cities (we'll say the query is "fridge repair"). We're trying to tell them that the PPC is having a negative effect on our SEO efforts, even though there shouldn't be any correlation between the two. I still think the PPC should be focused on their on-domain location landing pages (and so does our Google rep), because it shows consistency of brand, etc. I'm getting a lot of pushback from the client and the other agency, of course. They say it shouldn't matter. Has anyone here run into this? Any ammo to offer up to convince the client that having us work at "cross-purposes" is a bad idea? Thanks so much for any advice!

    Local Website Optimization | | Treefrog_SEO
    0

  • We have been seeing some strange things happen in Google local after the most recent update. We used to show up in the maps all the time and have made no major edits or changes to the profile. Now when we search for our services, we show up high in the organic results, and not at all in maps (local listings). We have our profile set up as a service area since we do meet with people and provide services at their location, but we have also checked the option that we serve people at our address. I am wondering if the recent update favors actual storefronts when people are searching for services. Any ideas? Technically all the actual work is performed at our location, and the service we provide at the service-area locations is based upon consultations. If we switched it to an actual storefront listing, could that possibly help? Our profile is fairly strong, and has reviews, a long history of posts, etc. What gives, Google?

    Local Listings | | David-Kley
    1

  • Hi there, when I check the cache of the US website (www.us.allsaints.com), Google returns the UK website. This is also reflected in the US Google search results, where the UK site ranks for our brand name instead of the US site. The homepage has hreflang tags (only on the homepage), and the domains have been pointed correctly to the right territories via Google Webmaster Console. This has also happened before, on 26th July 2015. I was wondering if anyone had any idea why this is happening, or if anyone has experienced the same issue.

    Intermediate & Advanced SEO | | adzhass
    0

  • Hey guys, we have been working for almost 2 months for a client who offers graphic design work. It is a new business, and let's say the business name is ABC Graphic Design. So far all the pages are indexed; we built natural links through local directories, blog posts on relevant niche blogs, and social media. We optimised the content and meta tags like we always do. However, none of the target keywords appear in the first 10 pages. This is quite odd, considering we had a client who was in the same business and we managed to show some progress in the first 2 months. We did some research and noticed that there are 2 other ABC design websites with similar domain names offering the same services. They have nothing to do with my client and they are located overseas. When I search for ABC Graphic Design, the results show the other companies instead of my client. My question is whether having a similar business name would affect the ranking. Obviously the other 2 websites have longer histories and better rankings. Any suggestions?

    Intermediate & Advanced SEO | | owengna
    0

  • Hi, I'm working for a digital marketing agency and we have traffic from different countries. We are planning to make a different website for each country. What is the best SEO practice for choosing the domain, ".xx" or ".com.xx", for Spain, Mexico, Chile, Colombia and Peru?
    I think the ccTLD is always better, for example ".es" better than ".com.es".

    Local Website Optimization | | NachoRetta
    0

  • In order to emulate different locations, I've always done a Google query, then used the "Location" button under "Search Tools" at the top of the SERP to define my preferred location.  It seems to have disappeared in the past few days?  Anyone know where it went, or if it's gone forever?  Thanks!

    Technical SEO | | measurableROI
    0

  • Hey Moz! I’m getting ready to implement URL rewrites on my website to improve site structure/URL readability. More specifically I want to: Improve our website structure by removing redundant directories. Replace underscores with dashes and remove file extensions for our URLs. Please see my example below: Old structure: http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm New structure: https://www.widgets.com/commercial-widgets/small-blue-widget I've read several URL rewriting guides online, all of which seem to provide similar but overall different methods to do this. I'm looking for what's considered best practices to implement these rewrites. From what I understand, the most common method is to implement rewrites in our .htaccess file using mod_rewrite (which will find the old URLs and rewrite them according to the rewrites I implement). One question I can't seem to find a definitive answer to is when I implement the rewrite to remove file extensions/replace underscores with dashes in our URLs, do the webpage file names need to be edited to the new format? From what I understand the webpage file names must remain the same for the rewrites in the .htaccess to work. However, our internal links (including canonical links) must be changed to the new URL format. Can anyone shed light on this? Also, I'm aware that implementing URL rewriting improperly could negatively affect our SERP rankings. If I redirect our old website directory structure to our new structure using this rewrite, are my bases covered in regards to having the proper 301 redirects in place to not affect our rankings negatively? Please offer any advice/reliable guides to handle this properly. Thanks in advance!

    Intermediate & Advanced SEO | | TheDude
    0
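A hedged sketch of the .htaccess approach described in the question above, using its example URLs (hypothetical rules only - a real site would need one tested rule per pattern, ideally tried on a staging copy first):

```apache
RewriteEngine On

# 301 the old underscored .htm URL to the new clean URL, so existing
# links and rankings are consolidated onto one address.
RewriteRule ^widgets/commercial-widgets/small_blue_widget\.htm$ https://www.widgets.com/commercial-widgets/small-blue-widget [R=301,L]

# Internally serve the old file for the new URL; the file on disk keeps
# its original name, only the public URL changes (no redirect here).
RewriteRule ^commercial-widgets/small-blue-widget$ /widgets/commercial-widgets/small_blue_widget.htm [L]
```

This also illustrates the filename question: the physical file names don't have to change, because the second (internal, non-redirecting) rule maps the clean URL back to the existing file. Internal links and canonical tags should still be updated to the new URLs, so crawlers only encounter the 301 when following old external links. Generic underscore-to-dash rules exist, but the reverse mapping (dash back to underscore) is ambiguous, which is why explicit per-page rules, or renaming the files, are safer.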

  • I created a custom map using Google Maps creator and I embedded it on our site. However, when I ran the fetch and render through Search Console, it said it was blocked by our robots.txt file. I read in the Search Console Help section that: "For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot." I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it), does Google automatically block their maps from their own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:
    User-agent: *
    Allow: /maps/api/js?
    Allow: /maps/api/js/DirectionsService.Route
    Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
    Allow: /maps/api/js/ElevationService.GetElevationForLine
    Allow: /maps/api/js/GeocodeService.Search
    Allow: /maps/api/js/KmlOverlayService.GetFeature
    Allow: /maps/api/js/KmlOverlayService.GetOverlays
    Allow: /maps/api/js/LayersService.GetFeature
    Disallow: /
    Any assistance would be greatly appreciated. Thanks, Ruben

    Technical SEO | | KempRugeLawGroup
    1

  • Hi! My UK-based company just recently made the decision to let the US market operate their ecommerce business independently. Initially, both markets were operating off the same domain using sub-directories (i.e. www.brandname.com/en-us/ , www.brandname.com/en-gb/ ). Now that the US team has broken away from the domain, they are using www.brandnameUSA.com while the UK continues to use www.brandname.com/en-gb/. The content is similar across both domains; however, the new US website has been able to consolidate several product variations onto single product pages, where the UK website is using individual product pages for each variation. We have placed a geo-filter on the main domain which is 301 redirecting North American traffic looking for www.brandname.com to www.brandnameUSA.com. However, since the domain change has taken place, product pages from the original domain are now indexing alongside the new US website's product pages in US search results. The UK website wants to be the default destination for all international traffic. My question is: how do we correctly set up hreflang tags across two separate TLDs, and how do we handle a situation where multiple product pages on the "default" domain have been consolidated into one product page on the new USA domain? This is how we are currently handling it: <link rel="alternate" hreflang="en-us" href="https://www.BRANDNAMEUSA.com/All-Variations" /> <link ... href="https://www.BRANDNAMEUSA.com/All-Variations" />

    International SEO | | alexcbrands
    0
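For reference, cross-domain hreflang between two TLDs is ordinarily declared with reciprocal link tags on both pages. A sketch using the US URL from the question (the en-gb URL here is illustrative, not taken from the site):

```html
<!-- Placed identically in the <head> of both the UK and US versions of
     the page. Annotations must be reciprocal: each listed page must
     carry the same set of tags pointing back. -->
<link rel="alternate" hreflang="en-gb" href="https://www.brandname.com/en-gb/all-variations" />
<link rel="alternate" hreflang="en-us" href="https://www.BRANDNAMEUSA.com/All-Variations" />
<!-- UK page as the default for all other international traffic -->
<link rel="alternate" hreflang="x-default" href="https://www.brandname.com/en-gb/all-variations" />
```

On the consolidation question: hreflang pairs must be one-to-one in reverse, so while each of several UK variation pages could point at the single consolidated US page, that US page can only return-reference one UK URL per hreflang value - the other UK variations would not form valid pairs with it.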

  • Hi All, I wondered if anyone has seen any ranking improvements from adding a GTIN (barcode) number to their product pages?

    On-Page Optimization | | Jon-S
    0

  • Hello, I was wondering if hosting on shared, versus VPS, versus dedicated ... matters at all in terms of the rankings of Web sites ... given that all other factors would be exactly equal. I know this is a big question with many variables, but mainly I am wondering if, for example, it is more the risk of resource usage, which may take a site down under too much traffic and therefore make it un-crawlable if that happens at the moment a bot is trying to index the site (factoring out the UX of a downed site). Any and all comments are greatly appreciated! Best regards,
    Mark

    White Hat / Black Hat SEO | | uworlds
    0

  • Hello SEO gurus, we have an issue here: our site (www.xyz.com.au) is returning 200 responses for both www.xyz.com.au and www.xyz.com.au/ (I found this when I ran the crawl test). We have been advised to do a 301 from non-slash to slash (as our other pages are showing up with a slash), and for consistency we decided to go with this, but our devs just couldn't do it. The error is a redirect loop, and this site is a WordPress one. Can anyone help us with this issue? Help is much appreciated.

    Technical SEO | | Pack
    0
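One note on the trailing-slash question above: for the bare domain itself, www.xyz.com.au and www.xyz.com.au/ are the same HTTP request (the browser always asks for the path "/"), so no redirect is possible or needed there, and trying to force one is a common cause of the loop the devs hit. For inner pages, a standard .htaccess sketch (hypothetical - it must sit before WordPress's own rewrite rules and be tested first) is:

```apache
RewriteEngine On

# Append a trailing slash, with a single 301, to any URL that isn't a
# real file and doesn't already end in one.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ http://www.xyz.com.au/$1/ [R=301,L]
```

WordPress also issues its own canonical redirect based on the permalink settings; if those settings disagree with a manual rule like this, each redirects to the other, producing exactly the loop described.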

  • I have a client that wants to apply video object schema to their iframe youtube video. Here is the source code: <iframe src="http://www.youtube.com/embed/clientvideo" width="272" height="202" frameborder="0" allowfullscreen=""></iframe> Is it possible to apply schema markup to this kind of iframe source code?   Our development team was having a hard time with it. Thanks!

    Intermediate & Advanced SEO | | RosemaryB
    0

  • I'm getting good, solid growth in my Google SERPs and Google search traffic now, but I do notice that 70% of my high ranking search results are images and the CTR on those is only 3-4%. All my images are illustrative and highly relevant to my travel blog, but I guess that hardly matters unless they get CTR so people see them in context. Has anyone seen or done any good research on what makes people click through on Google Image Search results? What are the key factors? How do you optimize for click-through? Is it better to watermark your images or overlay label them to increase likelihood of click-through? Thanks, Tony FYI the travel blog in question is www.asiantraveltips.com and a relevant Google search where I rank highly is "songkran 2016 phuket".

    Intermediate & Advanced SEO | | Gavin.Atkinson
    0

  • What is the recommended size for an image on our website? What is the largest image size we can use without being penalised by Google? Thank you.

    Image & Video Optimization | | CostumeD
    0

  • Hi everyone, are there any issues with using Spanish, and other non-English words, as domain names when trying to rank in Google UK? We launched a number of websites a while back but are finding it hard to get much traction in Google UK. We are getting a reasonable number of impressions but cannot seem to get very high in the rankings. All the names are foreign words for their service. Our homeware website, for example, uses the Basque word for furniture as its name. Other than the potential branding issue of having domains people might struggle to spell, are there any SERP issues we would face with these names? Thanks

    International SEO | | Arropa
    0

  • Hi, So right now my vanity URLs have a lot more links than my regular homepage. They 301 redirect to the homepage but I'm thinking of canonicalizing the homepage, as well as the mobile page, to the vanity URL. Currently some of my sites have a vanity URL in a SERP and some do not. This is my way of nudging google to list them all as vanity but thought I would get everyone's opinion first. Thanks!

    White Hat / Black Hat SEO | | mattdinbrooklyn
    1
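A note on mechanics for the vanity-URL question above: a canonical on the homepage pointing at a vanity URL that itself 301s back to the homepage gives Google a circular signal, so the redirect and the canonical would need to agree on a single destination. The tag itself is just one line in the head of the homepage and mobile page (hypothetical URL):

```html
<link rel="canonical" href="https://www.example-vanity.com/" />
```

Whichever URL is chosen as canonical, the 301s should point toward it, not away from it.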

  • Hi friends, I know this is a minor technical change, but we are in an extremely competitive market and I don't want to have any points against us. On our WordPress category pages, i.e. http://www.domain.com/category/%category-title%/, I looked at the code behind the title of the category page, which is "Browsing: %Category Title%". The code is an h2. I look at the posts in the category archive below, and those are also h2's. The theme preview is here, and you can click on Entertainment - Reviews to see exactly what I'm referring to - http://themeforest.net/item/smartmag-responsive-retina-wordpress-magazine/full_screen_preview/6652608 I changed the code for the "Browsing: %Category Title%" to an h1, which I believe is more consistent and standard formatting. 1. Is this a correct technical on-page optimization? 2. Would it be beneficial to remove "Browsing"?

    Web Design | | JustinMurray
    0

  • Howdy there! Two schema-related questions here. Schema markup for a local directory: we have a page that lists information for multiple locations on a single page, as a directory-type listing. Each listing has a link to another page that contains more in-depth information about that location. We have seen markups using Schema LocalBusiness markup for each location listed on the directory page. Examples: http://www.yellowpages.com/metairie-la/gold-buyers http://yellowpages.superpages.com/listings.jsp?CS=L&MCBP=true&C=plumber%2C+dallas+tx Both of these validate using the Google testing tool, but what is strange is that the yellowpages.com example puts the URL of the profile page for a given location as the "name" in the schema for the local business, while superpages.com uses the actual name of the location. Other sites such as Yelp have no markup for a location at all on a directory-type page. We want to stay with schema and are leaning towards the superpages option. Any opinions on the best route to go with this? Schema markup for logo and social profiles vs. website name: if you read the article on schema markup for your logo and social profiles, it recommends/shows using the @type of Organization in the schema markup https://developers.google.com/structured-data/customize/social-profiles If you then click down the left column on that page to "Show your name in search results", it recommends/shows using the @type of WebSite in the schema markup. https://developers.google.com/structured-data/site-name We want to have the markup for the logo, social profiles and website name. Do we just need to repeat the schema for the website name in addition to what we have for the organization (two sets of markup)? Our concern is that in both we are referencing the same home page, and in one case on the page we are saying we are an organization and in another a website. Does this matter? Will Google be OK with the logo and social profile markup if we use the WebSite designation?
Thanks!

    Local Website Optimization | | HeaHea
    0
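On the second question above, the two types can simply coexist as separate JSON-LD statements about the same URL - one does not override the other. A sketch with placeholder names and URLs (not the asker's real site):

```html
<script type="application/ld+json">
[
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Company",
    "url": "https://www.example.com/",
    "logo": "https://www.example.com/logo.png",
    "sameAs": [
      "https://www.facebook.com/examplecompany",
      "https://twitter.com/examplecompany"
    ]
  },
  {
    "@context": "https://schema.org",
    "@type": "WebSite",
    "name": "Example Company",
    "url": "https://www.example.com/"
  }
]
</script>
```

Here both objects are emitted in one script tag as a JSON array; two separate script tags work equally well.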

  • I read all the time about how directories carry very little weight in SEO anymore, but in my field, a lot of our competitors are propped up by paying for "profiles", aka links, from places like Martindale-Hubbell, Super Lawyers, FindLaw, Nolo, Avvo, etc. (which are essentially directories, IMO), yet all those sites have very high DAs of 80 and above. So, are links from these sites worth it? I know that's a vague question, but if Moz's algo seems to rank them so highly, I'm guessing that's reasonably close to what Google thinks as well... maybe? Thanks for any insight, Ruben

    Algorithm Updates | | KempRugeLawGroup
    0

  • Hello all, yesterday when I was checking the search term "canvas prints", I noticed that one of the advertisers, EasyCanvasPrints.com, is showing an incorrect price. In the search results they are showing an incorrect price of $7.46 for the PLA (product listing ad). Go to the links below: https://www.google.com/search?hl=en&site=webhp&tbs=vw:l,mr:1,seller:8521978&tbm=shop&q=canvas+prints&sa=X&ved=0CJkGELMrahUKEwjmj-PSgfzIAhUHBY4KHc0VBBQ https://www.google.com/?gws_rd=ssl#q=canvas+prints&tbm=shop https://www.google.com/?gws_rd=ssl#q=canvas+prints&start=0 But when we click through, I cannot find this $7.46 price: http://www.easycanvasprints.com/single-canvas?singlecanvas=1&height=19&width=14&pcode=5345334C6B74647246774137536C786768544B5458776C35644F747855796D39&utm_source=google_base&utm_medium=data_feed I have filed various claims with Google support about this misleading price for Easy Canvas Prints, but they are not taking any action. Guys, can you please help me work out how to get Google to take action against this advertiser? Waiting for your reply. Regards, Dinesh

    Paid Search Marketing | | CommercePundit
    0

  • Hello, I would love it if anyone could advise whether to use VPS, shared or dedicated hosting to host 50 WordPress sites and around 50 other one page flat HTML sites ... with everything being served via CloudFlare. Of course my main concerns are page load times, security etc. I know it's a big question, but I would be very appreciative of the thoughts people have, and any recommendations. Kind regards,
    Mark

    Local Listings | | uworlds
    0

  • Hey there, I am doing a website audit at the moment. I've noticed substantial differences between the number of pages indexed (Search Console), the number of pages in the sitemap, and the number I am getting when I crawl the site with Screaming Frog (see below). Would those discrepancies concern you? The website and its rankings seem fine otherwise.
    Total indexed: 2,360 (Search Console)
    About 2,920 results (Google search "site:example.com")
    Sitemap: 1,229 URLs
    Screaming Frog Spider: 1,352 URLs
    Cheers,
    Jochen

    Intermediate & Advanced SEO | | Online-Marketing-Guy
    0
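One way to chase down discrepancies like the ones above is to diff the URL sets directly, e.g. the sitemap's URLs against a crawler's export. A minimal sketch (toy data stands in for the real sitemap and crawl list):

```python
import xml.etree.ElementTree as ET

# Toy stand-ins for the real inputs: the sitemap XML and the set of
# URLs a crawler (e.g. a Screaming Frog export) actually found.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

crawled = {
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog/post-1",  # found by crawler, not in sitemap
}

# Parse the sitemap, honouring its XML namespace.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {
    loc.text.strip()
    for loc in ET.fromstring(SITEMAP_XML).findall(".//sm:loc", ns)
}

# URLs each source knows about that the other doesn't: these two lists
# are where indexation discrepancies usually hide.
missing_from_sitemap = sorted(crawled - sitemap_urls)
orphaned_in_sitemap = sorted(sitemap_urls - crawled)
print(missing_from_sitemap)  # -> ['https://example.com/blog/post-1']
```

Running the same diff against the Search Console index coverage export would show which of the surplus indexed URLs are parameterised or duplicate variants.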

  • Hi, I've got your SEO Suite Ultimate installed on my site (www.customlogocases.com). I've got a relatively new Magento site (around 1 year old). We have recently been doing some PR/SEO for the category pages, for example /custom-ipad-cases/. But when I search on Google, it seems that Google has indexed /custom-ipad-cases/?limit=all. This /?limit=all page is one without any links, and only has a PA of 1, whereas the standard /custom-ipad-cases/ without the /? query has a much higher PA of 20, and a couple of links pointing towards it. So I would want that page to be the one that Google indexes; and by the same logic, that page really should be able to achieve higher rankings than the /?limit=all page. Is my thinking here correct? Should I disallow all the /? URLs now, even though those are the ones that are indexed and the others currently are not? I'd be happy to take the hit while Google figures it out, because the higher-PA pages are what I am ultimately getting links to... Thoughts?

    Intermediate & Advanced SEO | | RobAus
    0
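For the Magento question above, a common pattern is to point the parameterised variant at the clean category URL with a canonical tag rather than (or before) blocking it in robots.txt - a URL blocked from crawling can remain indexed, while a canonical consolidates signals onto the preferred page. A sketch using the category from the question:

```html
<!-- Emitted on /custom-ipad-cases/?limit=all (and any other ?limit=
     variant), pointing at the clean category URL. -->
<link rel="canonical" href="http://www.customlogocases.com/custom-ipad-cases/" />
```

With the canonical in place, the ?limit=all page's PA and indexation should fold into the clean URL over subsequent crawls, without the interim "hit" a robots.txt disallow can cause.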
