Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Moz Q&A is closed.

After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.

Category: Moz Pro

Discuss the Moz Pro tools with other users.

Subcategories

  • Chat keyword research strategy and how Keyword Explorer helps you do your best work.

  • Cover all things links and the industry-leading link data discoverable in Link Explorer.


  • When I crawl my site as a root domain, I get more errors in my campaign than when I set my site up as a subdomain. Which one is the correct way: root domain or subdomain? My site is www.aa-rental.com

    | tanveer1
    0

  • I've noticed that for some country top level domains (TLDs) the domain authority returned by Open Site Explorer is based on the domain that has been registered within the TLD. For example, domainname.co.uk provides a domain authority specific to the domain. However, for some other country top level domains this does not appear to be the case. Examples I have found include: domainname.co.nr, domainname.co.pt, domainname.co.ee. For these top level domains the domain authority seems to be the same for every domain name, seemingly implying the domain authority is for the top level domain itself rather than for the domain. Is this a common situation for many country top level domains, so that what I am seeing here is the tip of a large iceberg, or does this situation apply only to a very isolated set of country top level domains?

    | MichaelCorfman
    0

  • Hi all, I have just joined on the trial offer. I'm not sure if I can afford the monthly payments, but I'm hoping SEOmoz will show me that I also cannot afford to be without it! I'm in the process of learning this site and flicking through each section to see what things do. However, when I enter my URL into Site Explorer I get the following message: "No Data Available for this URL". My site should be crawlable, so how do I get to see data for my site(s)? I won't post my URL here, as the site has a slightly adult theme.
    Could anyone also confirm whether I can post "slightly adult" sites? Best regards,
    Jon

    | jonny512379
    0

  • I noticed that my ranking report puts me at 33 in Google for one term, but when I use the Rank Tracker under Research Tools it places me at 15.

    | trainsimple
    0

  • I'm getting this message "We were unable to grade that page. We received a response code of 403. URL content not parseable" when using the On-Page Report Card.  Does anyone know how to go about fixing this?  I feel like I've tried everything.

    | Sean_McDonnell
    0

  • How long does it take to get approved for the affiliate program? It's been 3 days so just curious.

    | MyAllenMedia
    1

  • I'm analyzing a site and the page authority is the exact same for every page in the site. How can this be since the page authority is supposed to be unique to each page?

    | azjayhawk
    0

  • What are the IP addresses for the SEO web crawler? There is a firewall on my client's website before it goes live. I would like to crawl the site before it goes live, but need to provide the web crawler's IP addresses. Thank you for your time.

    | sfchronicle
    1

  • What tools do you use for conducting a site audit? I need to do an audit on a site, and the SEOmoz web crawler and on-page optimization will take days if not a full week to return any results. In the past I've used other tools that I could run on the fly, and they would return broken links, missing h-tags, keyword density, server information and more. Curious as to what you all use and what you may recommend using in conjunction with the Moz tools.

    | anthonytjm
    0

  • I just ran a SERP/keyword difficulty report for a keyword I want one of my pages to rank for. I also just completed the on-page optimization, and now I am going to start building links. I would like to estimate how many linking root domains I need to outrank one of my competitors. These are the Moz data:
    1. My page:
    Page Linking Root Domains: 0
    Root Domain Linking Root Domains: 151
    2. Competitor:
    Page Linking Root Domains: 1
    Root Domain Linking Root Domains: 5,786
    I don't really know which metric (page or domain LRD) to rely on in order to make an estimation, and I would be glad for some help! To simplify the problem, assume that all other factors (code, on-page keyword use, social, etc.) are equal for both sites. Can I just get 2 LRDs to that page in order to likely outrank my competitor, or do I need around 5,000 more links pointing to my site? I think an answer to this question could help a lot of users here, since I have seen similar questions/difficulties regarding the use of page LRD vs. root domain LRD. P.S. None of the pages of my website currently rank in the top 100 for that keyword.

    | He_Jo
    2

  • Is there a way to get a spreadsheet of the pages indexed for a certain domain in Google and Bing? i.e. I search Google for site:www.domain.com and I want to export a .csv file of all those domains/pages. Cheers

    | JohnW-UK
    0

  • Open Site Explorer shows 16,700 backlinks, but as I tried various ways, including an advanced report with no filter, I am getting only 1,000+ links. How can I download all of my 16,700 backlinks?

    | KalpeshBPatel75
    0

  • What exactly does this mean? That the service is down? Or the score will never be available? It seems like I get that and then the tool gets stuck and I get no other data.

    | endlessrange
    0

  • Hi, I have entered a competitor's website www.my-wardrobe.com into Open Site Explorer to see who they get links from, and to my surprise they have a load from Barclays Business Banking. When I visit the page I cannot see the links. But if I search the page's source code for my-wardrobe, there I have it, a link to my-wardrobe.com. How have they done this? Surely Barclays haven't sold them it? And more so, why are they receiving link juice when you can't even see the link on the Barclays page in question - http://www.barclays.co.uk/BusinessBanking/P1242557952664 Thanks. The link in the source looks like this:
    <a href="http://www.my-wardrobe.com" class="popup" title="Link opens in a new window" rel='' onmousedown="dcsMultiTrack('DCS.dcsuri','BusinessBankingfromBarclays/Footer/wwwmywardrobecom', 'WT.ti', '','WT.dl','1');">www.my-wardrobe.com</a>

    | YNWA
    0

  • I have a client who has a website that is based on a magazine. They make their money through advertising. I am primarily an inbound marketer. I would be very grateful if anyone out there has any tips for a site that has been around for quite a while (over 10 years). We are transforming the site from HTML into WordPress, then hosting it with a fast managed WordPress host using a CDN. I feel the lack of links is an obvious place to start; however, if there's anything specific to magazine-based websites, I would be more than grateful to hear your opinions. Thank you all in advance. Sincerely, Thomas von Zickell

    | BlueprintMarketing
    0

  • I found a site that describes how to use Excel to batch-lookup URLs using the SEOmoz API. The only problem is that the SEOmoz API times out and returns 1 if I try dragging the formula down the cells, which leaves me copying, waiting 5 seconds, and copying again. This is basically as slow as manually looking up each URL. Does anyone know a workaround?

    | SirSud
    1
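
One workaround for the timeouts described above is to move the lookups out of Excel into a short script that batches URLs and sleeps between batches, then paste the results back into the sheet. A minimal sketch: the batch size and the 10-second delay are assumptions to tune against your plan's actual rate limit, and `fetch` stands in for the real API call.

```python
import time
from urllib.parse import quote

# Assumed free-tier limit: one request every 10 seconds (check your plan).
RATE_LIMIT_SECONDS = 10

def chunk(items, size):
    """Split a list of URLs into batches of at most `size`."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def lookup_metrics(url_batch, fetch):
    """Look up one batch of URLs; `fetch` performs the actual API call."""
    encoded = [quote(u, safe="") for u in url_batch]  # URLs must be encoded
    return fetch(encoded)

def run(urls, fetch, batch_size=10):
    """Process all URLs, sleeping between batches to avoid timeouts."""
    results = []
    for i, url_batch in enumerate(chunk(urls, batch_size)):
        if i > 0:
            time.sleep(RATE_LIMIT_SECONDS)  # stay under the rate limit
        results.extend(lookup_metrics(url_batch, fetch))
    return results
```

The key point is that the client, not the spreadsheet, controls the pacing, so one slow response no longer cascades into a wall of timeouts.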

  • I was just starting to feel like I was getting traction and all of a sudden half of our external backlinks just disappeared! Over the past 4 months we've taken our total external links from about 2,500 to 6,500, and with the latest report we seem to have lost more than half and are now around 2,100. Our domain authority has dropped accordingly too. I realize that backlinks aren't forever, but what in the heck could cause you to lose over 4,000 in a one-month period?

    | CaliB
    0

  • Does anyone have any suggestions on a good XML Sitemap Generator?  Also interested in best practices and tips for updating the XML Sitemap. I typically have relied on my web developers to do this however it seems that they have not been setting this up with SEO in mind.

    | webestate
    0
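
For the generator question above, a basic sitemap can be produced with a few lines of standard-library Python. A sketch following the sitemaps.org protocol; the URLs, dates, and priorities are examples, and real sites usually generate the page list from their CMS or database.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build sitemap XML from a list of (url, lastmod, priority) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url, lastmod, priority in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod
        ET.SubElement(entry, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")

# Example page list; write the result to sitemap.xml at the site root.
xml = build_sitemap([
    ("http://www.example.com/", "2012-06-01", "1.0"),
    ("http://www.example.com/about", "2012-05-20", "0.5"),
])
```

Regenerating the file on each deploy (or nightly) and referencing it from robots.txt is the usual way to keep it current without relying on a developer to remember.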

  • I have a site that was recently hit by the Google penguin update and dropped a page back. When running the site through seomoz tools, I keep getting duplicate content in the reports for domain.com and domain.com/index.html, even though I have a 301 rewrite condition. When I test the site, domain.com/index.html redirects to domain.com for all directories and root. I don't understand how my index page can still get flagged as duplicate content. I also have a redirect from domain.com to www.domain.com. Is there anything else I need to do or add to my htaccess file? Appreciate any clarification on this.

    | anthonytjm
    0
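
For reference, a belt-and-braces .htaccess sketch for the two redirects described above (domain.com is a placeholder; rule order and existing directives on the real site may change behaviour, so test before deploying):

```apache
RewriteEngine On

# Fold /index.html (in any directory) into the directory URL.
# Matching on THE_REQUEST avoids a loop with DirectoryIndex.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /(.*)index\.html
RewriteRule ^(.*)index\.html$ /$1 [R=301,L]

# Canonicalise the bare domain onto www.
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```

Even with correct redirects, a crawler can keep reporting duplicates if internal links still point at /index.html directly, or if the report predates the redirect; a self-referencing rel="canonical" on the home page is a common extra safeguard.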

  • Hi All, I'd like to have entry to the mozcon event - and I wasn't quick enough to sign up. If you have an entry to sell I'd like to buy it. Thank you. Karl Seidel (SEO guppie)

    | karlseidel
    0

  • Hi All. Our website identifies a list of search engine spiders so that it does not show them session IDs when they come to crawl, preventing the search engines from thinking there is duplicate content all over the place. The SEOmoz crawl has brought up over 20k crawl errors on the dashboard due to session IDs. Could someone please give the details for the SEOmoz bot, so that we can add it to the list on the website and it won't be shown session IDs (and generate all these crawl errors) when it comes to crawl. Thanks

    | blagger
    1
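
Most crawlers are easier to recognise by user-agent string than by IP. Assuming the site can key its session-ID suppression off the user-agent, a sketch of the check; the token list is illustrative (rogerbot is the name commonly given for the SEOmoz/Moz crawler, but confirm against current documentation before allowlisting it).

```python
# Substrings that identify crawlers we never want to hand session IDs.
KNOWN_CRAWLER_TOKENS = [
    "googlebot",
    "bingbot",
    "rogerbot",   # SEOmoz/Moz crawler's user-agent token (verify in docs)
]

def is_known_crawler(user_agent):
    """True if the request's user-agent matches a known crawler token."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_CRAWLER_TOKENS)

def url_for_visitor(base_url, session_id, user_agent):
    """Append the session ID only for ordinary visitors, never crawlers."""
    if is_known_crawler(user_agent):
        return base_url
    return f"{base_url}?sid={session_id}"
```

Cookie-based sessions (with no URL fallback for unrecognised agents) avoid the problem entirely and are the more robust fix.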

  • I'm looking for a tool like this: http://www.internetofficer.com/seo-tool/redirect-check/ that can check hundreds/thousands of URLs and give me a report as to which ones have been redirected. Does anyone know of something that can do this?

    | glass01
    0
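
Absent a dedicated tool, a short script can cover thousands of URLs: request each one without following redirects and record the status code and Location header. A minimal standard-library sketch (no retries or concurrency; a thread pool would be the obvious next step for large lists):

```python
import csv
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the 3xx itself is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_url(url):
    """Return (status_code, redirect_target_or_empty) for a single URL."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=10)
        return resp.status, ""
    except urllib.error.HTTPError as err:   # 3xx/4xx/5xx all land here
        return err.code, err.headers.get("Location", "")

def report(urls, out_file, checker=check_url):
    """Write url,status,location rows for every URL to a CSV file object."""
    writer = csv.writer(out_file)
    writer.writerow(["url", "status", "location"])
    for url in urls:
        status, location = checker(url)
        writer.writerow([url, status, location])
```

Usage would be `report(url_list, open("redirects.csv", "w", newline=""))`; filtering the CSV for 301/302 rows then gives the redirected subset.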

  • I have a question about calls to the API and how these are measured. I noticed that the URL Metrics calls allow a batch of multiple URLs. We're in a position where we need link data for multiple websites; can we request a single row of data with link information for multiple URLs, or do we need to request a unique row for each URL?

    | ssimburg
    0

  • I have just noticed that my sub-domains are ranking higher in Mozrank and Moztrust than the root domain - that seems nuts. Am I doing something wrong?

    | simonberenyi
    0

  • Hi all, hoping that one of you gurus might be able to shed a little light for me please. We launched the online arm of our gold bullion business on the 21st of February and I signed up for an account here on the 23rd of Feb. I don't have a MozRank for my site yet and I'd love to get one. The mozbar that I installed shows 0 links from 0 root domains etc., but Google Webmaster Tools can see links that are inbound to my site. My questions are: Do I have to wait the 45-60 days that I believe it might take SEOmoz to give me a rank, or is there a process that I manually kick off? Is there anything other than Google Webmaster Tools that I should be looking at to try and make sure that I am on the right track? I'd hate to go 45-60 days in the wrong direction before realising there is an issue. Thanks in advance, YGF

    | YGF
    0

  • Is there a way to see the dates a backlink was acquired? Thanks !

    | tinarose
    1

  • Can someone point me to the official article explaining why meta keywords are no longer taken into account by search engines please? I know Moz has indicated that search engines ignore them, but I would like to read a bit more about it - what was the reason behind it and since when.

    | coremediadesign
    0

  • Hi, I have noticed when using OSE that when you enter a domain you very often see a higher Page Authority than Domain Authority. If someone could explain why this happens I would be very grateful - it's my current understanding that Page Authority would ALWAYS be LESS THAN Domain Authority, but that is not always the case (I have seen cases where PA is more than 10 higher than DA). Here's an example where PA > DA: http://www.opensiteexplorer.org/links.html?site=www.primelocation.com Thanks

    | James77
    0

  • I've used the "fetch as googlebot" tool in Google webmaster tools to submit links from my site, but I was wondering if there was any type of tool or submission process like this for submitting links from other sites that you do not own? The reason I ask is, I worked for several months to get a website to accept my link as part of their dealer locator tool. The link to my site was published a few months ago, however I don't think google has found it and the reason could be because you have to type in your zip code to get the link to appear. This is the website that I am referencing: http://www.ranchhand.com/dealers.php?zip=78070&radius=20 (my website is www.rangeroffroad.com) Is there any way for Google to index the link? Any ideas?

    | texmeix
    0

  • Hi all, I work for a large retail brand and we have lots of counterfeit sites ranking for our products. Our legal team seizes the websites from the owners who then setup more counterfeit sites and so forth. As soon as we seize control of a website, the site content is deleted and subsequently it falls out of the SERPs to be immediately replaced by the next lot of counterfeit sites. I need to be able to download a copy of the site before it is seized, so that once I have control of it I can put the content back and hopefully quickly regain the SERPs (with an additional 'counterfeit site' notice superimposed on that page in JS). Does anyone know or can recommend good software to be able to download an entire website, so that it can be easily rehosted? Thanks FashionLux (Edited title to reflect only wanting to download html, CSS and images of site. I don't want the sites to actually be functional - only appear the same to Google)

    | FashionLux
    0
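
For the download question above, the core of any mirroring script is mapping each URL onto a local file path and saving the fetched bytes; rewriting the links between pages is the harder part, which is why dedicated mirroring tools are usually preferred for a pixel-perfect copy. A minimal sketch of the two basic pieces (the directory layout is one possible convention, not a standard):

```python
import os
import urllib.request
from urllib.parse import urlparse

def local_path(url, root="mirror"):
    """Map a URL to a local file path for an offline copy of the site."""
    parsed = urlparse(url)
    path = parsed.path
    if path.endswith("/") or path == "":
        path += "index.html"          # directory URLs become index.html
    return os.path.join(root, parsed.netloc, path.lstrip("/"))

def save_page(url, root="mirror"):
    """Fetch one URL and write it into the mirrored directory tree."""
    dest = local_path(url, root)
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = resp.read()
    with open(dest, "wb") as fh:
        fh.write(data)
    return dest
```

A full mirror would also parse each saved HTML file for links to CSS, images, and further pages, queueing anything on the same host; since the goal here is only that the pages look the same to Google, absolute asset URLs can often be left untouched.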

  • I'm not finding a clear solution to getting 100 results in the Google SERP. I'm currently using Firefox as I use Chrome for being logged into Google (Gmail, Docs, etc.) I tried the Grease Monkey script, no luck. I tried appending the url with &num=100 (looks like that may have been killed in 2010 which may explain why the GM script isn't working). And it doesn't look like any toolbar, SEOMoz included has that functionality, any ideas or places to look? In addition, does anyone have a good SERP numbering tool? If I do a search and the first page, including Local & Organic has 15 results, when I click over to the 2nd page it starts back at #11, kind of defeats the purpose of using a numbering tool 😉 Any help would be appreciated. Thanks much!

    | nsauser
    0

  • Hi everyone, I'm new here - always loved SEOMoz and glad to be part of the Pro community now. I have 2 questions regarding the Canonical URL tag. Some background info: We used to run an OsCommerce store, and recently migrated to Magento. In doing so, we right away created 301 redirects of the old category pages (OsCommerce) to the new category pages (Magento) via the Magento admin. Example: www.example.com/old-widget-category.html
    301 redirected to
    www.example.com/new-widget-category.html In Magento admin, we have enabled the Canonical tag for all product and category pages. Here's how Magento sets up the Canonical tag: The URL of interest which we want to rank is:
    www.example.com/new-widget-category.html However Magento sets up the canonical tag on this page to point to:
    www.example.com/old-widget-category.html When using the SEOmoz On-Page Report Card, it picks this up as an error because the Canonical tag is pointing to a different URL. However, if we dig a little deeper, we see that the URL being pointed to
    www.example.com/old-widget-category.html
    has a 301 redirect to
    www.example.com/new-widget-category.html
    which is the URL we want to rank. So because we set up a 301 redirect of the old page to the new page, on the new page the canonical tag points to the old page. Question 1)
    What are you opinions on this? Do you think this method of setting up the Canonical tag is acceptable? Second question... We use pagination for category pages, so if we have 50 products in one category, we would have 5 pages of 10 products. The URL's would be: www.example.com/new-widget-category.html (which is the SAME as ?p=1)
    www.example.com/new-widget-category.html?p=1
    www.example.com/new-widget-category.html?p=2
    www.example.com/new-widget-category.html?p=3
    www.example.com/new-widget-category.html?p=4
    www.example.com/new-widget-category.html?p=5 Now ALL the URLs above have the canonical tag set as:
    <link rel="canonical" href="http://www.example.com/new-widget-category" /> However, the content of each page (page 1, 2, 3, 4, 5) is different because different products are displayed. So far most what I read regarding the Canonical tag is that it is used for pages that have the same content but different URLs. I would hope that Google would combine the content of all 5 pages and view the result as a single URL www.example.com/new-widget-category Question 2) Is using the canonical tag appropriate in the case described above? Thanks !

    | yacpro13
    0
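
On Question 2 above: one common approach for paginated categories, rather than canonicalising every page to page 1, is a self-referencing canonical on each page plus rel="prev"/"next" tags tying the series together, so the pages are presented as a sequence instead of duplicates. A sketch of generating those tags; the URL pattern matches the ?p= scheme described above, and whether this fits a given Magento setup is a judgment call, not a guarantee.

```python
def pagination_links(base_url, page, total_pages):
    """Build <link> tags for one page of a paginated category.

    Each page keeps a self-referencing canonical; rel=prev/next tie the
    series together instead of pointing every page at page 1.
    """
    def page_url(n):
        # Page 1 is the bare category URL, matching the ?p=1 duplicate.
        return base_url if n == 1 else f"{base_url}?p={n}"

    tags = [f'<link rel="canonical" href="{page_url(page)}" />']
    if page > 1:
        tags.append(f'<link rel="prev" href="{page_url(page - 1)}" />')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{page_url(page + 1)}" />')
    return tags
```

The canonical-to-a-301-target pattern in Question 1 is separate: since the old URL already 301s to the new one, the canonical should simply point at the final (new) URL rather than bouncing through the redirect.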

  • There has been a lot of talk lately about social profiles potentially improving your brand as well as search. What I'd like to know is the best practice for getting those social profiles crawled and indexed so they actually provide a good link to my site. I'm also wondering what the difference is between what Linkscape sees and what Google sees, and when I'm looking at Open Site Explorer's rankings for one of those social profiles, how can I be sure that Google sees it the same way? I ask this because a lot of these profiles are not well internally linked to. An example is about.me: it's a potentially great link, but it's essentially an island, and even after dropping a couple of Twitter links to my profile, Open Site Explorer shows a Page Authority of 1, and it's not even indexed with Google. What I did last night was put links to my about.me, Flickr and wedding wire profiles in the Connect menu drop-down on my site, to hopefully get those crawled soon. Are there other methods of getting those crawled and indexed so they start passing some juice?
    What do you guys do?

    | WilliamBay
    0

  • The system says I have two duplicate page titles. The page titles are exactly the same because the two URLs are exactly the same. These same two identical URLs show up in Duplicate Page Content also - because they are the same. We also have a blog, and there are two tag pages showing identical content - I have blocked the blog in robots.txt now, because the blog is only for writers. I suppose I could have just blocked the tag pages.

    | loopyal
    0

  • Hi Everyone Two quick questions today 1. How can I find out what the different colors within the Keyword Difficulty Report represent, and how can I see examples of how this information can help us with our data analysis? 2. The second question I have is regarding the Term Extractor. It seems that when I ran a domain it provided the wrong data. For example, it stated that a certain keyword exists a certain number of times within the description and title of the page, but when I looked at the source this was not the case, so it made studying the competition harder.
    Any suggestions or has anyone else noticed this? Thanks in advance for all your help.

    | DRTBA
    0

  • I know that words in their 20's or 30's would be ideal, but it's proving hard for me to find relevant keywords with such scores (just a couple with scores in the 30's). Is going for words between 40-50 a waste of time? Thanks.

    | ZakGottlieb71
    1

  • Hi Maybe I am missing this, but I can't seem to see it. I am doing some analysis on a client's site and want to get a CSV list of links from the client's site to external sites. So what I am looking for is a list of outbound links (OBL) from the client's site. I want to run these past a blacklist / bad link neighborhood checking script I have. This would actually be a nice feature in SEOmoz Pro, unless it already exists and I am just missing it or not setting filters correctly. Thanks Trevor

    | tstolber1
    0

  • Hello: I previously asked this question, but I would love to get more perspectives on this issue. In Blogger, there is an archive page and label(s) page(s) created for each main post. Firstly, does Google, esp. considering Blogger is their product, possibly see the archive and tag pages created in addition to the main post as partial duplicate content? The other dilemma is that each of these instances - main post, archive, label(s) - claim to be the canonical. Does anyone have any insight or experience with this issue and Blogger and how Google is treating the partial duplicates and the canonical claims to the same content (even though the archives and label pages are partial?) I do not see anything in Blogger settings that allows altering these settings - in fact, the only choices in Blogger settings are 'Email Posting' and 'Permissions' (could it be that I cannot see the other setting options because I am a guest and not the blog owner?) Thanks so much everyone! PS - I was not able to add the blog as a campaign in SEOmoz Pro, which in and of itself is odd - and which I've never seen before - could this be part of the issue? Are Blogger free blogs not able to be crawled for some reason via SEOmoz Pro?

    | holdtheonion
    0

  • The SEOmoz Site Crawl indicates that we have too many on page links on over 9,970 pages. This is an ecommerce site with a large number of categories. I have a couple of questions regarding this issue: How important is the "too many on page links" factor to SEO? What are some methods of reducing the number of links when there are a large number of categories? We have main categories with dropdown menus currently and have found that they are used to browse and shop the store.

    | afmaury
    1

  • I was wondering if there is a tool out there where you can compile a list of URL resources, upload them in a CSV and run a report to gather and index each individual page. Does anyone know of a tool that can do this or do we need to create one?

    | Brother22
    0

  • I understand that it is easier to rank for a particular keyword given a higher DA score. How fast can  page authority be established and grown for a given keyword if DA is equal to 10/20/30/50? What are the relative measures that dictate the establishment and growth of this authority? Can it be enumerated to a percentage of domain links? or a percentage of domain links given an assumed C-Block ratio? For example you have a website with DA of 40, and you want to target a new keyword, the average PA of the top ranked pages is 30, the average domain links are 1,000, and the average number of linking domains is 250 - if you aim to build 1,000 links per month from 500 linking domains, how fast can you approximate the establishment of page authority for the keyword?

    | NickEubanks
    0

  • In running one of my campaigns in SEOMoz Pro, it was recommended that I reduce the amount of times a keyword is used to 15.  On the actual page, there are fewer than 15, but when you include the number of times it is used in drop-downs from the nav bar, the number is 53. I know there is really no hard and fast rule about how many instances of a keyword make for keyword stuffing and the drop-downs only use the term where needed.  Without it's use, it would be difficult to navigate the site. Is this a problem or should I focus on more important fixes?

    | rdreich49
    0

  • I just signed up for SEOmoz and sent my site through the first crawl. I use the tilde in my rewritten URLs. This threw my entire site into the Notices section as 301 (permanent redirect), since each page redirects to the exact URL with the ~, not the %7e. I find conflicting information on the web - you can use the tilde in more recent coding guidelines where you couldn't in the old. It would be a huge thing to change every page in my site to use an underscore instead of a tilde in the URL. If Google is like SEOmoz and is 301 redirecting every page on the site, then I'll do it, but is it just an SEOmoz thing? I ran my site through Firebug and all my pages show the 200 response header, not the 301 redirect. Thanks for any help you can provide.

    | fdb
    0
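
For what it's worth, RFC 3986 lists the tilde as an unreserved character, so ~ and %7E identify the same resource and modern URL libraries treat them as equivalent. A quick illustration with Python's urllib (this shows the spec-level equivalence, not how any particular crawler chooses to canonicalise):

```python
from urllib.parse import quote, unquote

# "%7E" (or "%7e") decodes to a literal tilde...
assert unquote("/pages/%7Ewidgets") == "/pages/~widgets"
assert unquote("%7e") == "~"

# ...and encoding leaves the tilde alone, because RFC 3986 lists it
# among the unreserved characters (Python 3.7+ behaviour).
assert quote("/pages/~widgets") == "/pages/~widgets"
```

So a tool that reports ~ → %7e (or the reverse) as a 301 is reflecting how the server happens to normalise the URL, not a requirement to drop tildes from the URL scheme.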

  • The SEOmoz keyword reports show week-to-week changes in keyword positions, but what report can I run to see trends over time, so that I can evaluate the effectiveness of our SEO efforts?

    | mhkatz
    0

  • A crawl of my site started on the 8th July & is still going on - is there something wrong???

    | Brian_Worger
    1

  • I've just had a look at the crawl diagnostics and my site comes up with duplicate page content and duplicate titles. I noticed that the URLs all have %5C at the end, which I've never seen before. Does anybody know what that means?

    | Greg80
    0
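
%5C is simply the percent-encoding of a backslash (\). A stray trailing backslash in internal links makes each page reachable under two URLs, which is exactly what produces the duplicate content and duplicate title warnings. A quick decode to illustrate (the example URL is made up):

```python
from urllib.parse import unquote

# "%5C" is a percent-encoded backslash.
assert unquote("%5C") == "\\"

# With a trailing encoded backslash, the crawler sees a second,
# distinct URL for the same page:
url = "http://www.example.com/services%5C"
assert unquote(url) == "http://www.example.com/services\\"
```

The usual fix is to find where the template or CMS is appending the backslash to the links (often an escaping bug) and, as cleanup, 301 the \-suffixed URLs to their clean counterparts.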

  • Is having the company/office's address in the footer (or header) of each webpage important for SEO? Your page for the Geotarget tool says that having the address in this element helps search engines find your location. My question is, how important or relevant is this to SEO? How does knowing the address influence SEO? Is it best SEO practice to put the address in the footer of every webpage? http://www.seomoz.org/geotarget

    | richardstrange
    0

  • I have always been under the impression that top level (or root) domains can hold different domain authority than that of a subdomain - meaning that subdomains and TLDs can hold different ranks and strength in search engine result pages. Is this correct, or just an assumption? If so, when I add a root domain and a subdomain into the campaign manager, why do I get back the same link information and domain authority? www.datalogic.com
    www.automation.datalogic.com Have I made an incorrect assumption, or is this an issue with the SEOmoz campaign manager?

    | kchandler
    0

  • The SEOmoz crawl diagnostic tool is complaining that I'm missing a meta description tag from a file that is an RSS XML file. In my <channel> section I do have a <description> tag. Is this a bug in the SEOmoz tool, or do I need to add another tag to satisfy the warning?

    | scanlin
    0

  • Hey guys, I'm on the free trial for SEOmoz PRO and I'm in love. One question, though. I've been looking all over the internet for a way to check Page Authority in bulk. Is there a way to do this? Would I need the SEOmoz API? And what is the charge? All I really need is a way to check Page Authority in bulk--no extra bells and whistles. Thanks, Brandon

    | thegreatpursuit
    0
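
For bulk lookups like the one asked about above, the Linkscape/Mozscape API of the time used signed-request authentication rather than a simple key. A sketch of building the authentication query string in Python; the signing scheme shown (HMAC-SHA1 over AccessID and Expires, base64-encoded) matches the docs of the era but should be verified against the current API reference, and member-x/secret are placeholders:

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote

def signature_for(access_id, secret_key, expires):
    """HMAC-SHA1 signature over "accessID\\nexpires", base64 + URL-encoded."""
    to_sign = f"{access_id}\n{expires}".encode()
    digest = hmac.new(secret_key.encode(), to_sign, hashlib.sha1).digest()
    return quote(base64.b64encode(digest))

def signed_query(access_id, secret_key, valid_for=300):
    """Build the AccessID/Expires/Signature query string for a request."""
    expires = int(time.time()) + valid_for
    sig = signature_for(access_id, secret_key, expires)
    return f"AccessID={access_id}&Expires={expires}&Signature={sig}"
```

With that in hand, batching is a matter of POSTing a list of URLs to the url-metrics endpoint with this query string appended; charges per call depended on the plan, so the pricing page is the place to check.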
