How can I track multiple pages using SEOmoz?
-
Hi there,
I set up my domain along with 20 keywords to target. One of the first things I learned is not to focus on too many keywords per page. For this reason I decided to create 5 sub-pages and link to them from my index page.
My questions to you:
1. Does SEOmoz track the keyword rankings for the URL entered only?
2. If the answer to 1 is yes, would I have to set up those 5 extra URLs as new sites along with their 3 to 5 keywords?
Thank you,
-
Hi Dennis,
SEOmoz will track rankings for all of the pages crawled within the root domain or subdomain that you have entered for your campaign, so the answer is NO.
If you are seeing different results when you check rankings manually in Google, keep in mind that the SEOmoz Ranking report provides non-personalized results. The way embedded local results appear in the SERPs also affects what is counted: normal results with an enhanced local listing are counted, but results appearing in a 6-, 8-, or 12-pack are not. While Google continues to experiment with local listings, the SEOmoz team continues to respond with updates.
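If you want to see roughly what a non-personalized check looks like yourself, one quick approach (a minimal sketch, assuming Google's `pws=0`, `gl`, and `hl` query parameters, which depersonalize and pin the country/language of a query) is to build the search URL explicitly and compare it against your normal logged-in results:

```python
from urllib.parse import urlencode

def depersonalized_search_url(keyword, country="us", language="en"):
    """Build a Google search URL approximating a non-personalized,
    location-neutral query, similar to what a rank tracker fetches."""
    params = {
        "q": keyword,
        "pws": "0",     # turn off personalized web search
        "gl": country,  # country for results
        "hl": language, # interface language
        "num": "50",    # more results per page makes rank-checking easier
    }
    return "https://www.google.com/search?" + urlencode(params)

print(depersonalized_search_url("blue widgets"))
```

Open the generated URL in a logged-out browser session and note where your page ranks; that number should sit much closer to what the Ranking report shows.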
Hope that helps,
Sha
PS - If you want to optimize different pages for your keywords, you can click "Report Card" in the On-page Analysis tool. You will see a selector at the top where you can type in any URL and choose any keyword from your list.
-
Hello
To track URLs from your website's index page for different keywords, you can use the Rank Tracker tool:
http://www.seomoz.org/rank-tracker
Enter the keyword as the search term, enter your URL, and check "Entire subdomain"; that way you can check the ranking for each keyword.
I have not understood the second question. A new site? If so, I think you must create a new campaign.
Ciao
Related Questions
-
Multiple Countries, Same Language: Receiving Duplicate Page & Content Errors
Hello! I have a site that serves three English-speaking countries, using subfolders for each country version:
United Kingdom: https://site.com/uk/
Canada: https://site.com/ca/
United States & other English-speaking countries: https://site.com/en/
The site displayed depends on where the user is located, and users can also change the country version by using a drop-down flag element in the navigation bar. If a user switches versions using the flag, the first URL of the new language version includes a language parameter, like: https://site.com/uk/blog?language=en-gb
In the Moz crawl diagnostics report, this site is getting dinged for lots of duplicate content because the crawler is finding both versions of each country's site, with and without the language parameter. However, the site has rel="canonical" tags set up on both URL versions, and none of the URLs containing the "?language=" parameter are getting indexed. So... my questions:
1. Are the Duplicate Title and Content errors found by the Moz crawl diagnostic really an issue?
2. If they are, how can I best clean this up?
Additional notes: the site currently has no sitemaps (XML or HTML), and is not yet using the hreflang tag. I intend to create sitemaps for each country version, like: .com/en/sitemap.xml, .com/ca/sitemap.xml, .com/uk/sitemap.xml. I thought about putting a 'nofollow' tag on the flag navigation element, but since no sitemaps are in place I didn't want to accidentally cut off crawler access to alternate versions. Thanks for your help!
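Since the question mentions planning per-country sitemaps without hreflang yet, the two can be combined: hreflang annotations can live inside the sitemap itself as xhtml:link alternates. A rough sketch of generating one page's entries, assuming the hypothetical site.com structure and en/en-ca/en-gb language codes from the question:

```python
# Hypothetical country versions from the question; adjust to your CMS.
COUNTRY_VERSIONS = {
    "en-gb": "https://site.com/uk",
    "en-ca": "https://site.com/ca",
    "en": "https://site.com/en",  # US and other English speakers
}

def hreflang_url_entry(path):
    """Return one sitemap <url> block per country version of `path`,
    each listing every alternate so crawlers can pair the versions up.
    The blocks belong inside a <urlset> that declares the xhtml
    namespace (xmlns:xhtml="http://www.w3.org/1999/xhtml")."""
    blocks = []
    for own_base in COUNTRY_VERSIONS.values():
        lines = ["<url>", f"  <loc>{own_base}{path}</loc>"]
        for lang, base in COUNTRY_VERSIONS.items():
            lines.append(
                f'  <xhtml:link rel="alternate" hreflang="{lang}" href="{base}{path}"/>'
            )
        lines.append("</url>")
        blocks.append("\n".join(lines))
    return blocks

for block in hreflang_url_entry("/blog"):
    print(block, end="\n\n")
```

Note that every version must list all alternates, including itself, or the annotations are ignored.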
Moz Pro | Allie_Williams
-
Have a campaign, but it states only 1 page has been crawled by SEOmoz bots. What needs to be done to have all the pages crawled?
We have a campaign running for a client in SEOmoz, and only 1 page has been crawled per SEOmoz's data. There are many pages on the site and a new blog with more articles posted each month, yet Moz is not crawling anything aside from maybe the home page. The odd thing is, Moz is reporting data on all the other inner pages for errors, duplicate content, etc. What should we do so all the pages get crawled by Moz? I don't want to delete and start over, as we followed all the steps properly when setting up. Thank you for any tips here.
Moz Pro | WhiteboardCreations
-
Duplicate page report
We ran a CSV export of our crawl diagnostics related to duplicate URLs after waiting 5 days with no response on how Rogerbot can be made to filter. My IT lead tells me the label on the spreadsheet says "duplicate URLs", and that is, literally, what the spreadsheet is showing: it treats a database ID number as the only meaningful part of a URL. To replicate, just filter the spreadsheet for any ID number you see on a page. For example, filtering for 1793 gives us the following result:
http://truthbook.com/faq/dsp_viewFAQ.cfm?faqID=1793
http://truthbook.com/index.cfm?linkID=1793
http://truthbook.com/index.cfm?linkID=1793&pf=true
http://www.truthbook.com/blogs/dsp_viewBlogEntry.cfm?blogentryID=1793
http://www.truthbook.com/index.cfm?linkID=1793
There are a couple of problems with the above: 1. It gives the www result as well as the non-www result. 2. It sees the print version (&pf=true) as a duplicate, but these are blocked from Google via the noindex header tag. 3. It treats different sections of the website with the same ID number (faq / blogs / pages) as the same thing. In short, this particular report tells us nothing at all. I am trying to get a perspective from someone at SEOmoz on whether he is reading the result correctly or is missing something. Please help. Jim
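Until the report itself improves, one way to separate real duplicates from ID collisions is to post-process the exported CSV. A rough sketch, assuming the only duplicate patterns are the www/non-www split and the &pf=true print flag Jim describes:

```python
from urllib.parse import urlparse, parse_qsl, urlencode

def normalize(url):
    """Collapse www vs non-www and drop the pf (print) parameter.
    Two URLs that normalize to the same string are genuine duplicates;
    different paths (faq/ vs blogs/) stay distinct even with the same ID."""
    parts = urlparse(url)
    host = parts.netloc.lower().removeprefix("www.")
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != "pf"]
    return f"{host}{parts.path}?{urlencode(query)}" if query else f"{host}{parts.path}"

urls = [
    "http://truthbook.com/index.cfm?linkID=1793",
    "http://truthbook.com/index.cfm?linkID=1793&pf=true",
    "http://www.truthbook.com/index.cfm?linkID=1793",
    "http://truthbook.com/faq/dsp_viewFAQ.cfm?faqID=1793",
]
groups = {}
for u in urls:
    groups.setdefault(normalize(u), []).append(u)
# The first three collapse into one group; the faq URL stays separate.
for key, members in groups.items():
    print(key, len(members))
```

Groups with more than one member are the duplicates actually worth acting on; everything else in the report is just an ID coincidence.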
Moz Pro | jimmyzig
-
Question about SEOMoz Pro and Root Domain vs. Subdomain tracking
I currently have two Pro campaigns set up. They are both tracking the root domains of two different e-commerce sites. I am also tracking three competitors for each company in each campaign. Those are set up by subdomain, like www.Competitor.com. So in my historical link analysis I am comparing MyRootDomain.com against www.competitor1.com, www.competitor2.com and www.competitor3.com. Is this a problem? Would it be better for me to switch my company campaigns to track subdomains too, or to switch my competitor tracking to root domains? This is probably pretty rudimentary, but it never even occurred to me until just now. I realize that if I switch to subdomains for my own company tracking, this would necessitate setting up a completely new campaign, which would be a problem because I am maxed out on my 1,000 keywords. Last but not least, does the fact that I have been tracking my own root domains against competitors' subdomains mean all of my competitive domain and link analysis is, well, garbage, because I haven't really been comparing the same things?
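The practical difference between the two scopes can be illustrated with a toy matcher. This is an illustration only, not SEOmoz's actual crawl-scoping logic, and it is deliberately naive about two-part TLDs like .co.uk:

```python
from urllib.parse import urlparse

def in_scope(page_url, campaign, scope):
    """Would `page_url` count toward a campaign tracking `campaign`?
    scope="subdomain" -> the hostname must match exactly (www.competitor.com)
    scope="root"      -> any subdomain of the root domain counts
    Naive sketch: assumes simple .com-style domains."""
    host = urlparse(page_url).netloc.lower()
    if scope == "subdomain":
        return host == campaign
    root = ".".join(campaign.split(".")[-2:])  # crude root-domain guess
    return host == root or host.endswith("." + root)

print(in_scope("http://blog.competitor1.com/post", "www.competitor1.com", "subdomain"))  # False
print(in_scope("http://blog.competitor1.com/post", "competitor1.com", "root"))           # True
```

In other words, a root-domain campaign counts links and rankings for blog., shop., www., and every other subdomain, while a subdomain campaign counts only the one hostname, so mixing the two scopes does skew a head-to-head comparison whenever a competitor keeps content on extra subdomains.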
Moz Pro | danatanseo
-
Is there a tool to upload multiple URLs and gather statistics and page rank?
I was wondering if there is a tool out there where you can compile a list of URL resources, upload them in a CSV and run a report to gather and index each individual page. Does anyone know of a tool that can do this or do we need to create one?
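If no off-the-shelf tool fits, the skeleton of a homemade one is small: read the CSV, dedupe the URLs, look each one up, write a report. A sketch with the metric lookup stubbed out, since which API you would call for statistics is an open question:

```python
import csv

def fetch_metrics(url):
    """Stub: replace with calls to whatever metrics API you have access
    to. Returns placeholder data so the pipeline runs end to end."""
    return {"url": url, "status": "pending"}

def run_report(in_path, out_path):
    """Read one URL per CSV row, dedupe preserving order, and write a
    metrics report CSV. Returns the report rows for inspection."""
    with open(in_path, newline="") as f:
        urls = [row[0].strip() for row in csv.reader(f) if row and row[0].strip()]
    seen, report = set(), []
    for url in urls:
        if url not in seen:
            seen.add(url)
            report.append(fetch_metrics(url))
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["url", "status"])
        writer.writeheader()
        writer.writerows(report)
    return report
```

With `fetch_metrics` swapped for a real API call (rate-limited, since most metric APIs throttle), this covers the upload-and-report loop described in the question.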
Moz Pro | Brother22
-
Redirecting duplicate .asp pages??
Hi all, I have a bit of a problem with duplicate content on our website. The CMS has been creating identical duplicate pages depending on which menu route a user takes to get to a product (i.e. via the side menu button or the top menu bar). Anyway, the web design company we use are sorting it out going forward, and creating 301 redirects on the duplicate pages. My question is, some of the duplicates take two different forms. E.g. for the home page: www.<my domain>.co.uk, www.<my domain>.co.uk/index.html, www.<my domain>.co.uk/index.asp. Now I understand the 'index.html' page should be redirected, but does the 'index.asp' need to be redirected also? What makes this more confusing is that when I run the SEOmoz diagnostics report (which brought my attention to the duplicate content issue in the first place - thanks SEOmoz), not all the .asp pages are identified as duplicates. For example, the above 'index.asp' page is identified as a duplicate, but 'contact-us.asp' is not highlighted as a duplicate of 'contact-us.html'? I'm a bit new to all this (I'm not an IT specialist), so any clarification anyone can give would be appreciated. Thanks, Gareth
Moz Pro | gdavies09031977
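The decision the redirects in the .asp question need to make can be sketched like this: if index.asp resolves and serves the same content as the home page, it needs the same 301 as index.html. A rough illustration with a hypothetical domain, not the web design company's actual code or a server configuration:

```python
def canonical_redirect(host, path):
    """Return the 301 target for a request, or None if no redirect is
    needed. Collapses the duplicate home-page variants (/index.html and
    /index.asp) and, while we're at it, www vs non-www."""
    target_host = host.removeprefix("www.")  # picking non-www as canonical is a choice, not a rule
    target_path = "/" if path.lower() in ("/index.html", "/index.asp") else path
    if (target_host, target_path) != (host, path):
        return f"http://{target_host}{target_path}"
    return None

print(canonical_redirect("www.example.co.uk", "/index.asp"))    # -> http://example.co.uk/
print(canonical_redirect("example.co.uk", "/contact-us.html"))  # -> None
```

The contact-us.asp case would be handled by adding its path pair to the same mapping once it is confirmed to serve duplicate content; the crawler not flagging it yet is no guarantee it isn't a duplicate.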
Yellow Pages
We have just made a yellow pages site, and in 3 weeks Google has indexed only 1,700 pages out of 18,000. What can we do to get Google to index all the pages, and how does the process work? yellowpages.naitazi.com Regards
Moz Pro | razasaeed