Concerned About Individual Pages
-
Okay. I've set up a campaign for www.site.com and added a list of keywords, so after the initial crawl we'll have some results.
What I'm looking for, though, is how individual pages on my site rank for the list of keywords given, and then being able to go to a screen in SEOmoz with data for that particular page, along with recommendations and so on.
Is this what's going to happen, or do I need to create a campaign for each URL I want to track?
If everything works as I'd like in the example above, should I then add a second list of keywords that some other pages should rank for?
Will it turn into a big mess, or can I relate the keywords to pages in some way?
It seems like what I'm looking for is exactly what this program should do...
Thanks!
-
Hi Dana,
Without knowing anything about the site or the keywords, I would deduce that the url_1 page has the best link profile from external sources for KW1...
You will be able to see this by running the three URLs through Open Site Explorer. This will help you understand your off-site optimisation.
I would imagine url_1 has more links than the other two, and/or the sources of those links are more relevant (or are sites with higher Page/Domain Authority), and the ratio of keyword-rich anchor text to generic anchor text is better (I would aim for one keyword backlink to every 3-4 generic ones).
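To make that rule of thumb concrete: given an anchor-text export for a page (e.g. a CSV from Open Site Explorer), you can tally the keyword-vs-generic split yourself. A minimal sketch — the anchor texts and target keyword below are made up for illustration, not taken from the site in question:

```python
from collections import Counter

# Hypothetical anchor texts from a backlink export for one page;
# "blue widgets" stands in for the target keyword (KW1).
anchors = [
    "blue widgets", "click here", "www.site.com", "read more",
    "blue widgets", "this site", "site.com", "homepage",
]
target_keyword = "blue widgets"

# Count how many anchors exactly match the keyword vs everything else.
counts = Counter(anchor == target_keyword for anchor in anchors)
keyword_links = counts[True]
generic_links = counts[False]

# The 1:3-4 heuristic above: one keyword backlink per 3-4 generic ones.
print(f"{keyword_links} keyword vs {generic_links} generic "
      f"(ratio 1:{generic_links / keyword_links:.1f})")
```

With the sample data this reports a 1:3.0 ratio, which sits inside the 1:3-4 range suggested above; a real check would of course use the full export and fuzzier matching (partial-match anchors, brand terms, and so on).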
At the end of the day, the reason SEO is such a busy and exciting industry is that there are many signals that help a page rank. If there were an exact answer to your question, the industry wouldn't exist.
Hope this helps,
Dan
-
I have plugged a few KWs into the tool with a specific URL. Most give me a D, which is kind of what I figured.
So I wanted to check something out and here are my results:
site.com/url_1 KW1 = Grade D & Google rank is page1 pos7
site.com/url_2 KW1 = Grade A & Google rank is nonexistent
site.com/ KW1 = Grade A & Google rank is page2 pos6
These pages have all been live for about six years; the meta tags have never changed, and the page content has changed only slightly over time.
Please help me understand why SEOmoz results != Google results.
Thank You
-
For the On-Page Optimization tool, it is one keyword per page.
-
One keyword at a time, or is there a way to use Research Tools -> On-Page Optimization for a list of keywords?
-
You are running a report on your domain, not on individual pages. The campaign will report errors across all pages, but its basis is your domain. If you want to run an individual report on how optimized each page is, use:
Research Tools -> On-Page Optimization tool
and insert the target keyword and the page you are trying to optimize.
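Since the tool takes one keyword and one URL per run, a simple way to keep keywords related to pages (and avoid the "big mess" mentioned earlier) is to maintain your own mapping and work through it pair by pair, recording the grade next to each pair. A minimal sketch — the URLs and keyword names are placeholders, not real Moz data:

```python
# Hypothetical keyword-to-page assignments: give each page one
# primary keyword so On-Page Optimization runs stay unambiguous.
page_keywords = {
    "http://www.site.com/url_1": "KW1",
    "http://www.site.com/url_2": "KW2",
    "http://www.site.com/": "KW3",
}

# Work through the pairs; each one is a single run of
# Research Tools -> On-Page Optimization in the Moz interface.
for url, keyword in page_keywords.items():
    print(f"Check {url} for '{keyword}'")
```

A spreadsheet does the same job; the point is only that the keyword-to-page relationship lives somewhere explicit rather than in your head.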