What automated tools produce a numeric score for A) On-Page Optimization and B) Domain Exact Match?
SEOMoz Community,

I currently use SEMRush, SEOMoz Open Site Explorer, and Archive.org for the other Benefit and Opposition factors. I've had to manually search pages for keyword use in the title, footer, and body for on-page optimization, and I've also manually searched Google for Domain Exact Matches.

Thanks!
Andrew
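The raw manual checks described above are straightforward to script while waiting for a proper scoring tool. Below is a minimal Python sketch, assuming the `requests` and `beautifulsoup4` packages are available; the example URL and keyword are hypothetical, and the DNS probe is only a crude proxy for whether an exact-match domain is taken:

```python
# Hedged sketch: automate the two manual checks from the question.
# (a) Does a keyword appear in a page's <title>, <footer>, and body?
# (b) Does the exact-match domain for a keyword resolve in DNS?
import socket
import requests
from bs4 import BeautifulSoup

def keyword_usage(url: str, keyword: str) -> dict:
    """Report where the keyword appears on the page."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    kw = keyword.lower()
    title = (soup.title.get_text() if soup.title else "").lower()
    footer = " ".join(f.get_text(" ") for f in soup.find_all("footer")).lower()
    body = soup.get_text(" ").lower()
    return {"in_title": kw in title, "in_footer": kw in footer, "in_body": kw in body}

def emd_taken(keyword: str, tld: str = ".com") -> bool:
    """Crude exact-match-domain probe: does keyword.tld resolve at all?"""
    try:
        socket.gethostbyname(keyword.replace(" ", "") + tld)
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    # Hypothetical example values
    print(keyword_usage("https://example.com", "widgets"))
    print(emd_taken("widgets"))
```

This only reproduces the yes/no checks; a numeric score of the kind asked about would still need a weighting scheme on top, or an API from one of the tools already in use.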
Related Questions
Combine poorly ranking pages into a single page?
I'm doing on-page optimization for an apartment management company, and they have about seven apartments listed on their site. Rather than putting everything on one page (/apartments/apartment-name/), they have the following setup:

/apartments/apartment-name/contact
/apartments/apartment-name/features
/apartments/apartment-name/availability
/apartments/apartment-name/gallery
/apartments/apartment-name/neighborhood

With very few exceptions, none of these pages appear to rank for anything, and those that do either rank very poorly for seemingly random keywords or rank for keywords like the apartment complex name (alongside the main landing page for the complex). I'm inclined to recommend combining the pages into a single one that contains all the information, eliminates the chance of duplicate content (all of the neighborhood pages contain the same content verbatim), and prevents keyword cannibalization. Thoughts? Thanks.
On-Page Optimization | Alces1
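Before consolidating, one way to confirm the verbatim duplication mentioned in that question is to hash the visible text of each subpage and flag collisions. A minimal Python sketch, assuming `requests` and `beautifulsoup4`; the domain and apartment slugs are placeholders, not the actual site:

```python
# Hedged sketch: flag subpages whose visible text is byte-for-byte identical.
# The domain and slugs below are placeholders for illustration.
import hashlib
import requests
from bs4 import BeautifulSoup

APARTMENTS = ["oak-court", "maple-house"]  # hypothetical apartment slugs
SECTIONS = ["contact", "features", "availability", "gallery", "neighborhood"]

def text_hash(url: str) -> str:
    """Hash the page's visible text so identical copy hashes identically."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    text = " ".join(soup.get_text(" ").split())  # collapse whitespace
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

seen: dict[str, str] = {}
for apt in APARTMENTS:
    for section in SECTIONS:
        url = f"https://example.com/apartments/{apt}/{section}"
        digest = text_hash(url)
        if digest in seen:
            print(f"DUPLICATE: {url} matches {seen[digest]}")
        else:
            seen[digest] = url
```

If the pages are then merged, each retired URL should 301 to the consolidated page so any equity it has carries over.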
Google Search Console issue: "This is how Googlebot saw the page" showing part of page being covered up
Hi everyone! Kind of a weird question here, but I'll ask and see if anyone else has seen this. In Google Search Console, when I do a fetch-and-render request for a specific site, the fetch and blocked resources all look A-OK. However, in the render there's a large grey box (the background of the navigation) that covers up a significant amount of what is on the page. I'm attaching a screenshot; you can see the text start peeking out below it (I had to trim the image for confidentiality reasons). Behind that block of grey there IS text, and text that Googlebot apparently does see and can crawl in the fetch.

My question: is this an issue? Should I be concerned about this visual rendering? I've never experienced an issue like this before. I will say that I'm trying to make a play for a featured snippet and can't seem to get Google to display this page's information, despite it being the first result, while the query shows a featured snippet from the #4 result. I know a snippet isn't guaranteed for the #1 result, but I wonder if this has anything to do with why it isn't showing one.
On-Page Optimization | ChristianMKG0
Confused by Moz page grading tool
Can anyone shed any light on why Moz grades this page an F: http://www.traditional-cleaning.co.uk/cleaning-in-tynemouth.htm for 'cleaners in tynemouth' and 'cleaning in tynemouth'? Many thanks!
On-Page Optimization | EdwardoUK0
Optimizing for Bing
My keyword rankings often vary significantly between Google and Bing. I feel comfortable with my Google rankings, but many competitors that I outrank in Google rank much higher than my site in Bing. Any tips? www.hodgesbadge.com
On-Page Optimization | GaryQ0
Locating Duplicate Pages
Hi, Our website consists of approximately 15,000 pages; however, according to our Google Webmaster Tools account, Google has around 26,000 pages for us in its index. I have run through half a dozen sitemap generators and they all discover only the 15,000 pages that we know about. I have also thoroughly gone through the site to attempt to find any sections where we might be inadvertently generating duplicate pages, without success.

It has been over six months since we made any structural changes (at which point we 301'd to the new locations), so I'd like to think that the majority of these old pages have been removed from Google's index. Additionally, the number of pages in the index doesn't appear to be going down by any discernible amount week on week.

I'm certain it's nothing to worry about; however, for my own peace of mind I'd like to confirm that the additional 11,000 pages are just old results that will eventually disappear from the index and that we're not generating any duplicate content. Unfortunately there doesn't appear to be a way to download a list of the 26,000 pages that Google has indexed so that I can compare it against our sitemap. Obviously I know about site:domain.com, but this only returns the first 1,000 results, which all check out fine.

I was wondering if anybody knew of any methods or tools that we could use to attempt to identify these 11,000 extra pages in the Google index, so we can confirm that they're just old pages which haven’t fallen out of the index yet and aren’t going to cause us a problem? Thanks guys!
On-Page Optimization | ChrisHolgate0
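Once a list of indexed URLs has been assembled by some means (e.g. scripted paging through site: results or a landing-page export from analytics, since Google offers no bulk download), the comparison against the sitemap is easy to script. A minimal Python sketch; the file names are hypothetical:

```python
# Hedged sketch: diff a sitemap against an externally obtained list of
# indexed URLs. "indexed_urls.txt" is a hypothetical export, one URL per line.
import xml.etree.ElementTree as ET

SM_LOC = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

def sitemap_urls(path: str) -> set[str]:
    """Collect <loc> entries from a standard sitemap.xml file."""
    return {el.text.strip() for el in ET.parse(path).iter(SM_LOC) if el.text}

def listed_urls(path: str) -> set[str]:
    """Read a plain-text file with one URL per line."""
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

known = sitemap_urls("sitemap.xml")        # the ~15,000 known pages
indexed = listed_urls("indexed_urls.txt")  # hypothetical export of indexed URLs

extras = indexed - known  # indexed but not in the sitemap: the mystery pages
print(f"{len(extras)} indexed URLs are missing from the sitemap:")
for url in sorted(extras):
    print(url)
```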
Editing Author Pages
Hi, Quick question regarding author pages. I have the blog set with me as the author, so the URL is mysite.com/blog/author/miles/. Now, SEOMoz has picked up that my author page is missing a meta description, but this cannot be edited through WordPress as there is no edit option available. I may really be missing something, but where can I alter the author page? I have a feeling it might be fed from G+, but I'm not really sure what part of G+ is used as the description. Thanks Miles
On-Page Optimization | easyrider20
Duplicate page
Just getting started and had a question regarding one of the reports. It is telling me that I have duplicate pages, but I'm not sure how to resolve that.
On-Page Optimization | KeylimeSocial0
Cross-domain canonical
Hi, We've got a German e-commerce site on an .at domain and would like to have a copy on a .de domain, as we expect higher conversions from German users there. The idea would be to make use of the cross-domain canonical tag: create a "duplicate" on the .de domain and add a canonical tag on all of its pages referring to the original content on the .at domain. That would mean the .de won't rank, but German users could see the .de domain and Austrian users the .at domain in the address bar, and everybody could feel "at home"... that's the theory. What do you guys think? Valid strategy?
On-Page Optimization | gmellak0
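If the mirror goes ahead, it would be worth verifying programmatically that every .de page's canonical really points at its .at counterpart, since a stray self-referencing canonical on the copy would undermine the whole setup. A minimal Python sketch, assuming `requests` and `beautifulsoup4`; the domains and paths are placeholders:

```python
# Hedged sketch: confirm each page on the .de mirror declares a canonical
# pointing at the .at original. Domains and paths are placeholders.
from urllib.parse import urlparse
import requests
from bs4 import BeautifulSoup

PATHS = ["/", "/produkte/", "/kontakt/"]  # hypothetical sample of site paths
MIRROR = "https://www.example.de"
ORIGINAL_HOST = "www.example.at"

for path in PATHS:
    soup = BeautifulSoup(requests.get(MIRROR + path, timeout=10).text,
                         "html.parser")
    link = soup.find("link", rel="canonical")
    canonical = link["href"] if link and link.has_attr("href") else None
    ok = canonical is not None and urlparse(canonical).netloc == ORIGINAL_HOST
    status = "OK" if ok else "CHECK"
    print(f"{status}: {MIRROR + path} -> {canonical}")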