Best Automated Report?
-
I would like to add a reporting function to my website to offer a bit of value and information to potential clients.
I am thinking along the lines of a simple input form that takes a business name and URL. The output would be a clean, branded (my business name, URL, phone) report that shows opportunities, the lowest-hanging fruit, the most prominent keywords, and any errors.
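For illustration, here is a minimal sketch of what such a report could pull from a submitted URL. It assumes Python with the requests and beautifulsoup4 packages installed, and the fields checked are just examples rather than a full audit:

import requests
from bs4 import BeautifulSoup

def basic_onpage_report(url):
    # Fetch the page and pull out a few easy on-page signals for the branded report.
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    description_tag = soup.find("meta", attrs={"name": "description"})
    h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]
    missing_alt = [img for img in soup.find_all("img") if not img.get("alt")]
    return {
        "url": url,
        "title": title,
        "title_length": len(title),
        "meta_description": description_tag.get("content", "") if description_tag else "",
        "h1_headings": h1s,
        "images_missing_alt": len(missing_alt),
    }

print(basic_onpage_report("http://www.example.com/"))

From a dictionary like that it is straightforward to render a branded HTML or PDF report carrying your own business name, URL and phone number.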
I found this site - http://www.analyticsseo.com/ - but it seems pretty expensive.
Does anyone have any suggestions on another suite that might work?
Many thanks!
-
You could always sign up to SEOmoz and run their reports? Likewise Majestic SEO - they provide great link-building profile reports that are easy to share.
Simon
Related Questions
-
Large site with content silos - best practice for deep indexing silo content
Thanks in advance for any advice/links/discussion. This honestly might be a scenario where we need to do some A/B testing.

We have a massive (5 million page) content silo that is the basis for our long-tail search strategy. Organic search traffic hits our individual "product" pages, and we've divided our silo with a parent category and then secondarily with a field (so we can cross-link to other content silos using the same parent/field categorizations). We don't anticipate, nor expect, top-level category pages to receive organic traffic - most people are searching for the individual/specific product (long tail). We're not trying to rank or get traffic for searches of all products in "category X"; others are competing and spending a lot in that area (head). The intent/purpose of the site structure/taxonomy is to more easily enable bots/crawlers to get deeper into our content silos. We've built the pages for humans, but included the link structure/taxonomy to assist crawlers.

So here's my question on best practices: how to handle categories with 1,000+ pages of pagination. With our most popular product categories, there might be hundreds of thousands of products in one category. My top-level hub page for a category looks like www.mysite/categoryA, and the page shows 50 products and then pagination from 1 to 1,000+. Currently we're using rel=next for pagination, and for pages like www.mysite/categoryA?page=6 we make the page reference itself as canonical (not the first/top page www.mysite/categoryA). Our goal is deep crawling/indexation of our silo.

I use ScreamingFrog and the SEOmoz campaign crawl to sample (the site takes a week+ to fully crawl), and with each of these tools it "looks" like crawlers have gotten a bit "bogged down" in large categories with tons of pagination. For example, rather than crawl multiple categories or fields to reach multiple product pages, some bots will hit all 1,000 (rel=next) pages of a single category. I don't want to waste crawl budget going through 1,000 pages of a single category versus discovering/crawling more categories.

I can't seem to find a consensus on how to approach the issue. I can't have a page that lists "all" - there's just too much - so we're going to need pagination. I'm not worried about category pagination pages cannibalizing traffic, as I don't expect any (should I make pages 2-1,000 noindex and canonically reference the main/first page in the category?). Should I worry about crawlers going deep into pagination within one category versus getting to more top-level categories? Thanks!
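As a rough illustration of how you could sample this yourself (rather than waiting a week for a full crawl), here is a sketch that fetches a handful of paginated URLs and reports what each one declares. It assumes Python with requests and beautifulsoup4, and the category URL pattern and page numbers are just examples:

import requests
from bs4 import BeautifulSoup

def pagination_tags(category_url, page_numbers):
    # For each sampled page, report its canonical URL, meta robots value and rel=next target.
    rows = []
    for page in page_numbers:
        url = f"{category_url}?page={page}"
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        canonical = soup.find("link", rel="canonical")
        robots = soup.find("meta", attrs={"name": "robots"})
        rel_next = soup.find("link", rel="next")
        rows.append({
            "url": url,
            "canonical": canonical.get("href") if canonical else None,
            "meta_robots": robots.get("content") if robots else None,
            "rel_next": rel_next.get("href") if rel_next else None,
        })
    return rows

# Sample a few points deep in the pagination instead of crawling all 1,000+ pages.
for row in pagination_tags("http://www.example.com/categoryA", [1, 2, 50, 500]):
    print(row)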
Moz Pro | | DrewProZ1 -
Difference between Open Site Explorer's Root Domain and Basic SERP Report's Linking Root Domain?
Why do Open Site Explorer and the Basic SERP Report show different linking root domain numbers for the same website? For any given URL, Open Site Explorer reports one count of linking root domains and the Basic SERP Report reports a different one. Which one is correct, and why do they show different linking root domain figures?
Moz Pro | | surabhi60 -
Functionality of SEOmoz crawl page reports
I am trying to find a way to ask SEOmoz staff to answer this question because I think it is a functionality question, so I checked the SEOmoz Pro resources. I also have had no responses to it in the forum either, so here it is again. Thanks very much for your consideration!

Is it possible to configure the SEOmoz Rogerbot error-finding bot (the one that makes the crawl diagnostics reports) to obey the instructions in the individual page headers and in the http://client.com/robots.txt file? For example, there is a page at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2007 that has <meta name="robots" content="noindex"> in the header. This themed Quote of the Day page is intentionally duplicated at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2004 and also at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2010, but they all have <meta name="robots" content="noindex"> in them, so Google should not see them as duplicates, right? Google does not in Webmaster Tools, so it should not be counted 3 times - but it seems to be. How do we generate a report of the actual pages shown in the report as duplicates so we can check? We do not believe Google sees it as a duplicate page, but Roger appears to.

Similarly, one can use http://truthbook.com/contemplative_prayer/ - here the http://truthbook.com/robots.txt tells Google to stay clear. Yet we are showing thousands of duplicate page content errors, while Google Webmaster Tools has shown only a few hundred, configured as described. Anyone? Jim
Moz Pro | | jimmyzig0 -
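For anyone wanting to double-check what a well-behaved crawler should be respecting on a given URL, here is a rough sketch that reads robots.txt and the page's meta robots tag. It assumes Python with requests and beautifulsoup4, and the user-agent string is only an example; it says nothing about how Rogerbot itself is configured:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

def crawl_directives(url, user_agent="rogerbot"):
    # Check whether robots.txt allows this URL for the given user agent.
    parsed = urlparse(url)
    robots = RobotFileParser()
    robots.set_url(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
    robots.read()
    allowed = robots.can_fetch(user_agent, url)
    # Check the meta robots tag on the page itself.
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    return {
        "url": url,
        "robots_txt_allows": allowed,
        "meta_robots": meta.get("content", "") if meta else "",
    }

print(crawl_directives("http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2007"))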
Crawl report - duplicate page title/content issue
When the crawl report is finished, it says that there are duplicate content/page title issues. However, there is a canonical tag that is formatted correctly, so I just wondered whether this is a bug or whether anyone else is having the same issue. For example, I'm getting an error warning for this page: http://www.thegreatgiftcompany.com/categories/categories_travel?sort=name_asc&searchterm=&page=1&layout=table
Moz Pro | | KarlBantleman0 -
After I make corrections from my crawl diagnostics report, how can I tell if those corrections "took"? Is there a way to immediately refresh that report? Will it eventually refresh?
I have made corrections based on the crawl diagnostics report. Can I refresh this report? I would like to see whether my corrections worked. Thanks in advance for your answer!
Moz Pro | | Bob550 -
Help with URL parameters in the SEOmoz crawl diagnostics Error report
The crawl diagnostics error report is showing tons of duplicate page titles for my pages that have filtering parameters. These parameters are blocked inside Google and Bing Webmaster Tools. How do I block them within the SEOmoz crawl diagnostics report?
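While waiting for an answer, one workaround is to post-process the exported duplicate-title list yourself and strip the filtering parameters, to see how many of the flagged URLs collapse to the same page. A small sketch of that, using only Python's standard library and purely hypothetical parameter names and URLs:

from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

FILTER_PARAMS = {"sort", "color", "size", "page"}  # hypothetical filtering parameters

def strip_filter_params(url):
    # Return the URL with the known filtering parameters removed.
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

flagged = [
    "http://www.example.com/shoes?sort=price_asc&color=red",
    "http://www.example.com/shoes?color=blue",
    "http://www.example.com/shoes",
]
print({strip_filter_params(u) for u in flagged})  # all three collapse to a single URL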
Moz Pro | | SunshineNYC0 -
Plurals and the SEOmoz On Page Report Card
I have some pages on my website that receive an "A" grade with the SEOmoz On Page Report Card if I use a plural term in a phrase (say, "landscape designers in Dallas") but an "F" grade if I use the singular term (say, "landscape designer in Dallas"). Is this a true and accurate reflection of how search engines see the page? Am I going to rank well for plural terms but not for singular ones, given the specific on-page content I'm using in this example? Or is this an opportunity for SEOmoz to tweak its algorithm to better reflect the search engines?
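As an illustration of why an exact-phrase check could produce this split (this is only a guess at the behaviour, not how the On Page Report Card actually scores), consider a page whose copy only ever uses the plural form. A tiny Python sketch with made-up page copy:

page_copy = "We are landscape designers in Dallas with over 20 years of experience."

for phrase in ("landscape designers in dallas", "landscape designer in dallas"):
    # A naive exact-phrase match finds the plural form but not the singular one.
    print(phrase, "->", "found" if phrase in page_copy.lower() else "not found")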
Moz Pro | | ahirai1