Unsolved How many sites can I track with one subscription?
-
Hello,
We are currently a Moz Pro Medium member and we are tracking amlrightsource.com, but we have other sites we'd like to track as well. Can we track more sites with this subscription?
-
The short answer is: you can track up to 10 websites on your Medium subscription, @kassandrasharr.
-
@kassandrasharr
Hi Kassandra!
You can compare the pricing and features of our available Moz Pro plans here (a Campaign = a site you can track).
If you're unsure which plan you're currently on, you can check through your Moz account.
If you're hitting any limits, you can also purchase subscription add-ons. We've got a guide to purchasing more allowances for your Moz Pro plan here. If you'd like any further guidance, please reach out to our help team: https://moz.com/help/contact
-
Related Questions
-
Unsolved 403 errors for assets which work fine
Hi,
I am facing an issue with our Moz Pro account. We have images stored in an S3 bucket, e.g. https://assets2.hangrr.com/v7/s3/product/151/beige-derby-cotton-suit-mb-2.jpg
Hundreds of such images show up in the Link Opportunities - Top Pages tool as 403, but all of these images work fine and return status 200. I can't seem to solve this. Thanks.
Moz Tools | Skites2
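A quick way to narrow this down is to check whether the 403 is user-agent dependent: S3/CDN rules sometimes block bot user agents while serving the same asset normally to browsers. A minimal sketch, assuming Python with the requests library ("rogerbot" below is only a stand-in for Moz's crawler user agent, not its exact string):
```python
# Hypothetical diagnostic: compare the status code an asset returns to a
# browser-like request vs. a crawler-like one. If the bot user agent gets a
# 403 while the browser gets a 200, the block is in the bucket/CDN rules.
import requests

ASSET_URL = "https://assets2.hangrr.com/v7/s3/product/151/beige-derby-cotton-suit-mb-2.jpg"

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
    "crawler": "rogerbot",  # stand-in for Moz's crawler user agent
}

for label, ua in USER_AGENTS.items():
    resp = requests.head(ASSET_URL, headers={"User-Agent": ua},
                         allow_redirects=True, timeout=10)
    print(f"{label:8s} User-Agent -> HTTP {resp.status_code}")
```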
-
When will Moz support GA4?
Google has announced that it will stop reporting new "Universal Analytics" data as of July 1, 2023. Does that mean that, at that time, Moz "traffic" results will become irrelevant?
Reporting & Analytics | dcmike0
-
Unsolved Log in with Safari
Hello, in the last month or so I seem to no longer be able to log into Moz using Safari 15 on my Mac. I have content blockers disabled (though that shouldn't be a requirement anyway). Please fix!
Product Support | duncanwilcox0
-
How can I check location search in Google?
Hi guys, how can I check location-wise results in Google? Can you please help me? Thanks, Akhilesh
Moz Pro | dotlineseo1
-
Can you explain social indicators in Open Site Explorer
My Google +1s for my domain were 403 last month and 35 this month... Based on this I'm assuming OSE's numbers are not cumulative. Are they based on the previous month or a different time period? Any explanation would be helpful. Thanks!
Moz Pro | theLotter0
-
"Too many on page links" phantom penalty? What about big sites?
So I am consistently over the recommended "100 links" rule on our site's pages because of our extensive navigation and plentiful footer links (somewhere around 300 links per page). I know that there is no official penalty for this, but rather that it dilutes the "link juice" passed by each link on the page. I guess my question is more about how places like Zappos and Amazon get away with this? They have WAY over 100 links per page... in fact I think the Zappos footer alone is 100+ links. This overage doesn't seem to affect their domain rankings and authority, so why does SEOmoz place so much emphasis on this error?
Moz Pro | kida12meyer0
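For context on why the guideline exists: under the original, simplified PageRank model, the equity a page can pass is divided across its outgoing links, so the concern is dilution rather than a penalty. A rough back-of-the-envelope illustration (the numbers are purely hypothetical):
```python
# Hypothetical back-of-the-envelope: in the classic PageRank model, a page's
# passable equity is split roughly evenly across its outgoing links, so the
# per-link share shrinks as the link count grows.
damping = 0.85        # standard PageRank damping factor
page_equity = 1.0     # normalised equity available on the page

for n_links in (100, 300):
    per_link = damping * page_equity / n_links
    print(f"{n_links} links -> ~{per_link:.4f} equity per link")
# 300 links per page leaves each link with about a third of the share it would
# get at the recommended 100: diluted, but not a penalty.
```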
-
Tool for tracking actions taken on problem URLs
I am looking for tool suggestions that help keep track of problem URLs and the actions taken on them, and that help deal with tracking and testing a large number of errors gathered from many sources.
So, what I want is to be able to export lists of URLs and their problems from my current set of tools (SEOmoz campaigns, Google WM, Bing WM, Screaming Frog) and import them into a kind of centralized DB that shows all of the actions that need to be taken on each URL, while at the same time removing duplicates, since each tool finds a significant amount of the same issues.
Example case: SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 & example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (so terminates in a 404). When I import the three reports into the tool, I would like to see that example.com/url1 has two issues pending, a duplicated title and a broken link, without duplicating the entry that both SEOmoz and Google found. I would also like to see historical information on the URL, such as whether I have written redirects to it (to fix a previous problem), or whether it used to be a broken page (i.e. a 4XX or 5XX error) and is now fixed.
Finally, I would like to not be bothered with the same issue twice. As Google is incredibly slow with updating its issues summary, I would like to not import duplicate issues (so the tool should recognize that the URL is already in the DB and that its issue has been resolved).
Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me. Bonus bonus for any tool that is smart enough to check incoming issues and mark them as resolved (for instance, if a URL has a 403 error, it would check on import whether it still resolves as a 403; if it did, it would be added to the issue queue, and if not, it would be marked as fixed).
Does anything like this exist? How do you deal with tracking and fixing thousands of URLs and their problems, and the duplicates created from using multiple tools? Thanks!
Moz Pro | prima-2535090
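The core of the workflow described above (merge per-tool exports, deduplicate issues per URL, and re-check status codes on import) is small enough to sketch. A hypothetical illustration in Python, assuming each tool's report has been exported to CSV with url and issue columns; the file names and column names are placeholders, not any tool's actual export format:
```python
# Hypothetical sketch of the requested workflow, not an existing tool.
import csv
import requests
from collections import defaultdict

# Placeholder export file names; adjust to whatever each tool actually produces.
EXPORTS = ["seomoz_crawl.csv", "google_wmt.csv", "screaming_frog.csv"]

def load_issues(paths):
    """Merge exports into {url: set(issues)}, deduplicating issues reported by several tools."""
    issues = defaultdict(set)
    for path in paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                issues[row["url"].strip()].add(row["issue"].strip().lower())
    return issues

def still_broken(url):
    """Re-check a URL on import; True if it still responds with a 4XX/5XX status."""
    try:
        resp = requests.head(url, allow_redirects=True, timeout=10)
        return resp.status_code >= 400
    except requests.RequestException:
        return True  # unreachable counts as still broken

if __name__ == "__main__":
    for url, found in sorted(load_issues(EXPORTS).items()):
        status = "pending" if still_broken(url) else "resolved"
        print(f"{url}\t{status}\t{', '.join(sorted(found))}")
```
The status re-check here is a simplification: it only covers response-code issues (404s, 403s), so content-level issues such as duplicate titles would still need to be closed out manually or by re-crawling.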