Archiving Campaigns in SEOmoz
-
First off, I love the campaign archive feature. Very useful for my purposes.
My question is: Is there a limit to how many campaigns I can archive? Thanks in advance!
-
No problem.
-
That's the same boat I'm in. Glad to know I won't hit a wall. Thanks!
-
Hi Colin,
No, you can archive as many as you like. What I sometimes do, if I want to dig a bit deeper into one of my accounts, is make it active for a few weeks, then archive it and make a different one active. I don't need to track all of my campaigns at the same time, but I like to keep hold of the historical data.
Matt.
Related Questions
-
What keywords should I use in my campaign?
I am a chiropractor and only compete against websites in my town. Should my keywords be "neck pain" and "back pain", or should they be "neck pain Bloomington Illinois" and "low back pain Bloomington Illinois"? I am referring to the campaigns I just started. (I am new.)
Moz Pro | Bob550
-
What is the best way to use the on-page reports in SEOmoz?
What is the best way to use the on-page reports in SEOmoz? Any help and techniques people use would be greatly appreciated. Thanks!
Moz Pro | Bristolweb0
-
Is there a keyword suggestion tool available in the SEOMOZ suite of tools?
Is there a keyword suggestion tool in the SEOmoz suite of tools that is similar to semrush.com? SEMrush allows you to put in a URL and then tells you what keywords you rank for. I'm looking for a good, similar tool.
Moz Pro | webestate0
-
Why does SEOMoz crawler ignore robots.txt?
The SEOmoz crawler ignores robots.txt. It also "indexes" pages marked as noindex, which means it fills up the reports with things that don't matter. Is there any way to stop it from doing that?
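For anyone hitting this: Moz's crawler identifies itself as rogerbot, and rules addressed to it by name should be the first thing to check. A minimal robots.txt sketch blocking it from the whole site (narrow the `Disallow` path to specific sections as needed; see Moz's help pages for the current user-agent details):

```
# Block Moz's crawler (rogerbot); other bots are unaffected.
User-agent: rogerbot
Disallow: /
```

Note that robots.txt controls crawling, not indexing, so a crawler can still see noindex pages it reaches before the rule is in place.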
Moz Pro | loopyal0
-
Can I do a campaign for just a page?
We've been doing a lot of building and work on just one category page, but when I try to add it to the campaign it won't accept any URL that has a subfolder, like www.mainsite.com/keyword-page. I can only enter www.mainsite.com, and when I select the other campaign options, like root domain or subfolder, Roger pops up with an error. Is anyone else having this problem?
Moz Pro | anchorwave0
-
Discrepancies in PA and LRDs reported in different SEOmoz tools
I've noticed a difference in the reported PA and LRD numbers for URLs depending on whether you use Open Site Explorer or look at the same metrics within the rankings history (in your campaign setup). I've checked this for a few URLs, and the reported PA and LRD scores differ nine times out of ten. The PA is sometimes higher in one report and lower in the other, or vice versa; the same goes for LRDs. I thought one report might be lagging behind and using old data, but that would only make sense if I were consistently seeing an increase in reported LRDs, and it just as often shows a decrease! Is this just a bug in the campaign > rankings history report, or is there a reason for the discrepancies?
Moz Pro | Websensejim0
-
SEOmoz Bot indexing JSON as content
Hello, we have a bunch of pages that contain local JSON we use to display a slideshow. This JSON has a bunch of `<a>` links in it. For some reason, these links inside the JSON are being indexed and recognized by the SEOmoz bot, showing up as legitimate links for the page. One example page where this is happening: http://www.trendhunter.com/trends/a2591-simplifies-product-logos . Searching the page source for the string '&lt;a' yields 1100+ results (all of which are recognized as links for that page in SEOmoz); however, ~980 of these are in JSON code and are not actual links on the page. This leads to a lot of invalid links for our site and a super-inflated on-page link count for the page. Is this a bug in the SEOmoz bot? And if not, does Google work the same way?
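To illustrate what may be going on (a hypothetical sketch, not Moz's actual parser): a crawler that extracts `<a href` patterns from the raw page source, rather than from a rendered DOM, will also match anchor markup embedded as strings inside inline JSON:

```python
import re

# Simplified page source: one real anchor, plus an anchor embedded as a
# string inside an inline JSON blob (like the slideshow data described above).
page_source = (
    '<a href="/real-page">Real link</a>\n'
    '<script>var slides = {"caption": "<a href=\'/slide-1\'>Slide 1</a>"};</script>'
)

# A naive text-based extractor matches both, inflating the on-page link count.
links = re.findall(r"<a href=['\"]([^'\"]+)", page_source)
print(links)  # ['/real-page', '/slide-1']
```

Whether rogerbot scans raw source this way is an assumption; any parser that does would show the inflated counts described in the question.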
Moz Pro | trendhunter-1598370
-
Seomoz Spider/Bot Details
Hi all, our website identifies a list of search engine spiders so that it does not show them session IDs when they come to crawl, preventing the search engines from thinking there is duplicate content all over the place. SEOmoz has brought up over 20k crawl errors on the dashboard due to session IDs. Could someone please give the details for the SEOmoz bot so that we can add it to the list on the website? Then, when it comes to crawl, it won't be shown session IDs and generate all these crawl errors. Thanks
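The crawler's user-agent string contains "rogerbot" (check Moz's help pages for the full current string). A minimal sketch of the kind of check described above, with a hypothetical helper name:

```python
# Hypothetical helper: decide whether to omit session IDs from URLs
# for a given request, based on a substring match against known crawlers.
KNOWN_CRAWLERS = ("rogerbot", "googlebot", "bingbot")

def should_strip_session_id(user_agent: str) -> bool:
    """Return True when the requesting client is a known crawler."""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_CRAWLERS)

print(should_strip_session_id("rogerbot/1.2"))                  # True
print(should_strip_session_id("Mozilla/5.0 (Windows NT 10.0)")) # False
```

Matching on a substring rather than the full string keeps the check robust to version-number changes in the user-agent.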
Moz Pro | blagger1