Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.

Best posts made by donford
-
RE: Is It Possible To Use Multiple Promotional Codes for Google AdWords?
I think it depends on the source. We typically get a code from Google every six months or so, and every now and then I come across codes from various other services. I've had no problems using them; however, if I try to stack codes from the same source / type on the same account, they don't work.
-
RE: What is the best way to find related forums in your industry?
Hi Edward,
I'm not sure about the "best way", but I can share three ways I know of.
1. Competitor Research
Check your competitors to see if they actively participate in, or simply get links from, domains which are forums. Usually doing a search on their backlink profiles for "blog" or "forum" will identify these quickly.
2. Search Engine Search
I sometimes find relevant forums simply by going to Google and typing a broad keyword + "forums". At that point I have to evaluate each forum to see if it's worth trying to engage with.
3. Industry Engagement
If you are actively engaged daily in the industry talk, you will pick up on what people are posting. For example, following industry leaders' Twitter, Facebook, Reddit, and Instagram profiles will lead you directly to posts and the places they are referencing.
Once you have at least one location worth engaging with, you can also watch those places daily, as many times people will re-post information from a similar site or forum. Take this site, for example: we often see info from SearchEngineLand.
Hope this helps,
Don
-
RE: Is there any benefit in using a subdomain redirected to a single page?
There would not be a direct SEO benefit to doing this. There may, however, be a benefit for tracking: if you only used that sub-domain for a given ad campaign, then you would know all traffic referred from that sub-domain was coming from that campaign.
There may be a slight downside to doing it this way. Sub-domains are treated as their own domains to a degree, so you are, in effect, giving the ad campaign's link juice to a new domain entirely and then forwarding it to a specific page, as opposed to directly giving the link juice the ad campaign can generate to the actual page.
A couple of things here. First, depending on the type of ad campaign, there may not be any link juice to worry about; Google's AdWords ads, for instance, don't pass link juice. However, if you purchased direct advertisements on certain sites, you may get some link juice from those ads running.
The second thing is actually a question: what is the purpose of creating a sub-domain that points to a sub-directory? Is it just for tracking? Or were you wondering if you could benefit from the sub-domain being treated as a new domain linking to you? If it's for tracking, I would think there are other tracking methods that could handle referring traffic. If it were in hopes of gaining a new backlink from a different domain, then I would say it isn't helpful done this way: first because it is simply forwarding to the sub-directory, and second because even if it weren't forwarding, the link would be considered from the same server and not very helpful anyway.
So in short, no benefit other than as a potential way to help with tracking.
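To make the setup concrete, here is a minimal sketch of such a redirect as an nginx server block. The host names, landing-page URL, and UTM values are hypothetical placeholders, and the tagged query string is one example of the "other tracking methods" mentioned above that let analytics attribute the traffic without relying on a sub-domain referrer:

```nginx
# Hypothetical names: promo.example.com is the campaign sub-domain,
# /landing-page/ is the actual page it should resolve to.
server {
    listen 80;
    server_name promo.example.com;

    # 301 (permanent) redirect of all sub-domain traffic to the target page.
    # Tagging the URL lets analytics attribute the visit to the campaign
    # even though the sub-domain referrer is lost in the redirect.
    return 301 https://www.example.com/landing-page/?utm_source=promo&utm_medium=ad_campaign;
}
```

A permanent (301) redirect is the variant that forwards whatever link equity the sub-domain receives, which is the scenario discussed above.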
Hope that makes sense and helps,
Don
-
RE: Is there any benefit in using a subdomain redirected to a single page?
Hi David,
Rand covered this very topic in a Whiteboard Friday. Perhaps you will find it helpful; it provides insight into what can happen and why he thinks the way he does.
Hope it helps,
Don
-
RE: Crawled page count in Search console
Hello Bob,
Here is some food for thought. If you disallow a page in robots.txt, Google, for example, will not crawl that page. That does not, however, mean they will remove it from the index if it had previously been crawled; it is simply treated as inaccessible. It will take some time, months even, before Google finally says "we have no fresh crawls of page X, it's time to remove it from the index."
On the other hand, if you specifically allow Google to crawl those pages and show a noindex tag on them, Google now has a new directive it can act upon immediately.
So my evaluation of the situation would be to do one of two things:
1. Remove the disallow from robots.txt and allow Google to crawl the pages again, but this time use noindex, nofollow tags.
2. Remove the disallow from robots.txt and allow Google to crawl the pages again, but use canonical tags pointing to the main "filter" page to prevent further indexing of the specific filter pages.
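As a concrete sketch of what those two options look like in markup (the paths and URLs below are hypothetical placeholders):

```text
# robots.txt -- remove the old Disallow so Google can crawl the filter pages again
# (previously something like: Disallow: /filter/)
User-agent: *
Disallow:

<!-- Option 1: on each filter page, a robots meta tag in the <head> -->
<meta name="robots" content="noindex, nofollow">

<!-- Option 2: instead, a canonical tag pointing each filter page
     at the main "filter" page -->
<link rel="canonical" href="https://www.example.com/category/filter/">
```

Either way, Google has to actually fetch the pages to see the tag, which is why the disallow has to come out first.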
Which option is best depends on the number of URLs being indexed: for a few thousand, canonical tags would be my choice; for a few hundred thousand, noindex would make more sense.
Whichever option you choose, you will have to ensure Google re-crawls the pages, and then allow them time to re-index appropriately. Not a quick fix, but a fix nonetheless.
My thoughts and I hope it makes sense,
Don