Can Using Google Analytics Make You More Prone to Deindexation?
-
Hi,
I'm aggressively link building for my clients using blog posts and have come upon information that using Google Analytics (as well as GWT, etc.) may increase my chance of deindexation. Anyone have any thoughts on this topic? I'm considering using Piwik as an alternative if this is the case.
Thanks for your thoughts,
Donna
-
Agree with Robert. Also, if you're concerned about Google getting access to your GA data you can always disable sharing it with them from your GA account.
-
Donna,
My guess is that the reasoning behind this claim is that by having GA or GWMT installed, you are one step closer to Google because of the data they hold. As I think it through, I can see the plausibility, but I still don't think it's true. If by "aggressive" you mean you're in the grey-to-black-hat area, I don't believe having or not having GA will matter. And if you go over the top while using Piwik, I don't see how your activity would be any better hidden from Google.
Just IMO,
Best,
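For anyone weighing the Piwik switch Donna mentions, the standard self-hosted Piwik tracking embed looks roughly like the sketch below. The tracker URL and site ID are placeholders you would replace with your own installation's values:

```html
<!-- Piwik tracking code (tracker URL and site ID are placeholders) -->
<script type="text/javascript">
  var _paq = _paq || [];
  _paq.push(['trackPageView']);
  _paq.push(['enableLinkTracking']);
  (function() {
    var u = "//your-piwik-domain.example/";
    _paq.push(['setTrackerUrl', u + 'piwik.php']);
    _paq.push(['setSiteId', 1]);
    var d = document, g = d.createElement('script'),
        s = d.getElementsByTagName('script')[0];
    g.type = 'text/javascript'; g.async = true; g.defer = true;
    g.src = u + 'piwik.js';
    s.parentNode.insertBefore(g, s);
  })();
</script>
<!-- End Piwik -->
```

Because the data lives on your own server rather than Google's, this is the usual argument for Piwik among people worried about sharing analytics data, whatever one thinks of the deindexation theory itself.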
Related Questions
-
Using PURL.org/GoodRelations for Schema Markup
Hello awesome Moz community! Our agency uses JSON-LD for our local business schema markup. We validate our markup using Google's Structured Data Testing Tool. All good! Recently, I discovered a competing agency using JSON-LD markup similar to ours (that's OK) alongside "http://purl.org/goodrelations" markup. The latter appears to be, potentially, black hat SEO. Why? According to Moz, "there is no conclusive evidence that this markup improves rankings." BUT the purl.org markup has provided an opportunity for keyword stuffing: using it, the agency has stuffed 66 repetitions of the same keywords into the validated markup. I would love to get feedback from the Moz community. Can schema markup, of any kind, be used to keyword stuff? If so, why aren't sites getting penalized for this? Is this practice flying under the elusive algorithm radars? Thanks! Your feedback, insight, and snarky remarks are welcome 🙂 Cheers!
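For context, a minimal JSON-LD block of the LocalBusiness kind this question describes might look like the sketch below; every name and value is a placeholder, not taken from either agency's markup. "Keyword stuffing" in this format would mean repeating target keywords across fields such as `name` or `description`:

```json
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Agency",
  "url": "http://www.example.com",
  "telephone": "+1-555-555-5555",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Anytown",
    "addressRegion": "CA",
    "postalCode": "90210"
  }
}
```

This would normally be embedded in a `<script type="application/ld+json">` tag and checked in the Structured Data Testing Tool, as the question describes.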
White Hat / Black Hat SEO | SproutDigital
-
Spam signals from old company site are hurting new company site, but we can't undo the redirect.
My client was forced to change its domain name last year (long story). We were largely able to regain our organic rankings via 301-redirects. Recently, the rankings for the new domain have begun to plummet. Nothing specific took place that could have caused any ranking declines on the new site. However, when we analyze links to the OLD site, we are seeing a lot of link spam being built to that old domain over recent weeks and months. We have no idea where these are coming from but they appear to be negatively impacting our new site. We cannot dismantle the redirects as the old site has hundreds, if not thousands, of quality links pointing to it, and many customers are accustomed to going to that home page. So those redirects need to stay in place. We have already disavowed all the spam we have found on the old Search Console. We are continuing to do so as we find new spam links. But what are we supposed to do about this spam negatively impacting our new site? FYI we have not received any messages in the search console.
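For reference, the disavow file uploaded through the old domain's Search Console follows Google's plain-text format; a hypothetical example (these hosts are placeholders, not the actual spam sources):

```text
# Disavow entries for spam pointing at the old domain (hypothetical).
# Lines starting with '#' are comments.
# Prefer domain-level entries when a whole host is spamming:
domain:spammy-directory.example.com
domain:link-network.example.net
# Individual URLs can also be listed:
http://random-blog.example.org/spam-post/
```

Because the 301s stay in place, disavowing at the old property is currently the main lever; some practitioners also upload the same file to the new property as a precaution.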
White Hat / Black Hat SEO | FPD_NYC
-
Is the Google Penguin penalty automated or manual?
Hi, I have seen that some of our competitors are missing from the top SERPs and seem to be penalised, according to this penalty checker: http://pixelgroove.com/serp/sandbox_checker/. Is this the right tool for checking penalties, or are there better tools available? Are these penalties the result of the recent Penguin update? If so, is this an automated or a manual penalty from Google? I don't think all of these sites used black-hat techniques and got penalised; the new Penguin update might have flagged their backlinks, triggering the penalty. We have dropped too over the last two weeks. What's the solution? How effective is a link audit? Thanks, Satish
White Hat / Black Hat SEO | vtmoz
-
I'm Getting Attacked, What Can I Do?
I recently noticed a jump in my Crawl Errors in Google Webmaster Tools. Upon further investigation I found hundreds of the most spammy web pages I've ever seen pointing to my domain (although all going to 404 errors): http://blurchelsanog1980.blog.com/ http://lenitsky.wordpress.com/ These are all created within the last week. A. What the hell is going on? B. Should I be very concerned? (because they are 404 errors) C. What should my next steps be? Any help would be greatly appreciated.
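If the list of spam pages keeps growing, one low-effort way to keep a disavow file current is to collapse the referring URLs into domain-level entries. A minimal Python sketch (`build_disavow` is a hypothetical helper, not part of any Google tooling):

```python
from urllib.parse import urlparse

def build_disavow(spam_urls):
    """Collapse a list of spammy referring URLs into domain-level
    disavow entries, one 'domain:' line per unique host."""
    hosts = {urlparse(u).netloc for u in spam_urls}
    hosts.discard("")  # drop anything that did not parse to a host
    return "\n".join("domain:" + h for h in sorted(hosts))

print(build_disavow([
    "http://blurchelsanog1980.blog.com/",
    "http://lenitsky.wordpress.com/",
    "http://lenitsky.wordpress.com/some-post/",
]))
# domain:blurchelsanog1980.blog.com
# domain:lenitsky.wordpress.com
```

Since the spam pages currently resolve to 404s on your side, they may be doing no harm yet, but a domain-level disavow is a cheap insurance policy.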
White Hat / Black Hat SEO | CleanEdisonInc
-
Do some sites get preference over others by Google just because? Grandfathered theory
So I have a theory that Google "grandfathers" in a handful of old websites from every niche and that no matter what the site does, it will always get the authority to rank high for the relevant keywords in the niche. I have a website in the crafts/cards/printables niche. One of my competitors is http://printable-cards.gotfreecards.com/ This site ranks for everything... http://www.semrush.com/info/gotfreecards.com+(by+organic) Yet, when I go to visit their site, I notice duplicate content all over the place (extremely thin content, if anything at all for some pages that rank for highly searched keywords), I see paginated pages that should be getting noindexed, bad URL structure and I see an overall unfriendly user experience. Also, the backlink profile isn't very impressive, as most of the good links are coming from their other site, www.got-free-ecards.com. Can someone tell me why this site is ranking for what it is other than the fact that it's around 5 years old and potentially has some type of preference from Google?
White Hat / Black Hat SEO | WebServiceConsulting.com
-
Is there a problem with google?
I have one or two competitors (in the UK) in my field who buy expired 1-8 year old domains on random subjects (SEO, travel, health, you name it). They are in the printing business, and they stick 1-2 articles (unrelated to what was on the domain before) on these and that's it. I think they stick with PA and DA above 30, and most have 10-100 links, so these are well-used expired domains, hosted in the USA. Most have different IPs, although they now have so many (over 70% of their backlink profile) that some share an IP. On further investigation, none of the blogs have any contact details, but it does look like they have been a little smart here and added content to the about-us page (along the lines of "I used to run xxx but now do xxx"). They also have one or two tabs with article-length content on the same subject the site used to cover, with matching titles. So basically they are finding expired 1-10 year old domains that have only been expired for six months at most (from what I can see), putting 1-2 print-related articles on the home page (maybe adding a third on the subject the blog used to cover), adding 1-3 articles via tabs at the top on subjects the sites used to cover, registering the details via xbybssgcf@whoisprivacyprotect.com, and that's it. They have been ranking via this method for the last couple of years (through all the Google updates). Does Google not have any way to combat link networks other than the obvious stuff such as public link networks? It just seems that if you know what you are doing you get away with it, and if you're big enough you get away with it, but the middle-of-the-road (mom-and-pop) sites get f***ed over by spam pointing to their site that no spammer would dream of doing anyway.
White Hat / Black Hat SEO | BobAnderson
-
Best use of domains with keywords
I own a domain with just the company name in it (no keywords) that I use as my main domain. I also own some other domains with keywords in them, which right now I 301-redirect to the main domain. What is the best use for these domains? Should I use them when I do link building, or is it better to use just the main domain? Can they be useful for increasing the main domain's link juice / PageRank? If so, how? Thanks
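For what it's worth, the 301s described here are typically done at the server level. A hypothetical Apache `.htaccess` sketch for one of the keyword domains (all domain names are placeholders):

```apache
# Hypothetical .htaccess on the keyword domain: permanently redirect
# every request to the equivalent path on the main brand domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?keyword-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.main-brand-domain.com/$1 [R=301,L]
```

A path-preserving redirect like this passes visitors (and most link equity) through to the matching page on the main site, rather than dumping everything on the home page.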
White Hat / Black Hat SEO | darkanweb