Is Tag Manager a good option to insert text into websites?
When a website doesn't have an administration panel, adding text is a big problem.
Hi there,
We operate about a dozen websites that belong to the same parent company. All of these websites are in the same vertical and link to each other in the footer. The anchor for each link is the brand name, so we are merely doing it to show what other products the parent company has to offer. My question is: do you think this is still advisable, or might it trigger a penalty? Also, what would you choose between:
Thanks.
It seems that the sudden drop in indexed pages reported in WMT might be related to a reporting issue on Google's side - https://productforums.google.com/forum/#!topic/webmasters/qkvudy6VqnM;context-place=topicsearchin/webmasters/sitemap|sort:date
Since "teracent-feed-processing" didn't followed the rules in robots.txt, we had to hard-block it. If server detects the user agent beeing "teracent-feed-processing" it will drop the connection: _ (104) Connection reset by peer_
It seems that "teracent-feed-processing" user agent is somehow linked to Google. If you analyse the Ip's , you'll noticed that are Google owned. Teracent company has been bought by Google in 2009.
Btw - we've already blocked it, but I'm trying to figure out what key role this user agent plays. We've also noticed a drastic decline in the number of pages reported in Google Webmaster Tools (half of what we used to have). Should I assume the drop in reported indexed pages is a result of blocking the teracent-feed-processing user agent?
Does anyone know anything about the "teracent-feed-processing" user agent?
IPs the user agent's requests come from: 74.125.113.145, 74.125.113.148, 74.125.187.84 ....
In our logs, 2 out of 3 requests are made by it, causing the server to crash.
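For anyone who wants to reproduce that figure, this is roughly how we tallied user agents - a sketch that assumes the standard combined log format and a log path (access.log) that will differ on your setup:

```python
from collections import Counter

counts = Counter()
total = 0
with open("access.log") as f:  # path is illustrative
    for line in f:
        total += 1
        # in the combined log format, the user agent is the last quoted field
        ua = line.rsplit('"', 2)[-2] if line.count('"') >= 2 else "-"
        counts[ua] += 1

if total:
    for ua, n in counts.most_common(10):
        print(f"{n:8d}  {n / total:6.1%}  {ua}")
```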
I'll keep it short:
1. Doing bulk redirects is bad, because the old URLs won't map to relevant pages on the new site.
2. Redirect your old homepage to the page on your new site that is most relevant to it. That can be the new homepage, or perhaps a product page.
3. Redirect them to their most relevant counterparts, one by one (see the sketch below).
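By way of illustration only (the paths are made up), the idea is a one-to-one map from each old URL to its closest counterpart, with the homepage strictly as a fallback:

```python
# Hypothetical one-to-one redirect map: old path -> most relevant new path
REDIRECTS = {
    "/old-home": "/",
    "/old-products/widget": "/products/widget",
    "/old-blog/launch-post": "/blog/launch-post",
}

def redirect_target(old_path):
    # fall back to the homepage only when no relevant page exists
    return REDIRECTS.get(old_path, "/")
```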
Hi guys,
We are investigating a First Click Free implementation and we're not sure whether a certain scenario follows Google's guidelines or could be considered cloaking. The question is: should we allow the user to see one page after every Google search, or should it be one page per session?
We have these steps:
1. Let's suppose John searches for an address, clicks on our link, and lands on the page. From that point on, if he goes any further he will be asked to register.
2. He closes the site, but the session is still active. He searches for another address, clicks on our link, and again lands on our website.
At this point, should he be able to see the page, or can we prompt him with the registration request?
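For reference, here is a rough sketch of the two variants we're weighing - the session object and names are illustrative, not our real code:

```python
GOOGLE_REFERRER = "google."

def allow_free_click(session, referrer, per_search=True):
    came_from_google = GOOGLE_REFERRER in referrer
    if per_search:
        # Option A: every Google search earns one free page view
        return came_from_google
    # Option B: one free page per session, regardless of later searches
    if came_from_google and not session.get("fcf_used"):
        session["fcf_used"] = True
        return True
    return False
```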
Thoughts?
Thanks
Does international link building make sense for US websites (.com domains)? The links we could acquire are from websites in the same niche, but I read on a forum that Google will disregard (or devalue?) international links.
There is work involved in getting such links, so it's important to find out whether they are worth the trouble.
We're trying to make sense of Google's new parameter handling options, and I can't seem to find a good answer to a question about the "No URLs" option.
For example, we have two URLs pointing to the same content: the main URL and the same URL with zoom, x and y parameters appended. Ideally, I would want Google to index only the main URL, without any parameters: http://www.propertyshark.com/mason/ny/New-York-City/Maps/Manhattan-Apartment-Sales-Map
To do this, I would set the value "No URLs" for the zoom, x and y parameters. If we do this, do we still get any SEO value from backlinks that point to the URLs with parameters, or will Google just ignore them?
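To make the question concrete, this is roughly the canonicalization I'd expect the "No URLs" setting to imply - a sketch, with the parameter names taken from our setup:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

IGNORED = {"zoom", "x", "y"}  # the parameters we'd set to "No URLs"

def canonical(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```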
We didn't get more details on that - I assume it may raise some flags if you suddenly redirect 20 domains. I'm not sure whether that's the case, or why, which is why I decided to ask a follow-up question.
Btw, thank you both for your replies!
This is a follow-up to a question posted earlier this month. I can't link to it because it's a private question, so I'll summarize it below.
We have about 20 domains (e.g. www.propertysharp.com) that point to the IP address of our main domain (www.propertyshark.com) and serve the same content. This is not a black-hat strategy; the domains were acquired several years ago to help people who mistype the website's URL reach their intended destination.
The question was whether or not to redirect them to our main domain. The pro was the reportedly millions of incoming links to these domains; the con was that lots of duplicate-content issues could arise - we actually saw some pages from these domains ranking in the search engines.
We were advised to redirect them, but to do it gradually. I have a simple question: what does gradually mean - one domain per week, or per month?
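In case it helps anyone in the same situation, here is a small sketch of how one could verify, domain by domain, that each mirror 301s to the main site as it is switched over (the domain list is illustrative):

```python
import urllib.error
import urllib.request

MIRRORS = ["www.propertysharp.com"]  # ...and the other typo domains
MAIN = "www.propertyshark.com"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow redirects; we want to inspect them

opener = urllib.request.build_opener(NoRedirect)

for domain in MIRRORS:
    try:
        resp = opener.open(f"http://{domain}/", timeout=10)
        print(domain, resp.status, "not redirected yet")
    except urllib.error.HTTPError as e:
        location = e.headers.get("Location", "")
        ok = e.code == 301 and MAIN in location
        print(domain, e.code, location, "OK" if ok else "CHECK")
```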