Latest posts made by elisainteractive
RE: Google is indexing blocked content in robots.txt
Thank you, but that is not the problem. The robots.txt file was set up a long time ago.
Google is indexing blocked content in robots.txt
Hi,
Google is indexing some URLs that I don't want to be indexed, and it is also indexing the same URLs over https. These URLs are blocked in the robots.txt file.
I've tried to block these URLs through Google Webmaster Tools, but Google doesn't let me do it because the URLs are https.
The robots.txt file is correct, so what can I do to keep this content from being indexed?
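In case it helps anyone reproduce the check, here is a small Python sketch using the standard library's urllib.robotparser (the domain and paths are placeholders, not the real ones) that verifies whether both the http and https versions of a URL are disallowed:

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt location; substitute the real site.
rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

# Hypothetical blocked paths. Note that robots.txt is fetched per
# protocol and host, so the https site is only covered if it serves
# the same robots.txt as the http one.
for url in (
    "http://www.example.com/private/report.html",
    "https://www.example.com/private/report.html",
):
    print(url, "->", "blocked" if not rp.can_fetch("*", url) else "allowed")
```

One caveat I've read: robots.txt blocks crawling, not indexing, so a disallowed URL can still appear in results if other pages link to it.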
RE: Meta Data definition for multiple pages. Potential duplicate content risk?
Thanks for your feedback! I will bear those points in mind.
I meant page title and description, and unfortunately, defining unique content for these fields is not an option.
Community, do you think Google will penalize the site (duplicate content penalty) if the only difference between snippets is the publication date?
Thanks in advance!
RE: User behaviour reports
Thanks for your answer, although I am still a bit lost about which new options to start with. The analytics world is huge!
Meta Data definition for multiple pages. Potential duplicate content risk?
Hi all,
One of our clients needs to redefine their meta title and description tags. They publish very similar information almost every day, so the structure they propose is the following:
Structure 1:
Type of Analysis + periodicity + date + brand name
Examples 1:
Monthly Market Analysis, 1/5/2012 - Brand Name
Weekly Technical Analysis, 7/5/2012 - Brand Name
Structure 2:
Company Name + investment recommendation + periodicity
Example 2:
Iberdrola + investment recommendation (this text doesn't vary) + 2T12 (which means 2012, 2nd trimester)
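To make the proposed patterns concrete, here is a rough Python sketch of how such titles could be generated (the function and field names are hypothetical, purely for illustration):

```python
from datetime import date

def structure_1(periodicity: str, analysis_type: str, pub_date: date, brand: str) -> str:
    # Follows the example: "Monthly Market Analysis, 1/5/2012 - Brand Name"
    return f"{periodicity} {analysis_type}, {pub_date.day}/{pub_date.month}/{pub_date.year} - {brand}"

def structure_2(company: str, quarter: str) -> str:
    # Follows the example: "Iberdrola investment recommendation 2T12"
    # (the middle text never varies; only company and quarter change)
    return f"{company} investment recommendation {quarter}"

print(structure_1("Monthly", "Market Analysis", date(2012, 5, 1), "Brand Name"))
print(structure_2("Iberdrola", "2T12"))
```

As the sketch shows, the date or quarter is the only part that changes between publications of the same type.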
Regarding the meta description, they want to follow a similar approach, replicating the same info every time with a slight variation for each publication.
I'm afraid this may cause a duplicate content problem because of the resemblance between every "Market Analysis" or every "Investment Recommendation" published in the future.
My initial suggestion was for them to define specific and unique meta data for each page, but this is not possible given the time it would take to do it for every page.
Finally, I asked them to specify the date in each meta title of the content published, in order to add something different each time and avoid a duplicate content penalty.
Will this be enough to avoid duplicate content issues?
Thanks in advance for your help folks!
Alex
User behaviour reports
Hi there!
We are all aware of the importance of user behaviour in search results, but I am a bit short of ideas on what a good report/analysis of this specific topic would look like.
So far, I have managed to do the following:
- Usability tests and analysis using tools like Crazy Egg, in order to check the traffic flow and how the calls to action for the best products perform for people who come from SEO traffic.
- GA analysis focused on bounce rate, page views and traffic flow (see the small sketch below).
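As a loose illustration of the kind of number I mean, a tiny Python sketch computing bounce rate from per-session pageview counts (the data is made up; real sessions would come from a GA export):

```python
# Hypothetical per-session pageview counts, e.g. exported from GA.
session_pageviews = [1, 3, 1, 7, 2, 1, 4, 1, 1, 5]

# A bounce is a session that viewed exactly one page.
bounces = sum(1 for views in session_pageviews if views == 1)
bounce_rate = bounces / len(session_pageviews)

print(f"Sessions: {len(session_pageviews)}")   # 10
print(f"Bounces: {bounces}")                   # 5
print(f"Bounce rate: {bounce_rate:.0%}")       # 50%
```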
Is there anything else you would add to these two items to get the most out of this topic?
Thanks a lot in advance!
HTTPS redirect
Hi there,
A client of mine is asking me whether Google would penalize redirecting all the http URLs to https (they want to switch to the secure protocol).
I assume it is going to work as a classic 301, right? So they might lose some authority along the way, but I am not 100% sure. Can anyone confirm this? Does anyone have a similar experience?
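For what it's worth, this is the kind of response I would expect once the migration is live; a small check using Python's standard http.client, which does not follow redirects on its own (the host and path are placeholders):

```python
import http.client

# Placeholder host and path; substitute the client's real site.
conn = http.client.HTTPConnection("www.example.com")
conn.request("HEAD", "/some-page")
resp = conn.getresponse()

# A clean protocol migration should answer with a permanent redirect
# pointing at the https version of the same URL.
print(resp.status)                 # expected: 301
print(resp.getheader("Location"))  # expected: https://www.example.com/some-page
conn.close()
```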
Thanks a lot!