So if the content is identical, should I edit the meta tags to be identical?
I have example.com/page and m.example.com/page with identical content.
The mobile version does not include a title meta tag at all.
Should mobile pages get their own unique meta title?
The meta title I see in mobile search results is pulled from my desktop page, and the mobile page does not include a title meta tag.
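For reference, a common pattern for sites with separate mobile URLs is to give the m. page its own <title> tag and connect the two versions with rel="alternate"/rel="canonical" annotations. A minimal sketch, using the URLs from the question:

```html
<!-- On the desktop page, https://example.com/page -->
<title>Page Title</title>
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page, https://m.example.com/page -->
<title>Page Title</title>
<link rel="canonical" href="https://example.com/page">
```

Since the content is identical, an identical title on both versions is fine; the canonical on the mobile page tells Google to consolidate ranking signals on the desktop URL.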
Thanks Jeff and Andy,
We are already building compelling content focused on those specific states. I was also wondering if there is anything on the technical side to leverage position, but local SEO and schema don't really apply to us, since they are usually based on a physical address, which we don't have.
Building landing pages targeting each state would be an option, but we do want users to arrive on the homepage, which is not really geo-targeted.
I'll try a few things over the next few weeks and I would be happy to share some results.
Thanks a lot,
A
Hey Guys,
Does anyone have experience with, or can point me to the right documentation about, geo-targeting possibilities for specific states in the US or specific areas of the world?
Local SEO does not apply in my case, since my website is not a business and does not have a physical address.
My website offers information that is only relevant for specific states in the US. How can I leverage my optimisation to gain more exposure in those specific states?
I really appreciate any help.
A
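For what it's worth, schema.org markup does not strictly require a postal address: the areaServed property on an Organization (or Service) can name the states served directly. A hypothetical JSON-LD sketch, with placeholder names and states:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Info Site",
  "url": "https://example.com/",
  "areaServed": [
    { "@type": "State", "name": "Texas" },
    { "@type": "State", "name": "Florida" }
  ]
}
</script>
```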
Sure, it still applies; in fact, internal link architecture is one of the most important on-page optimization factors for a site.
The idea is that through internal links you are telling Google which pages are the most important on your site. You can even check your most internally linked pages in Webmaster Tools.
It is recommended to create links to your category pages in a natural way that improves UX.
I hope it helps!
Hey Guys,
I was wondering how Panda behaves with news publisher sites. Take a site with ~1M visits a day that publishes ~300 news articles a day, where the life of each article is one week tops, given the nature of news articles: they are only relevant now. After one week the articles have virtually no page views. This results in a site with thousands of quality content pages that have had no page views for years.
Is it possible that the site gets penalized by Panda for having thousands of pages with no visits?
Hey Guys
I'm working on a site that publishes hundreds of new content pieces a day, and part of the content is only available to all users for 30 days. After 30 days the content is only accessible to premium users.
After 30 days, the page removes the content and replaces it with a log-in/sign-up option. The same URL and article title are kept for each page.
I have 2 concerns about this method:
1. Should I add a noindex attribute to those pages after 30 days? Even though it can take months until Google actually removes them from the index.
2. Is there a proper way to implement this type of feature on sites with a log-in option after a period of time? (First click free is not an option.)
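If you do go the noindex route from concern 1, it is just a meta tag added to each expired page; a minimal sketch:

```html
<!-- Added to the <head> of a page once its 30-day free window expires -->
<meta name="robots" content="noindex, follow">
```

The "follow" keeps the links on the page crawlable even while the page itself is being dropped from the index.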
Thanks guys, and I appreciate any help!
Hey Guys,
This might be an oft-discussed topic, but I haven't been able to find a proper answer or solution for this issue. So far, the closest solution I came across is this article: http://moz.com/blog/decoding-googles-referral-string-or-how-i-survived-secure-search
I would like to know if there is any other solution for this matter.
I have been searching the web for quite a while now and I haven't found a reasonable solution for segmenting the traffic coming from the Google News onebox.
Onebox news traffic is counted as organic/google in GA. Is there a way to measure the traffic coming from the news onebox?
Please note that news.google.com is not the same thing.
I really appreciate any help.
Thanks Anthony,
Your explanation was very helpful.
Assuming that 3 million pages out of my 5 million are not important enough for Google to be crawling or indexing:
What would be the best way to optimize my crawl efficiency in relation to the amount of pages?
Just adding noindex to 3 million pages on the site, I believe, can be a risky move.
Perhaps robots.txt, but that would not de-index the existing pages.
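One caveat worth noting: the two mechanisms conflict if combined on the same URLs, because a URL blocked in robots.txt is never recrawled, so Google never sees its noindex tag. A sketch of each (the /archive/ path is hypothetical):

```html
<!-- Option A: on each low-value page, let Google crawl it once more
     so it can see the tag and drop the page from the index -->
<meta name="robots" content="noindex, follow">

<!-- Option B: robots.txt, which stops crawling (saving crawl budget)
     but does NOT de-index already-indexed URLs:

     User-agent: *
     Disallow: /archive/
-->
```

A common sequence is to noindex first, wait for the pages to fall out of the index, and only then block the section in robots.txt.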
Yes, the canonical should point to the original page. If you have 2 similar pages, the canonical on the second page should point to the first page.
I agree with Martin that rel="next" and rel="prev" are a good solution for pagination as well.
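As a quick sketch of both tags mentioned above, with hypothetical URLs:

```html
<!-- On the duplicate page: canonical points at the original -->
<link rel="canonical" href="https://example.com/original-page">

<!-- On page 2 of a paginated series -->
<link rel="prev" href="https://example.com/articles?page=1">
<link rel="next" href="https://example.com/articles?page=3">
```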
Good luck