Best META Fields to Include on New Site
-
I am in the process of transitioning sites to a Drupal CMS and am curious to know what META information to provide on each of the new site pages. Currently, this is the set-up I plan on using:
My questions to the community are:
- whether or not I've added all pertinent information, and
- if there's anything I'm overlooking
-
Catalyste,
Yes, this is something I will need to implement. With that, I also wanted to ask: if we forgo implementing rich snippets and authorship fields in the beta, is that going to create more problems for development down the line?
More specifically, if we hold off on implementing authorship/rich snippets until we launch the actual production site for visitors/customers, will that create more problems than just implementing it now?
Likewise, are there any additional suggestions you'd make for a news organization's site?
-
I would say keep this one, since Google may show it in search results, which gives you a little control over what users see when the page comes up in a search.
Limit it to 160 characters. Beyond that, not much else matters; meta tags were badly abused in the past.
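To make the 160-character limit concrete, a description tag might look like this (the content below is purely a placeholder, not copy for your site):

```html
<!-- Keep the content attribute under ~160 characters so search results don't truncate it -->
<meta name="description" content="Daily news, analysis, and investigative reporting from our newsroom, updated every morning.">
```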
You might also want to start thinking about implementing rich snippets if you have time:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=99170
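As a rough sketch of what rich snippet markup can look like, here is a news article marked up with schema.org microdata; the property names come from schema.org's Article type, but treat the specific structure as illustrative rather than a template for your Drupal theme:

```html
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Example headline</h1>
  <p>By <span itemprop="author">Jane Reporter</span>,
     <time itemprop="datePublished" datetime="2012-06-01">June 1, 2012</time></p>
  <div itemprop="articleBody">Story text goes here…</div>
</article>
```

Google's rich snippets testing tool (linked from that support page) will show you how it parses markup like this.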
-
Agree with Shane on meta robots and revisit-after.
SEOMoz recommends that the keywords tag not be used, as it serves no SEO value; all it does is tell competitors the keywords you are trying to rank for.
If you are active on Facebook, you might also want to consider using
as it gives analytics for referral traffic
-
Index/follow robots tags are useless, and I think they were always an old wives' tale (or at least within the past 4-5 years, give or take). Index, follow is the default unless you specify noindex, nofollow, or restrict crawling by other means such as robots.txt.
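In other words, the tag only earns its keep when you want to restrict something; omitting it entirely already implies index, follow:

```html
<!-- Only needed when restricting; no robots meta tag at all means index, follow -->
<meta name="robots" content="noindex, nofollow">
```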
Revisit-after is disregarded by Google, but I believe there are still bots out there that honor it.
Pragma and site verification are for you to decide.
Pragma just tells the individual browser not to cache the page, and site verification is just for Google Webmaster Tools, which can be done many ways (.txt file, DNS zone record, etc.).
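For reference, the two tags under discussion look like this (the verification token is a placeholder; Webmaster Tools issues your own):

```html
<!-- Ask the browser not to cache the page; HTTP response headers are generally more reliable -->
<meta http-equiv="Pragma" content="no-cache">

<!-- Google Webmaster Tools verification; the content value is site-specific -->
<meta name="google-site-verification" content="YOUR-TOKEN-HERE">
```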
Some others are author, publisher, and canonical. Especially in the case of canonical: if you do not implement it correctly, it can really mess with bots.
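Canonical is actually a link element in the head rather than a meta tag; a minimal sketch, with example.com standing in for your own domain:

```html
<!-- All URL variants (tracking parameters, session IDs, etc.) should point to one canonical URL -->
<link rel="canonical" href="http://www.example.com/news/some-article/">
```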
Hope this helps
Extra info on author and canonical:
http://www.seomoz.org/blog/authorship-google-plus-link-building
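Authorship, as covered in that post, is typically wired up by linking the page to the author's Google+ profile; the profile URL below is a placeholder:

```html
<link rel="author" href="https://plus.google.com/00000000000000000000/">
```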