Best META Fields to Include on New Site
-
I am in the process of transitioning sites to a Drupal CMS and am curious to know what META information to provide on each of the new site pages. Currently, this is the set-up I plan on using:
My questions to the community are:
- whether or not I've added all pertinent information, and
- if there's anything I'm overlooking
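For reference, and purely as an illustrative sketch (the original list of fields did not survive in the post above, so none of these values are the poster's), a common baseline head section for a page like this might include:

```html
<head>
  <!-- Illustrative baseline only - not the poster's actual set-up -->
  <title>Page Title | Site Name</title>
  <meta name="description" content="A concise page summary, kept under roughly 160 characters.">
  <meta name="robots" content="index, follow">
  <link rel="canonical" href="http://www.example.com/page">
</head>
```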
-
Catalyste,
Yes, this is something that I will need to implement. With that, I also wanted to ask: if we forgo implementing rich snippets and authorship fields in the beta, is that going to create more problems for development down the line?
More specifically, if we hold off on authorship and rich snippets until we launch the actual production site for visitors and customers, will that create more problems than implementing them now?
Likewise, are there any additional suggestions you'd make for a news organization's site?
-
I would say keep the meta description, since Google may show it in search results, which gives you a little control over what users see when the page is triggered by a search.
Limit it to 160 characters. Beyond that, there isn't much else worth adding; meta tags were heavily abused in the past.
You might also start thinking about implementing rich snippets if you have time:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=99170
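As an example of both points, here is a description kept under the 160-character limit, plus a minimal rich-snippet annotation of the kind the linked Google article covers (schema.org microdata; every name and value here is a placeholder, not anything from the thread):

```html
<meta name="description" content="Breaking local news, weather, and sports, updated throughout the day by our newsroom staff.">

<!-- Rich snippet markup via schema.org microdata - illustrative sketch only -->
<article itemscope itemtype="http://schema.org/NewsArticle">
  <h1 itemprop="headline">Example Headline</h1>
  <span itemprop="author">Jane Reporter</span>
  <time itemprop="datePublished" datetime="2012-04-01">April 1, 2012</time>
</article>
```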
-
Agree with Shane on meta robots and revisit.
SEOmoz recommends not using the keywords tag, as it provides no SEO value; all it does is tell competitors which keywords you are trying to rank for.
If you are active on Facebook, you might also want to consider using
as it gives analytics for referral traffic
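The tag itself appears to have been stripped from the answer above; assuming it refers to Facebook's Open Graph and Insights markup (my assumption, not stated in the thread), it would look something like:

```html
<!-- Open Graph tags (assumed to be what the answer refers to) -->
<meta property="og:title" content="Page Title">
<meta property="og:type" content="article">
<meta property="og:url" content="http://www.example.com/page">
<meta property="og:image" content="http://www.example.com/thumbnail.jpg">
<!-- Associates the page with a Facebook account so Insights can report referral analytics -->
<meta property="fb:admins" content="YOUR_FACEBOOK_USER_ID">
```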
-
Meta robots directives that just restate the defaults are useless, and I think it was always an old wives' tale that they did anything (at least within the past 4-5 years, give or take): index, follow is the default unless you specify noindex, nofollow or restrict crawling by other means, such as robots.txt.
The revisit tag is disregarded by Google, but I believe there are still bots out there that honor it.
Pragma and site verification are for you to decide.
Pragma just tells the individual browser not to cache the page, and site verification is just for Google Webmaster Tools, which can be done in several ways (a .txt file upload, a DNS zone record, etc.).
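A sketch of those two tags (the verification token is a placeholder, not a real value):

```html
<!-- Asks the individual browser not to cache this page;
     HTTP response headers are the more reliable way to control caching -->
<meta http-equiv="pragma" content="no-cache">

<!-- One of several Google Webmaster Tools verification methods -->
<meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN">
```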
Some others are author, publisher, and canonical. Especially in the case of canonical, if you do not implement it correctly it can really confuse bots.
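For illustration, hedged sketches of those three (all URLs are placeholders):

```html
<!-- Canonical: must point at the single preferred URL for this content;
     pointing it at the wrong page can seriously mislead crawlers -->
<link rel="canonical" href="http://www.example.com/products/widget">

<!-- Author and publisher links for Google authorship -->
<link rel="author" href="https://plus.google.com/112233445566/">
<link rel="publisher" href="https://plus.google.com/998877665544/">
```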
Hope this helps
Extra info on Author and Canonical
http://www.seomoz.org/blog/authorship-google-plus-link-building