Best META Fields to Include on New Site
-
I am in the process of transitioning sites to a Drupal CMS and am curious to know what META information to provide on each of the new site pages. Currently, this is the set-up I plan on using:
My questions to the community are:
- whether or not I've added all pertinent information, and
- if there's anything I'm overlooking
-
Catalyste,
Yes, this is something that I will need to implement. With that, I also wanted to ask: if we forgo implementing the 'rich snippets' and 'authorship' fields in the beta, is that going to create more problems for development down the line?
More specifically, if we decide to hold off on implementing authorship/rich snippets until we launch the actual production site for visitors/customers, will this create more problems than just implementing it all now?
Likewise, are there any additional suggestions you'd make for a news organization's site?
-
I would say keep the meta description, since Google may show it in search results, which gives you a little control over what users will see when your page comes up in a search.
Limit it to 160 characters. Beyond that, not much else matters; meta tags were heavily abused in the past.
You might also want to start thinking about implementing rich snippets if you have time:
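The 160-character guideline above is easy to enforce automatically. A minimal sketch (the helper name and word-boundary behaviour are our own choices, not from the thread):

```python
def trim_meta_description(text: str, limit: int = 160) -> str:
    """Trim a meta description to `limit` characters, cutting at a word
    boundary and appending an ellipsis when truncation occurs."""
    if len(text) <= limit:
        return text
    # Leave room for the trailing ellipsis, then cut at the last space.
    cut = text[: limit - 1]
    if " " in cut:
        cut = cut[: cut.rindex(" ")]
    return cut + "…"

# A short description passes through untouched; a long one is trimmed.
print(trim_meta_description("A property management news site."))
print(len(trim_meta_description("word " * 100)))
```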
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=99170
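For a news organization, the rich-snippet markup linked above boils down to structured data on each article page. A minimal sketch using a schema.org `NewsArticle` in JSON-LD (one common serialization; every value below is a placeholder, not taken from the thread):

```python
import json

# Hypothetical schema.org NewsArticle structured data; the headline,
# date, and author are placeholders for illustration only.
article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",
    "datePublished": "2013-01-15",
    "author": {"@type": "Person", "name": "Jane Reporter"},
}
script_tag = '<script type="application/ld+json">%s</script>' % json.dumps(article)
print(script_tag)
```

The resulting `<script>` tag would go in the page `<head>` or body alongside the other meta fields discussed here.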
-
I agree with Shane on the meta robots and revisit tags.
SEOMoz recommends against using the keywords tag, as it provides no SEO value; all it does is tell competitors which keywords you are trying to rank for.
If you are active on Facebook, you might also want to consider using
as it gives you analytics for referral traffic.
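The specific tag this answer refers to is missing from the text above. As an assumption only: Facebook's domain-level analytics (Insights) historically claimed a site via meta tags such as `fb:admins` or `fb:app_id`. A sketch of what such a tag looks like (the ID is a placeholder):

```python
# Assumption: the elided tag above is one of Facebook's Insights claim
# tags; historically that was fb:admins (or fb:app_id). Placeholder ID.
admin_id = "100000000000000"
fb_meta = '<meta property="fb:admins" content="%s" />' % admin_id
print(fb_meta)
```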
-
They are useless, and I think it was always an old wives' tale that they worked (at least within the past 4-5 years, give or take). index, follow is the default unless you specify noindex, nofollow in the tag or block crawling by other means, such as robots.txt, etc.
is disregarded by Google, but I believe there are still bots out there that honor it.
Pragma and site verification are for you to decide.
Pragma just tells the individual browser not to cache the page, and site verification is just for Google Webmaster Tools, which can be done in many ways (a .txt file, a DNS zone record, etc.).
Some others are author, publisher, and canonical. Especially in the case of canonical: if you do not implement it correctly, it can really mess with bots.
Hope this helps
Extra info on Author and Canonical
http://www.seomoz.org/blog/authorship-google-plus-link-building
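The canonical warning above is worth making concrete: one of the most common ways to "really mess with bots" is a relative or schemeless canonical URL. A minimal sketch of that sanity check (the helper name and rule are ours, not from the thread):

```python
from urllib.parse import urlparse

def is_safe_canonical(href: str) -> bool:
    """A rel=canonical value should be an absolute URL with a scheme and
    host; relative or schemeless values are a frequent source of trouble."""
    parsed = urlparse(href)
    return bool(parsed.scheme in ("http", "https") and parsed.netloc)

print(is_safe_canonical("http://www.example.com/page"))  # absolute: fine
print(is_safe_canonical("/page"))                        # relative: risky
```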
Related Questions
-
301 Domain Redirect And Old Domain to a New one including pages
Hi, I need to 301 an old domain to a new one (new website). I need to 301 the domain to a new page, not to the new domain directly; for example, www.olddomain.co.uk to www.newdomain.co.uk/pagenew. Then I also need to 301 all the other pages on the old domain to the new one; for example, www.olddomain.co.uk/oldpage to www.newdomain.co.uk/newpage. The issue is I can do one or the other but not both: I can get the other pages to redirect, but then the main domain won't redirect to the correct new page; or I can get the old domain to redirect, but not the internal pages. Thanks
Technical SEO | | David-Sharpe0 -
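The "one or the other" symptom in the question above is usually a rule-ordering problem: a catch-all redirect fires before the page-specific ones. A minimal sketch of the intended resolution order (URLs from the question; the logic and fallback are ours):

```python
# Page-specific redirects must be checked before the root/catch-all rule.
PAGE_REDIRECTS = {
    "/oldpage": "http://www.newdomain.co.uk/newpage",
}
ROOT_TARGET = "http://www.newdomain.co.uk/pagenew"
FALLBACK = "http://www.newdomain.co.uk/"

def resolve_301(path: str) -> str:
    if path in PAGE_REDIRECTS:      # 1. exact page matches first
        return PAGE_REDIRECTS[path]
    if path in ("", "/"):           # 2. the bare old domain
        return ROOT_TARGET
    return FALLBACK                 # 3. everything else

print(resolve_301("/oldpage"))
print(resolve_301("/"))
```

In an .htaccess file the same fix applies: put the specific RewriteRule lines above the catch-all so both kinds of redirect can coexist.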
Moving site from html to Wordpress site: Should I port all old pages and redirect?
Any help would be appreciated. I am porting an old legacy .html site, which has about 500,000 visitors/month and over 10,000 pages, to a new custom Wordpress site with a responsive design (long overdue, of course) that has been written and only needs a few finishing touches, and which includes many database features to generate new pages that did not previously exist. My questions are:
- Should I bother to port over older pages that are "thin" and have no incoming links, such that reworking them would take time away from the need to port quickly?
- I will be restructuring the legacy URLs to be lean and clean, so 301 redirects will be necessary. I know that there will be link juice loss, but how long does it usually take for the redirects to "take hold"?
I will be moving to https at the same time to avoid yet another porting issue. Many thanks for any advice and opinions as I embark on this massive data entry project.
Technical SEO | | gheh20130 -
Why my site is not ranking for any of the keywords?
We have a site for property management software. We have done everything: set proper titles, descriptions, and heading tags; the robots tag is also OK; we set up schema, and it checks out in Google Webmaster Tools too; we are also doing social media promotion. Can you please check our website and tell me what the problem is?
Technical SEO | | rootwaysinc0 -
Similar pages on a site
Hi, I think it was at BrightonSEO where PI DataMetrics were talking about how similar pages on a website can cause rankings to drop for your main page. This has got me thinking: if we have a category about jumpers (example.com/jumpers), but our blog also has a category about jumpers where we write all about jumpers etc., which creates a category page (example.com/blog/category/jumpers), should these blog category pages have noindex put on them to stop them ranking in Google? Thanks in advance for any tips. Andy
Technical SEO | | Andy-Halliday1 -
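If the blog category pages in the question above are to be kept out of the index, the mechanism is the robots meta tag. A minimal sketch of applying it only on the blog category path (the path rule and `noindex, follow` choice, which keeps links crawlable, are ours):

```python
def robots_meta_for(path: str) -> str:
    """Return a robots meta tag: noindex the blog category pages, index
    everything else. The path pattern mirrors the example.com URLs above."""
    if path.startswith("/blog/category/"):
        return '<meta name="robots" content="noindex, follow" />'
    return '<meta name="robots" content="index, follow" />'

print(robots_meta_for("/blog/category/jumpers"))
print(robots_meta_for("/jumpers"))
```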
How to create a sitemap for a large (ecommerce-type) site that has 1,000s if not 100,000s of pages
I know this is kind of a newbie question, but I am having an amazing amount of trouble creating a sitemap for our site, Bestride.com. We just did a complete redesign (look and feel, functionality, the works) and now I am trying to create a sitemap. Most of the generators I have used "break" after reaching some number of pages. I am at a loss as to how to create the sitemap. Any help would be greatly appreciated! Thanks
Technical SEO | | BestRide0 -
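For sites of the size described above, the standard answer is a sitemap index: split the URLs into files of at most 50,000 entries (the per-file cap in the sitemaps.org protocol) and list those files in one index file. A minimal sketch (filenames and base URL are ours):

```python
def chunk_urls(urls, size=50000):
    """Split a URL list into sitemap-sized chunks; the sitemaps.org
    protocol caps a single sitemap file at 50,000 URLs."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def sitemap_index(base, n_files):
    """Build a sitemap index listing n_files child sitemaps."""
    entries = "".join(
        "<sitemap><loc>%s/sitemap-%d.xml</loc></sitemap>" % (base, i)
        for i in range(n_files)
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            "%s</sitemapindex>") % entries

urls = ["http://www.bestride.com/page-%d" % i for i in range(120000)]
chunks = chunk_urls(urls)
print(len(chunks))  # 120,000 URLs split into 3 sitemap files
print(sitemap_index("http://www.bestride.com", len(chunks))[:80])
```

Generating the files yourself this way sidesteps the third-party generators that "break" at some page count.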
How Best to Handle 'Site Jacking' (Unauthorized Use of Someone else's Dedicated IP Address)
Anyone can point their domain to any IP address they want. I've found at least two domains (same owner), totally unrelated to each other and to us, that are currently pointing at our IP address. The IP address is on our dedicated server (we control the entire physical server) and is exclusive to that one domain, so it isn't a virtual hosting misconfiguration issue. This has caused Google to index their two domains with duplicate content from our site (found by searching for site:www.theirdomain.com). Their site does not come up in the first 50 results for any of the keywords we rank for, so Google obviously knows THEY are the duplicate content, not us (our site has been around for 12 years, much longer than theirs). Their registration is private and we have not been able to contact these people. I'm not sure if this is just a DNS mistake on the two domains or if someone is doing this intentionally to try to harm our ranking. It has been going on for a while, so it is most likely not a mistake; for two live sites, they would have noticed long ago that they were pointing to the wrong IP. I can think of a variety of actions to take, but I can find no information anywhere on what Google officially recommends doing in this situation, assuming you can't get a response. Here are my ideas:
- a) Approach it as a digital copyright violation and go through the lengthy process of having their site taken down. Pro: eliminates the issue. Con: sort of a pain, and we could be leaving some link juice on the table?
- b) Modify .htaccess to 301 redirect any URL not using our domain to our domain. This means Google would see several domains all pointing to the same IP, with every domain except ours 301 redirecting to ours. Not sure if THAT will harm (or help) us? Would we not then receive link juice from any site out there linking to these other domains? Con: Google will see the context of the backlinks, and their link text will not be related at all to our site. In addition, if any of these other domains pointing to our IP have backlinks from 'bad neighborhoods', I assume it could hurt us?
- c) Modify .htaccess to return a 404 Not Found or 403 Forbidden error?
I posted in other forums and have gotten suggestions that are all over the map. In many cases the posters don't even understand what I'm talking about, thinking these are just normal backlinks. Argh! So I'm taking this to "The Experts" on SEOMoz.
Technical SEO | | jcrist1 -
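Options (b) and (c) in the question above both come down to the same server-side decision: inspect the Host header and only serve requests addressed to your own domain. A minimal sketch of that decision (domain names are hypothetical placeholders):

```python
OUR_HOST = "www.ourdomain.com"  # hypothetical; stands in for the real site

def handle_request(host_header: str) -> tuple:
    """Return (status, location) for a request, keyed on the Host header.
    Foreign domains pointed at our IP get a 301 back to us (option b);
    returning (403, None) here instead would implement option c."""
    host = host_header.lower().split(":")[0]  # strip any :port suffix
    if host in (OUR_HOST, OUR_HOST.removeprefix("www.")):
        return (200, None)
    return (301, "http://%s/" % OUR_HOST)

print(handle_request("www.ourdomain.com"))
print(handle_request("www.theirdomain.com"))
```

In Apache the equivalent is a RewriteCond on %{HTTP_HOST} that excludes your own hostname before redirecting (or denying) everything else.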
What are the best Free Press release sites to gain free links
Hi, I am trying to find some good free press release sites that allow you to include a link in your press release to help drive traffic to your site, but all the ones I have found do not allow links; the only ones I can find that do are paid. Does anyone use press release sites to gain links to their sites? If so, could you let me know which ones they are and how important you feel they are?
Technical SEO | | ClaireH-1848860 -
Buying a new domain
Hello guys! We are in the process of buying a new domain. How can we be sure that this domain is not blacklisted, and are there any steps to take to be sure that what we are buying is actually in "good shape"? Thanks much!
Technical SEO | | echo10