Correct Way to Write Meta
-
OK so this is a really, really basic question. However, I'm seeing some meta written differently to normal and I'm wondering if a) this is correct and b) whether there is any benefit.
Normally it's like this:
However, I am seeing it written like this in some places:
So the content= and name= attributes are swapped around. I assume the people who did this were thinking that bringing the content forward would mean Google reads the keywords first.
Just wondering if anybody knows whether this is good practice or not? It just piqued my interest, so apologies for the basic nature of the question!
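For reference, the two snippets being compared were presumably something like this (the keyword values here are placeholders):

```html
<!-- the usual order: name attribute first, then content -->
<meta name="keywords" content="example, placeholder, terms">

<!-- the swapped order described above: content first, then name -->
<meta content="example, placeholder, terms" name="keywords">
```

Both are valid HTML; attribute order within a tag has no defined meaning to a parser.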
-
Thanks guys - much appreciated. Personally I didn't think it mattered which way round it went, but I was interested to hear if there was a good reason for it. The a href= example is a good illustration of why it shouldn't really matter.
-
As long as all the information is there, I don't think the order matters.
It's the same thing for a link: you can put the target or the title attribute before the href and the link will still work fine.
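To illustrate what this answer is describing, here is a hypothetical link written both ways (the URL is a placeholder):

```html
<!-- conventional order: href first -->
<a href="https://www.example.com/" title="Example site" target="_blank">Example</a>

<!-- title and target before href: the link behaves exactly the same -->
<a title="Example site" target="_blank" href="https://www.example.com/">Example</a>
```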
-
Hi,
Where Google is concerned, the best source for determining this is Google itself:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=79812
To answer your question, though: Google's documentation puts the name attribute first.
There could be other search engines that prefer it the other way around, but in my opinion switching it around would not help with rankings or keywords any more than the conventional order does.
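For what it's worth, the examples in Google's own documentation show the name attribute first, e.g. for the description meta tag:

```html
<meta name="description" content="A short summary of the page">
```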
Related Questions
-
Does this type of writing follow the "original content" criterion of structured data?
Hi! In Google's general guidelines for structured data, it's stated that webmasters must "provide original content that you or your users have generated." Suppose I write posts on topics like "how to get a driver's license" or "how to apply for an accounting license", which require looking up information from official and unofficial sources. After researching, I compiled the information I found and wrote a few blog posts. Are these considered original content? Can I apply structured data to these posts without Google penalizing them? Thanks!
Technical SEO | EverettChen
-
Site Hack In Meta Description
Hey Moz Community, I am looking for some help identifying where the following meta description on this home page - https://www.apins.com - is coming from. I have scrubbed through the page source without being able to locate where the content is being pulled from. The website is built on WordPress and the metas were updated using Yoast, but I am wondering if an installed plugin could be the culprit. On top of this, I have had a developer look for the "hack" and they have assured me that the issue has been removed. I have submitted the URL in GSC a couple of times to be re-indexed but have not had much luck. Any thoughts would be much appreciated; the displayed description is below. The health screening plays http://buyviagraonlineccm.com/ a significant and key role in detecting potentially life-threatening illnesses such as cancer, heart ...
Technical SEO | jordankremer
-
URLs in Greek, Greeklish or English? What is the best way to get great ranking?
Hello all, I am Greek and I have a somewhat unusual question for you. Greek characters are generally treated as special characters and need UTF-8 encoding. The question is about the URLs of Greek websites. According to the advice of the Google Webmasters blog, we should never put raw Greek characters into the URL of a link; we should either use the encoded version if we decide to have Greek characters, or just use Latin characters in the URL. Having Greek characters unencoded could cause technical difficulties with some services, e.g. search engines or other URL-processing web pages. To give you an example, let's look at A) http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1, which is the URL with the encoded Greek characters, and which shows up in the browser as B) http://el.wikipedia.org/wiki/Ελβετία. The problem with A is that every time we copy the URL and paste it somewhere (in an email, a social bookmarking site, a social media site, etc.) it appears like A, full of strange characters and % signs. Such a link can sometimes cause broken-link issues, especially when we try to submit it to social networks and social bookmarking sites. On the other hand, Googlebot reads that URL, but I am wondering whether there is an advantage for the websites that keep the encoded URLs (in comparison to the sites that use Greeklish in the URLs). So the question is: for SEO purposes, is it better to use Greek characters (encoded, like http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1) in the URLs, or would it be better to use Greeklish (for example http://el.wikipedia.org/wiki/Elvetia)? Thank you very much for your help! Regards, Lenia
Technical SEO | tevag
-
Best way to retain backlink value when moving a site?
Hi all, I want to get some opinions on best practice when transferring backlink value from an old site to a new one. On the old site, I currently have a product page, and this particular product has multiple models all listed on the one single page. On the new site, however, every model of this product has its own page. These model pages would have relatively similar content apart from several key details that differentiate the models. Firstly, would you recommend splitting models of the same product into separate pages? If so, my initial thought is to 301 redirect the old product page to the most popular new model page, and to add rel=canonical tags to the other model pages. Would you consider this best practice? Or are there better ways to retain backlink value without getting penalised for possible content duplication? Thanks! Jac - sent from my manager's account.
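A minimal sketch of the canonical half of this proposal, assuming hypothetical model URLs on the new site:

```html
<!-- in the <head> of each secondary model page, e.g. /products/widget-model-b/ -->
<!-- pointing at the most popular model page, which also receives the 301 -->
<link rel="canonical" href="https://www.example.com/products/widget-model-a/">
```

One caveat worth weighing: rel=canonical signals that the pages are near-duplicates, which sits oddly with giving each model its own differentiating content.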
Technical SEO | RuchirP
-
Duplicate Meta Descriptions From Pages That Don't Exist
Hi guys, I am hoping someone can help me out here. I have had a new site built with a unique theme, using WordPress as the CMS. Everything was going fine, but after checking Webmaster Tools today I noticed something I just cannot get my head around: I am getting duplicate-page warnings on a couple of things, one of which I think I can understand, though I don't know how to clear the warning. Firstly, I get a duplicate meta description warning for url 1: / and url 2: /about/who-we-are. I understand this, as the who-we-are page is set as the homepage through the WordPress reading settings, but is there a way to make the duplicate meta description warning disappear? The second one is for /services/57/ and /services/. Both URLs lead to the same place, although I never created the /services/57/ page; it does not show in the XML sitemap, but Google obviously sees it because it triggers a warning in Webmaster Tools. If I press edit on the /services/57/ page, it just goes to editing the /services/ page. Is there a way I can remove the /57/ page safely, or a method to ensure Google at least does not see it? Probably a silly question, but I cannot find a comprehensive answer to this. Thanks in advance
Technical SEO | southcoasthost
-
Duplicate content - Quickest way to recover?
We've recently been approached by a new client who's had a 60%+ drop in organic traffic. One of the major issues we found was around 60k+ pages of content duplicated across 3 seperate domains. After much discussion and negotiation with them; we 301'd all the pages across to the best domain but traffic is increasing very slowly. Given that the old sites are 60k+ pages each and don't get crawled very often, is it best to notify the domain change through Google Webmaster Tools to try and give Google a 'nudge' to deindex the old pages and hopefully recover from the traffic loss as quickly and as much as possible?
Technical SEO | Nathan.Smith
-
Would you move the site to a different host or change packages at a significant expense in order to eliminate the meta refresh
When I began working with a site (http://www.visix.com), I discovered a number of hosting constraints that hampered some SEO-related changes I wanted to make. A year later, the site was teetering on the first page for a particular keyword of choice, and when the Panda and Penguin updates happened it got passed by 3M and Amazon, both much bigger sites (it was #11, now #13). Now I'm thinking I should try to use the homepage to rank for the keyword "digital signage software", where originally I was making progress with an inner page. So I am revisiting the homepage meta refresh and need to decide whether it is enough of an issue to warrant a hosting change. http://www.visix.com has a meta refresh of "0" seconds to http://www.visix.com/index.aspx. I know sites can rank well with these, although I don't know how much of a handicap it is. An article here, http://www.seomoz.org/learn-seo/redirection, states that a meta refresh will not pass as much link juice as a 301 redirect. I have read about every opinion I can find and would appreciate others' opinions on the matter. The host is Network Solutions, and the hosting package does not allow 301 redirects, among other things. Would you move the site to a different host, or change packages at significant expense, in order to eliminate the meta refresh, or is it not a big deal on a well-established site? Thanks very much for your feedback!
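For context, the meta refresh being discussed is a tag of this shape (using the URL from the post):

```html
<!-- client-side redirect after 0 seconds from / to /index.aspx -->
<meta http-equiv="refresh" content="0; url=http://www.visix.com/index.aspx">
```

A 301, by contrast, is sent by the server as an HTTP status code and Location header before any markup is delivered, which is why it is the preferred way to pass link equity.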
Technical SEO | IntegralOCR
-
Best way to address duplicate news sections within site
A client has a news section at www.clientsite.com/news and also at subdomain.clientsite.com/news. The stories within each section are identical: www.clientsite.com/news/story-11-5-2011 and subdomain.clientsite.com/news/story-11-5-2011. What's the best way to avoid a duplicate content issue within the site? A 301 redirect doesn't seem appropriate from the user-experience point of view. Is adding a rel=canonical pointing to www.clientsite.com/news/story-a-b-c on each story within the subdomain news section the best option? They have hundreds of stories, so I'm wondering if there might be an easier way. Also, the news pages list the story headline and the first 3 lines of copy. Do these summaries present duplicate content issues with the full story page? Thank you!
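The rel=canonical option described would put a tag like this in the head of each subdomain story, using the story URLs from the question:

```html
<!-- on subdomain.clientsite.com/news/story-11-5-2011 -->
<link rel="canonical" href="http://www.clientsite.com/news/story-11-5-2011">
```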
Technical SEO | 540SEO