Why isn't Moz recognizing meta description tags using SLIM?
-
Hey All,
I keep getting reports from Moz that many of my pages are missing meta description tags. We use SLIM for our website, and I'm wondering if anyone else has had the same issue getting Moz to recognize that the meta descriptions exist.
We have a default layout that we incorporate into every page on our site. In the head of that layout, we've included our meta description parameters:
meta description ='#{current_page.data.description}'
Then each page has its own description, which shows up in the page's source code:
http://fast.customer.io/s/viewsourcelocalhost4567_20140519_154013_20140519_154149.png
Any ideas why Moz still isn't recognizing that we have meta descriptions?
-Nora
-
After you fix it, you can ask Google to fetch the page in Webmaster Tools if you need a quicker update; if you wait, they will re-crawl it fairly quickly anyway.
-
Thanks Spencer!
Still have to wait for next week's crawl to be sure, but that seems to have fixed it.
-
It's because your meta descriptions are improperly formatted.
Yours are formatted like:
description="We make it easy to send emails triggered by user behavior. Build, measure and improve your emails to activate and retain users" />
but they should be formatted like:
<meta name="description" content="We make it easy to send emails triggered by user behavior. Build, measure and improve your emails to activate and retain users" />
Crawlers only pick up the description when the text sits in a content attribute on a meta tag whose name attribute is "description"; a bare description attribute gets ignored.
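For anyone hitting the same thing in Middleman with Slim, here is a minimal sketch of how the layout's head line could be written so it renders that tag. This is an assumption based on the snippet above rather than the poster's exact layout, and it assumes each page's frontmatter defines a description key (which is what current_page.data.description reads); the if-guard is just a defensive extra so pages without a description don't emit an empty tag.
/ Default layout head (Slim): renders <meta name="description" content="...">
/ Assumes each page's frontmatter sets description: "..."
- if current_page.data.description
  meta name="description" content=current_page.data.description
After rebuilding, view-source should show name="description" and content="..." on each page, which is what Moz's crawler is looking for.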
Related Questions
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO, user traffic should always be prioritized higher than bot traffic.
I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS has a way to centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (and not other bots) and is administered per site. No solution to all three of my problems.
Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | | internetwerkNU1 -
Penguin 2.1 Penalty - Can't understand why I was hit by it?
Hi, I have lost all my rankings after the Penguin 2.1 update. I haven't done anything wrong. I want to know the root cause of the penalty so that I can overcome it. Any help would be appreciated. Website: http://tiny.cc/hfom4w
White Hat / Black Hat SEO | | chandman0 -
Macrae's Blue Book Directory Listing
Does anyone know more information about this directory? Is it a good quality directory that I should pay to get listed on?
White Hat / Black Hat SEO | | EcomLkwd0 -
New online store and using black hat to bring lots of sales
I have one online store where all the SEO rules are followed to increase rankings and sales. Buying a new URL and launching a new store (to sell exactly the same products) is fast, easy and cheap. How about using black hat on this new store? I think I have nothing to lose. Is there something I should know before moving ahead? Launching a new store is very cheap, and black hat can be done by one of those overseas companies at low prices. First thing, this new store should not link to my actual store, I guess. Any advice? Thank you, BigBlaze
White Hat / Black Hat SEO | | BigBlaze2050 -
When to NOT USE the disavow link tool
I'm not here to say this is concrete and that you should never do this, and if you disagree with me then let's discuss. One of the biggest things out there today, especially after the second wave of Penguin (2.0), is fear-stricken webmasters who run straight to the disavow tool after they have been hit by Penguin or noticed a drop shortly after.
I had a friend whose site never felt the effects of Penguin 1.0, and he thought everything was peachy. Then P2.0 hit and his rankings dropped off the map. I got a call from him that night and he desperately asked me to review his site and guess what might have happened. He then told me the first thing he did was compile a list of websites backlinking to him that might be the issue, create his disavow list, and submit it. I asked him, "How long did you research these sites before you came to the conclusion they were the problem?" He said, "About an hour." Then I asked, "Did you receive a message in your Google Webmaster Tools about unnatural linking?" He said, "No." I said, "Then why are you disavowing anything?" He said, "Um... I don't understand what you are saying."
Reading articles, forums, and even the Moz Q&A, I tend to think there are some misconceptions about Google's disavow tool that are not clearly explained. Some of my findings about the tool and when to use it are purely based on logic, IMO. Let me explain.
When NOT to use the tool:
If you spent an hour reviewing your backlink profile and you are too eager to wait any longer to upload your list. Unless you have fewer than 20 root domains linking to you, you should spend a lot more than an hour reviewing your backlink profile.
You DID NOT receive a message from GWT informing you that you had some "unnatural" links (I'll explain later).
If you spent a very short amount of time reviewing your backlink profile and did not look at each individual site linking to you and every link that exists, then you might be using it WAY TOO SOON. The last thing you want to do is disavow a link that is actually helping you. Take the time to really look at each link and ask yourself this question (straight from the Google guidelines): "A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee."
Studying your backlink profile:
We all know when we have cheated; I'm sure 99.9% of us can admit to it at one point. Most of the time I can find backlinks from sites, look right at the owner, and ask, "You placed this backlink, didn't you?" and see the guilt immediately in their eyes 🙂 Remember, not ALL backlinks you generate are bad or wrong just because you own the site. You need to ask yourself "Was this link necessary and does it apply to the topic at hand?", "Was it relevant?" and, most important, "Is this going to help other users?" before each link you place.
You DID NOT receive a message about unnatural linking:
This is where I think most of the confusion takes place (and please correct me if I am wrong on this). If you did not receive a message in GWT about unnatural linking, then we can safely say that Google does not think you have any "fishy" links it has determined to be spammy. So if you did not receive any message yet your rankings dropped, then what could it be?
Well, it's still most likely your backlinks, but it's more likely that links which used to pass value now hold less or no value at all. Obviously, when that value drops, so does your rank. So what do you do? Build more quality links... and watch your rankings come back 🙂
White Hat / Black Hat SEO | | cbielich1 -
It shows as "Google results" but it's an incoming link; is it spamming me...?
Hello everyone, I have 2 issues to share: 1) We have a site (personal-loans.org). In the past few weeks we noticed that there are sites linking to our site and we get traffic from them, but when you visit these sites, all they do is provide "Google search" results. Because we were on the first page of the results, we got hits there as well, which leads me to think that this is the reason we are on page 7 now; as of yesterday the ranking was at page 4. These are some of those sites so you can see it: internetpayadvances.com, fastlivecashadvance.com, assistancemoney.com, scoutcashnow.com, officialpayday.net. Does anyone else see anything like that? I have many more links like that; these are only 5 out of 9 that had hits yesterday alone. Site traffic went from 250-300 a day to 63. 2) For the same site: it was on the first page of Google's search results and ranked 4-7, even after the big Penguin changes. What we did notice is that A LOT of unrelated sites, like surfing sites (yes, ocean surfing), and sites that had no content AT ALL (all the text was inside an image) ranked 3rd on the "payday loans" search results (and the rest was, and still looks, just the same with different content...). Google says it wants quality but does not do its homework for the second-largest keyword searches, such as loans and payday loans; the same goes for cash advance. Please help, need your advice.... Thanks
White Hat / Black Hat SEO | | Yonnir0 -
Dramatic fall in SERPs for all keywords at end of March 2012?? Help!
Hi, Our website www.photoworld.co.uk had been improving its SERPs for the last 12 months or so, achieving page 1 rankings for most of our key terms. Then suddenly, around the end of March, we suffered massive drops in nearly all of our key terms (see the attached image for more info). I wondered if anyone had any clues about what Google has suddenly taken a huge dislike to on our site, and what steps we can put in place to aid rankings recovery ASAP. Thanks n8taO.jpg
White Hat / Black Hat SEO | | cewe0 -
Is a directory like this white hat? Useful?
This is one of my competitor's backlinks: http://bit.ly/mMPhmn Prices for inclusion on this page go from $50 for 6 months to $300 for a permanent listing. Do most of you use paid directories like this for your SEO clients? My gut is telling me to run away... but I don't want to miss a good opportunity if I should be taking it.
White Hat / Black Hat SEO | | MarieHaynes0