What is your experience so far with Google's new meta description length of up to 320 characters?
-
I updated a few home pages and some landing pages, so far so good!
However, I'd like to hear about other experiences before continuing to update. Thanks for your comments!
-
Hi Brooks,
Thanks for the input. It is great to know that it is also working in your ecosystem.
-
Agreed, and sometimes there can be power in a one-word title - both for optimization and communication.
-
Agreed.
I don't have any data to prove its usefulness, but there's something really nice and satisfying about a solid, short, and effective meta description.
-
"really just didn't need to be any longer than 160ish characters"
Thanks!
That's a very interesting thought, especially if you can write effective "short and punchy" descriptions. If you ramble on too long, it stinks the description up.
-
My agency just launched a new website and, in the process, updated many of our meta descriptions. In optimizing, though, we realized some really just didn't need to be any longer than 160ish characters.
However, for other pages, such as our services pages, the additional characters gave us a chance to introduce the page, detail what the user can find, and sort of "preview" the call to action.
We've already seen a little increase in CTR for some of our services pages.
Best of luck!
-
Hello William,
Thanks for the input.
I noticed that sometimes Google chooses text seemingly at random. So far I can't find a pattern: sometimes it pulls from the first paragraph, other times from the middle of the page. However, on pages that are already SEO optimized - meaning they have an adequate page title, meta description, and h1 - it is showing the written meta description.
Best regards
-
Hello,
I am in the process of updating meta tags and top-of-page content to try to get relevant text and description tags to show in Google listings. On some of my pages Google uses my meta description tags, and on many others Google uses the content from the top of the page. I am not sure how or why one gets used over the other, so I am working on both the tags and the top-of-page content.
Best Regards
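Since the thread turns on whether the description you wrote actually fits the new limit, here's a minimal sketch (Python standard library only; the function and class names are my own, not from any tool mentioned above) that pulls a page's meta description out of its HTML and checks it against the 320-character ceiling:

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Collects the content of a <meta name="description"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def check_description(html, limit=320):
    """Return (description, fits) - the tag's text and whether it is within `limit` characters."""
    parser = MetaDescriptionParser()
    parser.feed(html)
    desc = parser.description or ""
    return desc, len(desc) <= limit
```

Run against each updated page, this gives a quick yes/no on whether the written description is even eligible to display in full - though, as the posts above note, Google may still substitute body text of its own choosing.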
Related Questions
-
Is it possible that Google is pulling a description from third-party websites and displaying it in the description section of organic results?
Hi all, I have come across the weirdest situation of my SEO career. For a brand term, Google is displaying a description in organic results under the website URL that doesn't exist ANYWHERE on the website, but this description does appear on some directory sites created back in 2002 or so. Is there a possibility that Google is pulling info from directory sites and displaying it as the description in organic results? I am super confused! Help needed! Thanks
Intermediate & Advanced SEO | | Malika10 -
Canonicals, Social Signals and a Multi-Regional Website
Hi all, I have a website that is set up to target different countries using subfolders, e.g. /aus/, /us/, /nz/. The homepage itself is just a landing-page redirect to whichever country the user belongs to: somebody accesses https://domain/ and is redirected to one of the country-specific subfolders. The default subfolder is /us/, so all users are redirected to it if their country has not been set up on the website. The content is mostly the same on each country site, apart from localisation and, in some cases, content specific to that country.
I have set up each country subfolder as a separate site in Search Console and targeted /aus/ to AU users and /nz/ to NZ users. I've left the /us/ version un-targeted to any specific geographical region. In addition, I've set up hreflang tags for each page on the site which link to the same content on the other country subfolders: /aus/ and /nz/ are targeted to en-au and en-nz respectively, and /us/ to en-us and x-default, as per various articles around the web.
We generally advertise our links without a country-code prefix, and the system automatically redirects the user to the correct country when they hit that URL. For example, somebody accesses https://domain/blog/my-post/ and a 302 is issued for https://domain/aus/blog/my-post/ or https://domain/us/blog/my-post/, etc. The country-less links are advertised on Facebook and in all our marketing campaigns.
Overall, I feel our website is ranking quite poorly, and I'm wondering if poor social signals are a part of it. We have a decent social following on Facebook (65k) and post regular blog posts to our Facebook page that tend to pique quite a bit of interest. I would have expected this to contribute to our ranking at least somewhat. I am wondering whether the country-less link we advertise on Facebook is causing Googlebot to ignore it as a social signal for the country-specific pages on our website.
For example, Googlebot indexes https://domain/us/blog/my-post/ and looks for social signals for that URL specifically, but doesn't pick up anything because the campaign URL we use is https://domain/blog/my-post/. If that is the case, I am wondering how I would fix it so that /us/blog/my-post/, /aus/blog/my-post/ and /nz/blog/my-post/ receive the appropriate social signals. I am also wondering if changing the canonical URL of each page to the country-less URL would improve my social signals and performance in the search engines overall. I would be interested to hear your feedback. Thanks
Intermediate & Advanced SEO | | destinyrescue0 -
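For the subfolder setup described in that question, one way to keep the hreflang annotations consistent across every country version is to generate them from a single locale map rather than writing them by hand. This is only an illustrative sketch - the domain is a placeholder and the function name is my own; the locale codes and the x-default choice mirror the question:

```python
# Locale codes from the question: /us/ doubles as the x-default fallback.
LOCALES = {"us": "en-us", "aus": "en-au", "nz": "en-nz"}
X_DEFAULT = "us"

def hreflang_tags(domain, path):
    """Build the set of hreflang link tags every country version of `path` should share."""
    tags = []
    for folder, locale in LOCALES.items():
        tags.append(f'<link rel="alternate" hreflang="{locale}" '
                    f'href="https://{domain}/{folder}{path}" />')
    # x-default points at the un-targeted /us/ version, per the question's setup.
    tags.append(f'<link rel="alternate" hreflang="x-default" '
                f'href="https://{domain}/{X_DEFAULT}{path}" />')
    return tags
```

Because each country version emits the identical tag set, the annotations stay reciprocal - a common hreflang failure mode when the tags are maintained per-page.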
Robots.txt - Googlebot - Allow... what's it for?
Hello - I just came across this in robots.txt for the first time, and was wondering why it is used. Why would you have to proactively tell Googlebot to crawl JS/CSS, and why would you want it to? Any help would be much appreciated - thanks, Luke
User-agent: Googlebot
Allow: /*.js
Allow: /*.css
Intermediate & Advanced SEO | | McTaggart0 -
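Those Allow lines exist to carve exceptions out of broader Disallow rules, so Googlebot can still fetch the CSS/JS it needs to render pages. As a rough illustration using the standard-library parser (note one caveat: Python's robotparser uses first-match semantics rather than Google's longest-match rule, so the Allow line is listed first here; the paths are my own examples):

```python
from urllib.robotparser import RobotFileParser

# The Disallow would block the whole /assets/ folder; the Allow line
# carves out an exception so the stylesheet stays crawlable.
ROBOTS_TXT = """\
User-agent: Googlebot
Allow: /assets/site.css
Disallow: /assets/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

can_css = parser.can_fetch("Googlebot", "https://example.com/assets/site.css")
can_js = parser.can_fetch("Googlebot", "https://example.com/assets/private.js")
```

Without the Allow exception, blocked CSS/JS can leave Google rendering a broken version of the page, which is why proactively allowing those resources is common advice.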
Pull meta descriptions from a website that isn't live anymore
Hi all, we moved a website over to Wordpress 2 months ago. It was using .cfm before, so all of the URLs have changed. We implemented 301 redirects for each page, but we weren't able to copy over any of the meta descriptions. We have an export file which has all of the old web pages. Is there a tool that would allow us to upload the old pages and extract the meta descriptions so that we can get them onto the new website? We use the Yoast SEO plugin which has a bulk meta descriptions editor, so I'm assuming that the easiest/most effective way would be to find a tool that generates some sort of .csv or excel file that we can just copy and paste? Any feedback/suggestions would be awesome, thanks!
Intermediate & Advanced SEO | | georgetsn0 -
Impact of simplifying website and removing 80% of site's content
We're thinking of simplifying our website, which has grown to a very large size, by removing all the content that hardly ever gets visited. The plan is to remove this content / make changes over time in small chunks so that we can monitor the impact on SEO. My gut feeling is that this is okay as long as we redirect the old pages and confirm that the pages we remove aren't getting any traffic. From my research online it seems that more content is not necessarily a good thing if that content is ineffective, and that simplifying a site can improve conversions and usability. Could I get people's thoughts on this, please? Are there any risks we should look out for, or any alternatives to this approach? At the moment I'm struggling to combine the needs of SEO with making the website more effective.
Intermediate & Advanced SEO | | RG_SEO0 -
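One low-risk addition to the staged approach in that question is to sanity-check the redirect map before each chunk goes live. A minimal sketch (the function name and URL shapes are my own) that flags self-redirects and chains, both of which waste the 301s:

```python
def validate_redirect_map(redirect_map):
    """Check a {old_url: new_url} map for common pitfalls before deploying:
    self-redirects, and chains where a target is itself being redirected."""
    problems = []
    for old, new in redirect_map.items():
        if old == new:
            problems.append(f"self-redirect: {old}")
        elif new in redirect_map:
            # The redirect should point at the final destination directly.
            problems.append(f"chain: {old} -> {new} -> {redirect_map[new]}")
    return problems
```

An empty result means every removed URL points directly at a live destination; anything flagged should be collapsed to a single hop before the chunk ships.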
How can Google index a page that it can't crawl completely?
I recently posted a question regarding a product page that appeared to have no content. [http://www.seomoz.org/q/why-is-ose-showing-now-data-for-this-url] What puzzles me is that this page got indexed anyway. Was it indexed based on Google knowing that there was once content on the page? Was it indexed based on the trust level of our root domain? What are your thoughts? I'm asking not only because I don't know the answer, but because I know the argument will be made that if Google indexed the page then it must have been crawlable, and therefore we didn't really have a crawlability problem. Why would Google index a page it can't crawl?
Intermediate & Advanced SEO | | danatanseo0 -
How is Google's algorithm evolving in terms of DA vs PA value?
How is Google evolving in terms of the value of DA vs. PA? Is having a link from a DA 75 + PA 25 site better than having a link from a DA 50 + PA 50 site, assuming the two websites are otherwise identical? I have a couple of .EDU backlinks where DA is around 80 but PA is 1. Would a DA 40 with a PA 40 be more valuable? I hear Google is placing increasing value on the domain and less on the page authority.
Intermediate & Advanced SEO | | knielsen
Any insight appreciated thank you0 -
Google Experiment
Hi there, are there any implications for organic SEO from implementing Google Experiments, where you send users to a new page?
Intermediate & Advanced SEO | | pauledwards0