Links under Meta Description when performing a search
-
While doing research for clients, I have come across sites displaying hyperlinks underneath their own meta description. Keywords I have Googled that show hyperlinks under the meta description:
iacquire (brand)
bmw wheels (Beyern Wheels, position 1)
aftermarket bmw wheels (MMR Wheels, position 2)
These companies have hyperlinks underneath their descriptions. Does anyone have any ideas why or how this happens?
-
Just to add to Mihai's response: you can always demote sitelinks in Google Webmaster Tools (GWT) if irrelevant links are showing up for your website.
-
Yep! Thanks.
-
Hey,
These are actually called one-line sitelinks (also known as mini sitelinks), and have been documented on the Google Webmaster Blog here: http://googlewebmastercentral.blogspot.ro/2009/04/one-line-sitelinks.html
Basically, they're a more condensed form of normal sitelinks (up to four links), and they can appear for results below the first position. Google uses them to show users other pages on that site related to their search term. Like normal sitelinks, you can't directly control when they show up.
Hope this helps, cheers!
Related Questions
-
Pages flagged in Search Console as having a "noindex" tag do not have a meta robots tag?
Hi, I am running a technical audit on a site which is causing me a few issues. The site is small and awkwardly built, using lots of JS, animations, and dynamic URL extensions (a bit of a nightmare). Only 5 of its pages are being indexed in Google, despite over 25 pages being submitted to Google via the sitemap in Search Console. The beta Search Console is telling me that 23 URLs are marked with a 'noindex' tag; however, when I view the page source and check the code of these pages, there are no meta robots tags at all. I have also checked the robots.txt file. Both Screaming Frog and DeepCrawl are failing to pick up these URLs, so I am at a bit of a loss about how to find out what's going on. I suspect the creative agency that built the site had no idea about general website best practice, and that the dynamic URL extensions may have something to do with the noindexing. Any advice on this would be really appreciated. Are there any other ways of noindexing pages which the dev/creative team might have implemented by accident? What am I missing here? Thanks,
Technical SEO | | NickG-1230 -
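One thing worth checking for a case like the question above: Google also honors a noindex sent as an HTTP X-Robots-Tag header, which never appears in the page source, so a server or CDN rule could explain what Search Console reports even when the HTML is clean. A minimal, hypothetical Python sketch of that two-place check (the meta-tag detection is deliberately simplified; a real audit should parse the HTML properly):

```python
# Check a page for noindex in both places Google reads it:
# the X-Robots-Tag HTTP header and the meta robots tag in the HTML.
# (Simplified string matching; a real check should parse the HTML.)
def has_noindex(headers, html):
    """headers: dict of HTTP response headers; html: page source string.
    Returns where the noindex was found, or None."""
    header_value = headers.get("X-Robots-Tag", "")
    if "noindex" in header_value.lower():
        return "http-header"
    lowered = html.lower()
    if 'name="robots"' in lowered and "noindex" in lowered:
        return "meta-tag"
    return None
```

You could feed this the headers and body from any HTTP client (e.g. urllib) for each sitemap URL; if it reports "http-header" on pages whose source looks fine, the directive is coming from server configuration rather than the templates.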
Search Console Indexed Page Count vs Site:Search Operator page count
We launched a new site and Google Search Console is showing that 39 pages have been indexed. When I perform a site:myurl.com search, I see over 100 pages that appear to be indexed. Which is correct, and why is there a discrepancy? Also, the Search Console indexed-page count started at 39 pages on 5/21 and has not increased, even though we have hundreds of pages to index. But I do see more results each week from site:psglearning.com. My site is https://www.psglearning.com
Technical SEO | | pdowling0 -
How can I stop a tracking link from being indexed while still passing link equity?
I have a marketing campaign landing page and it uses a tracking URL to track clicks. The tracking links look something like this: http://this-is-the-origin-url.com/clkn/http/destination-url.com/ The problem is that Google is indexing these links as pages in the SERPs. Of course, when they get indexed and then clicked, they show a 400 error, because the /clkn/ link doesn't represent an actual page with content on it. The tracking link is set up to instantly 301 redirect to http://destination-url.com. Right now my dev team has blocked these links from crawlers by adding Disallow: /clkn/ in the robots.txt file; however, this blocks the flow of link equity to the destination page. How can I stop these links from being indexed without blocking the flow of link equity to the destination URL?
Technical SEO | | UnbounceVan0 -
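A note on the situation above: because robots.txt blocks crawling, Google can still index the /clkn/ URLs from links alone, but it never gets to see the 301 that would consolidate signals to the destination. One commonly suggested approach is to lift the Disallow and let the redirect do the work. A hypothetical sketch of the path-to-destination mapping such a redirect would implement (the /clkn/<scheme>/<host> format is assumed from the example URL in the question, not confirmed):

```python
from urllib.parse import urlparse

def tracking_target(tracking_url):
    """Map a /clkn/<scheme>/<host+path> tracking URL to its 301 destination."""
    path = urlparse(tracking_url).path      # e.g. /clkn/http/destination-url.com/
    parts = path.strip("/").split("/", 2)   # ["clkn", "http", "destination-url.com"]
    if len(parts) == 3 and parts[0] == "clkn":
        return f"{parts[1]}://{parts[2]}/"
    return None                             # not a tracking URL
```

With the Disallow removed, crawlers following a tracking link would receive the 301 to this destination, and the indexed /clkn/ URLs should drop out over time as the redirects are recrawled.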
All META descriptions gone
Hi there, For almost a week now, all of my optimized META descriptions have been gone from Google. For the last few years Google has always shown the optimized META descriptions. My website is an ecommerce site (phone accessories); every page has its own unique content (URL, text, title, description) and scores well in Google. The META descriptions are created using a template like this:

At [brandname] you find lots of [variable category product] * USP 1 * USP 2 * USP 3

All META descriptions differ from each other only by the variable category product. Something tells me this is an effect of the Panda 4.0 update. I tested a category page by replacing its META description with a 100% unique one, then asked Google (via Webmaster Tools) to reindex the page. Today the new description got indexed, which means uniqueness matters. My question is: how do I get the optimized META descriptions back? Creating truly unique descriptions (i.e. not using a template) for every page is very hard for a webshop, since all category pages have the same message to tell (the only difference is the type of product), I want to use USPs, and the META descriptions of all product pages have been lost too (over 15,000 different products). Please help! Thanks in advance, Marcel
Technical SEO | | MarcelMoz0 -
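One hedged idea for the templating problem described above: keep templates, but rotate between several sentence patterns so category pages differ by more than the single substituted product name. A hypothetical Python sketch (the brand, categories, and USPs are placeholders, not from the original shop):

```python
import zlib

# Several distinct phrasings; each category is assigned one
# deterministically (via CRC32), so its description stays stable
# across rebuilds instead of changing on every generation.
PATTERNS = [
    "At {brand} you find a wide range of {category}. {usps}",
    "Looking for {category}? {brand} has you covered. {usps}",
    "{brand} is your shop for {category}. {usps}",
]

def meta_description(brand, category, usps):
    index = zlib.crc32(category.encode("utf-8")) % len(PATTERNS)
    return PATTERNS[index].format(brand=brand, category=category,
                                  usps=" ".join(usps))
```

This is only a sketch of the idea, not a guarantee Google will re-show the snippets; varying the surrounding copy simply reduces how template-like the set of descriptions looks.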
Updating/changing title tags & meta descriptions
Hi there, Can altering title tags and meta descriptions too often have a negative impact on page ranking? Thank you!
Technical SEO | | ZAG0 -
Self-referencing links
I personally think that self-referencing links are silly. It's blatantly easy for Google to detect them, and my instinct says that the link juice would simply evaporate rather than pass back to the page itself. Does anyone have information backing me up from an authoritative source? I can't find anything on this from Matt Cutts, Rand, or any of the others I look up to.
Technical SEO | | IPROdigital0 -
I want my Meta Description re-indexed fast!
We have an old meta description that advertises an old offer (FREE X if you buy Y) that we are no longer running on the site. I have changed the meta description; now, what is the fastest way to get Google to update their SERP with the new description?
Technical SEO | | pbhatt0 -
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). When we drilled down, we noticed these come from disabled sub-domains like m.jump.co.za. In the past we used to redirect all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains. In December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain. For example, http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/. The weird part is that the number of external links kept on growing and is now sitting at a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generated 404 errors. We added all the directories to robots.txt and also manually removed all the directories from GWMT. Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:

11-Apr-11: 543 747 534
12-Apr-11: 554 066 716
13-Apr-11: 554 066 716
14-Apr-11: 554 066 716
15-Apr-11: 521 528 014
16-Apr-11: 515 098 895
17-Apr-11: 515 098 895
18-Apr-11: 515 098 895
19-Apr-11: 520 404 181
20-Apr-11: 520 404 181
21-Apr-11: 520 404 181
26-Apr-11: 520 404 181
27-Apr-11: 520 404 181
28-Apr-11: 603 404 378

I am now thinking of cleaning up robots.txt and re-including all the excluded directories in GWMT, to see if Google will be able to get rid of all these links. What do you think is the best solution for getting rid of all these invalid pages?
Technical SEO | | JacoRoux0