My sitelinks have gone from mega sitelinks to several small links under my SERP result in Google. Any ideas why?
-
A site I manage used to have the mega sitelinks on its SERP result. Recently Google replaced them with the smaller four inline links under my listing. Any idea what happened, or how I can correct this?
-
Have you changed your homepage drastically lately? When you look at the text version of your Google cache, there are only a couple of links in there. Did other links exist previously? Perhaps Google is taking them out because they no longer exist on your homepage?
-
No. We have been consolidating our international sites, condensing them by language rather than geo location. I was worried that this would become an issue.
-
I'm betting you got hit by Google's Freshness update. Are those minor sitelinks new pages? Do they get updated with content more frequently than your major ones?
Related Questions
-
Having a problem with multiple ccTLD sites: SERPs showing different sites in different regions
Hi everyone, We have more than 20 websites for different regions, and all the sites have their own ccTLD. The thing is, we are having a conflict in the SERPs for our English sites, and almost all the English sites share the same content; I would say 70% of the content is duplicated. Despite having proper hreflang, I see .co.uk results in Google US, and not only .co.uk but also other sites showing up (xyz.in, xyz.ie, xyz.com.au). The tags I'm using are below. If the site is for the US, I'm using these canonical and hreflang tags:
<link rel="canonical" href="https://www.xyz.us/" />
<link rel="alternate" hreflang="en-us" href="https://www.xyz.us/" />
and for the UK site:
<link rel="canonical" href="https://www.xyz.co.uk/" />
<link rel="alternate" hreflang="en-gb" href="https://www.xyz.co.uk/" />
I know we have ccTLDs so we don't have to use hreflang, but since we have duplicate content, we added hreflang just to be safe, and from what I have heard/read there is no harm in having hreflang (if implemented properly). Am I doing something wrong here? Or is it conflicting because of canonicals on the same content in different regions, so that we are confusing Google (and Google is showing the most authoritative and relevant result)? Really need help with this. Thanks,
Intermediate & Advanced SEO | | shahryar890 -
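As an aside for readers implementing this: a valid hreflang cluster must be reciprocal, with every regional page listing every variant including itself. Here is a minimal Python sketch (using the hypothetical xyz domains from the question above) that generates the full tag set one regional page should carry:

```python
# Map each regional site to its hreflang code.
# These domains are the hypothetical ones from the question, not real sites.
SITES = {
    "en-us": "https://www.xyz.us/",
    "en-gb": "https://www.xyz.co.uk/",
    "en-in": "https://www.xyz.in/",
    "en-ie": "https://www.xyz.ie/",
    "en-au": "https://www.xyz.com.au/",
}

def hreflang_tags(own_url: str) -> list[str]:
    """Return the link tags one regional page should carry:
    a self-referencing canonical plus one hreflang entry per variant
    (including itself) -- the same cluster must appear on every site."""
    tags = [f'<link rel="canonical" href="{own_url}" />']
    for code, url in SITES.items():
        tags.append(f'<link rel="alternate" hreflang="{code}" href="{url}" />')
    return tags

for tag in hreflang_tags(SITES["en-us"]):
    print(tag)
```

If the generated set is not mirrored exactly on the .co.uk, .in, .ie and .com.au pages, Google may ignore the annotations entirely, which is one common cause of cross-region bleed like this.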
Google Indexed Site A's Content On Site B, Site C etc
Hi All, I have an issue where the content (pages and images) of Site A (www.ericreynolds.photography) is showing up in Google under different domains: Site B (www.fastphonerepair.com), Site C (www.quarryhillvet.com) and Site D (www.spacasey.com). I believe this happened because I installed an SSL cert on Site A but didn't have the default SSL domain set on the server, so you could access any page from Site A under Site B's domain and it would load properly. I have since fixed that SSL issue and am now doing a 301 redirect from Sites B, C and D to Site A for anything on https, since Sites B, C and D are not using an SSL cert. My question is: how can I trigger Google to re-index all of the sites and remove the wrong listings from the index? I have a screenshot attached so you can see the issue more clearly. I have resubmitted my sitemap, but I'm not seeing much of a change in the index for my site. Any help on what I could do would be great. Thanks
Intermediate & Advanced SEO | | cwscontent
Eric -
Change Google's version of Canonical link
Hi, My website has millions of URLs, and some of the URLs have duplicate versions. We did not set canonicals all these years. Now we want to implement them and fix all the technical SEO issues. I want to consolidate and redirect all the variations of a URL to the highest-pageview version and use that as the canonical, because all of these variations have the same content. While doing this, I found in Google Search Console that Google has already selected another variation of the URL as canonical, and not the highest-pageview version. My questions:
1. I have millions of URLs for which I have to do 301s and set canonicals. How can I find all the canonical URLs that Google has auto-selected? Search Console has a daily quota of 100 or so.
2. Is it possible to override Google's version of the canonical? Meaning, if I set a variation as canonical and it is different from what Google has already selected, will it change over time in Search Console?
3. Should I just do a 301 to the highest-pageview variation of the URL and not set canonicals at all? This way the canonical that Google auto-selected would get redirected to the highest-pageview variation.
Any advice or help would be greatly appreciated.
Intermediate & Advanced SEO | | SDCMarketing0 -
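On question 3 above: since the variations are exact duplicates, a 301 to the highest-pageview version is the strongest consolidation signal, and Google's auto-selected canonical generally follows the redirect over time. A hedged Python sketch (the URL patterns and pageview counts are made up for illustration; real data would come from an analytics export) of building such a redirect map:

```python
# Hypothetical data: pageviews per URL variation, grouped by shared content.
duplicate_groups = [
    {"/product?id=42": 120, "/product/42": 9800, "/product/42?ref=nav": 310},
    {"/about": 4500, "/about/": 700},
]

def redirect_map(groups):
    """For each duplicate group, pick the highest-pageview URL as the
    canonical target and 301 every other variation to it."""
    rules = {}
    for group in groups:
        target = max(group, key=group.get)  # URL with the most pageviews
        for url in group:
            if url != target:
                rules[url] = target
    return rules

for src, dst in redirect_map(duplicate_groups).items():
    print(f"Redirect 301 {src} {dst}")  # Apache-style rule for illustration
```

Emitting a self-referencing canonical on each surviving target alongside the 301s is harmless and removes any remaining ambiguity.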
How did my dev site end up in the search results?
We use a subdomain for our dev site. I never thought anything of it, because the only way you can reach the dev site is through a VPN, yet Google has somehow indexed it. Any ideas on how that happened? I am adding the noindex tag; should I use a canonical as well? Or is there anything else you can think of?
Intermediate & Advanced SEO | | EcommerceSite0 -
Problem with description on Google search results.
A few months ago I changed the description of one of the pages on my site, and I noticed that Google does not display the entire description in its search results. The page's description is: "Get yourself a personalized name necklace, we offer a huge range of silver, gold and gold plated name necklaces." Google only shows this line: "Get yourself a personalized name necklace, we offer a huge ..." Does anyone have an idea why that is?
Intermediate & Advanced SEO | | Tiedemann_Anselm
Do search engines only count links that have google analytics?
I am reading a thread right now and I came across this statement: "Search engines can view clicks only if websites have Google Analytics or some toolbar installed. Obviously that's not the case with over 50% of websites. That's why I don't agree with your comment." True or false?
Intermediate & Advanced SEO | | SEODinosaur0 -
Working out exactly how Google is crawling my site if I have loooots of pages
I am trying to work out exactly how Google is crawling my site, including its entry points and its path from there. The site has millions of pages and hundreds of thousands indexed. I have simple log files with a timestamp and the URL Googlebot was on. Unfortunately there are hundreds of thousands of entries even for one day, and as it is a massive site I am finding it hard to work out the spider's paths. Is there any way, using the log files and Excel or other tools, to work this out simply? Also, I was expecting the bot to go through each level almost instantaneously, e.g. main page --> category page --> subcategory page (expecting the same timestamp), but this does not appear to be the case. Does the bot follow a path right through to the deepest level it can reach (or is allowed to) for that crawl, and then return to the higher-level category pages at a later time? Any help would be appreciated. Cheers
Intermediate & Advanced SEO | | soeren.hofmayer0 -
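One way to start on the question above without Excel: group Googlebot hits by URL depth and compare timestamps, which quickly shows whether the bot descends level by level or interleaves depths within one visit. A minimal Python sketch, assuming the simple "timestamp URL" log format described (the sample lines are invented for illustration):

```python
from collections import defaultdict

# Hypothetical simplified log lines, one Googlebot hit per line.
log_lines = [
    "2015-03-01T10:00:01 /",
    "2015-03-01T10:00:05 /category/shoes/",
    "2015-03-01T10:07:30 /category/shoes/trainers/",
    "2015-03-01T10:07:31 /",
    "2015-03-01T10:09:00 /category/bags/",
]

def crawl_by_depth(lines):
    """Group crawled URLs by path depth so the timestamps at each level
    can be compared side by side."""
    depth_hits = defaultdict(list)
    for line in lines:
        ts, url = line.split()
        # "/" is depth 0; "/category/shoes/" is depth 2, and so on.
        depth = url.strip("/").count("/") + (0 if url == "/" else 1)
        depth_hits[depth].append((ts, url))
    return dict(depth_hits)

for depth, hits in sorted(crawl_by_depth(log_lines).items()):
    print(depth, [url for _, url in hits])
```

On real logs the same grouping (run over one day at a time) makes it easy to see, for instance, that deep pages are revisited hours after their parent categories rather than in the same pass.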
Site Wide Internal Navigation links
Hello all, All our category pages (www.pitchcare.com/shop) are linked to from every product page via the sidebar navigation, which results in every category page having over 1,700 links with the same anchor text. I have noticed that the category pages don't appear to be ranked when they most definitely should be. For example, http://www.pitchcare.com/shop/moss-control/index.html is not ranked for the term "moss control"; instead another of our deeper pages is ranked on page 1. Reading a previous SEOmoz article, "Excessive Internal Anchor Text Linking / Manipulation Can Trip an Automated Penalty on Google": "I recently had my second run-in with a penalty at Google that appears to punish sites for excessive internal linking with 'optimized' (or 'keyword-stuffed anchor text') links. When the links were removed (in both cases, they were found in the footer of the website sitewide), the rankings were restored immediately following Google's next crawl, indicating a fully automated filter (rather than a manual penalty requiring a reconsideration request)." Do you think we may have triggered a penalty? If so, what would be the best way to tackle this? Could we add nofollows on the product pages? Cheers, Todd
Intermediate & Advanced SEO | | toddyC
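To quantify a problem like the one described above before changing anything, it helps to measure the sitewide anchor-text distribution from a crawl export. A small Python sketch with hypothetical link data mirroring the 1,700-link figure from the question:

```python
from collections import Counter

# Hypothetical internal link data: (anchor_text, target_url) pairs that a
# site crawl might produce; 1,700 identical sidebar links plus one other.
internal_links = [
    ("moss control", "/shop/moss-control/index.html"),
] * 1700 + [
    ("view product", "/shop/moss-control/product-1.html"),
]

def anchor_profile(links):
    """Count how often each (anchor, target) pair occurs sitewide.
    A single exact-match anchor repeated thousands of times is the
    pattern the quoted article associates with an automated filter."""
    return Counter(links)

profile = anchor_profile(internal_links)
top_pair, count = profile.most_common(1)[0]
print(top_pair, count)
```

If one exact-match pair dominates the profile like this, varying the sidebar anchors (or trimming the sidebar from product pages) is usually a safer first step than adding nofollow.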