Can anyone help me figure out these sitelinks?
-
My company is Squatty Potty (yes, of magic unicorn fame), and I recently redid our website's navigation. We're currently overhauling the site to rebuild the whole thing, but what's there now should give Google a good idea of the site hierarchy, I would think.
The funny thing is, when you Google [squatty potty website], we do have sitelinks, but when you Google just [squatty potty], we don't. Any ideas why sitelinks would appear on one search but not the other? I see they appear for [squatty potty logo] as well. I can't figure out how to get them to appear for my brand-name search; any help appreciated!
-
So those sitelinks you see for [squatty potty website] are what I want showing up for [squatty potty] as well. All I see there is a couple of text links, not the expanded full sitelinks.
-
Oh, I'm actually only looking at U.S. results of our .com. International sites are controlled by other distributors.
-
Actually, I get different results ...
-
For what it's worth, I'm seeing the opposite, see attached screenshots.
Searching from the UK.
Related Questions
-
Stuck on the 2nd page of Google! Help
I run a McAfee technical support website. It has been 2–3 months since I started practicing SEO on it. Things went smoothly until it landed on the second page of Google, but now its ranking is frozen. Can I get any advice or suggestions for my website to break out of the second-page cage? My website: **mcafee.com/activate**
Intermediate & Advanced SEO | six_figures
Can cross domain canonicals help with international SEO when using ccTLDs?
Hello. My question is: **Can cross-domain canonicals help with international SEO when using ccTLDs and a gTLD, when the gTLD is much more authoritative to begin with?** I appreciate this is a very nuanced subject, so below is a detailed explanation of my current approach, the problem, and the solutions I am considering testing. Thanks for taking the time to read this far!

**The current setup**

- Multiple ccTLDs, such as mysite.com (US), mysite.fr (FR), and mysite.de (DE).
- Each TLD can have multiple languages; indeed each site has content in English as well as the native language. So mysite.fr (defaults to French) and mysite.fr/en-fr serve the same page, in French and English respectively.
- Mysite.com is an older, more established domain with existing organic traffic.
- Each language variant of each domain has a sitemap that is individually submitted to Google Search Console and is linked from the <head> of each page. So: mysite.fr/a-propos (about us) links to mysite.com/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in French. Each of these URL blocks contains hreflang info for that content on every ccTLD in every language (en-us, en-fr, de-de, en-de, etc.).
- Likewise, mysite.fr/en-fr/about-us links to mysite.com/en-fr/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in English, each with the same hreflang info for every ccTLD in every language. There is more English content on the site as a whole, so the English version of the sitemap is always bigger at the moment.
- Every page on every site has two lists of links in the footer. The first lists every other ccTLD available, so a user can easily switch between, say, the French site and the German site. Where possible this links directly to the corresponding piece of content on the alternative ccTLD; where that isn't possible, it just links to the homepage. The second list is essentially links to the same piece of content in the other languages available on that domain.
- Mysite.com has its international targeting in Google Search Console set to the US.

**The problems**

The biggest problem is that we didn't properly consider how we would be starting from scratch with each new ccTLD, so although each domain has a reasonable amount of content, they receive only a tiny proportion of the traffic that mysite.com achieves. Presumably this is because of a standing start with regard to domain authority.

The second problem is that, despite hreflang, mysite.com still outranks the other ccTLDs for brand-name keywords. I guess this is understandable given the mismatch in DA. This is based on looking at search results via the Google AdWords Ad Preview tool and changing language, location, and domain.

**Solutions**

The first solution is probably the most obvious: move all the ccTLDs into a subfolder structure on mysite.com and 301 all the old ccTLD links. This isn't really ideal for a number of reasons, so I'm trying to explore some alternative routes that might help the situation.

The first thing that came to mind was cross-domain canonicals. Essentially this would mean creating locale-specific subfolders on mysite.com and duplicating the ccTLD sites in there, but using a cross-domain canonical to tell Google to index the ccTLD URL instead of the locale-subfolder URL. For example:

- mysite.com/fr-fr has a canonical of mysite.fr
- mysite.com/fr-fr/a-propos has a canonical of mysite.fr/a-propos

Then I would change the links in the mysite.com footer so that they pointed not at the ccTLD URL but at the sub-folder URL, so that Google would crawl the content on the stronger domain before indexing the ccTLD version of the URL. Is this worth exploring with a test, or am I mad for even considering it?

The alternative that came to my mind was to do essentially the same thing but use a 301 to redirect from mysite.com/fr-fr to mysite.fr.

My question is whether either of these suggestions might be worth testing, or am I completely barking up the wrong tree and liable to do more harm than good?
Intermediate & Advanced SEO | danatello
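For reference, a sitemap URL block with hreflang alternates of the kind described above generally looks like the output of the sketch below. This is a minimal sketch built with Python's standard library; the mysite.* URLs and locale codes are the placeholders from the question, not real pages.

```python
# Minimal sketch: build one sitemap <url> block with hreflang alternates,
# using only the standard library. URLs/locales are placeholders from the
# question (mysite.fr, mysite.com, mysite.de), not a real site.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML_NS = "http://www.w3.org/1999/xhtml"
ET.register_namespace("", SITEMAP_NS)        # default ns for sitemap elements
ET.register_namespace("xhtml", XHTML_NS)     # prefix for the alternate links

def url_block(loc, alternates):
    """Build a <url> element; `alternates` maps hreflang -> href and should
    include `loc` itself, since each page must list its own alternate too."""
    url = ET.Element(f"{{{SITEMAP_NS}}}url")
    ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = loc
    for hreflang, href in alternates.items():
        ET.SubElement(url, f"{{{XHTML_NS}}}link",
                      rel="alternate", hreflang=hreflang, href=href)
    return url

alternates = {
    "fr-fr": "https://mysite.fr/a-propos",
    "en-fr": "https://mysite.fr/en-fr/about-us",
    "en-us": "https://mysite.com/about-us",
    "de-de": "https://mysite.de/ueber-uns",
}
urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
urlset.append(url_block("https://mysite.fr/a-propos", alternates))
print(ET.tostring(urlset, encoding="unicode"))
```

Every page in the cluster should carry the same full set of alternates, which is why these blocks get large quickly when every ccTLD exists in every language.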
What can we do to optimize / be mobile-friendly for PDFs?
I'm getting a "Your page is not mobile-friendly." notice in the SERPs for all of our PDFs. I check the PDF on the phone and it appears just fine.
Intermediate & Advanced SEO | johnnybgunn
Silo not ranking for main silo page - what can I do?
Hi everyone, I set up a silo for my page http://werkzeug-kasten.com/. Unfortunately, only the silo's inner pages rank very well. These are, for example, http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/keyword-analyse/ for "Keywordanalyse SEO Freiburg" and http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/onpage-seo/ for "Onpage SEO Freiburg" — but the silo's main page http://werkzeug-kasten.com/suchmaschinenoptimierung-seo-freiburg/ does not rank for "SEO Freiburg". Do you have any idea why that might be? Cheers, Marc
Intermediate & Advanced SEO | RWW
Can 410 links trigger a penalty?
Hi! This is a follow-on question from my other post: http://moz.com/community/q/site-dropped-after-recovery. As mentioned there, I've had a manual penalty revoked for http://www.newyoubootcamp.com/. This came after the forum was hacked and some poor-quality SEO was done. We've managed to clean up a large number of links, but ones such as http://about1.typepad.com/blog/2014/04/tweetdeck-to-launch-as-html5-web-app-now-accepting-beta-testers.html (anchor is "microsoft") are still being found and indexed. My question is: although the forum is now 410'd, can these junk links still be causing any harm? A huge number have been disavowed, and many others taken down after a manual outreach campaign, but still others keep appearing. The site is performing poorly in search despite having a much better domain authority than its competitors — driven largely by great links from national newspapers — as well as solid user metrics such as a 30% bounce rate and few on-site issues. This makes me think it must be the link profile. Any advice would be much appreciated. S
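As a side note on the disavow step mentioned above: the disavow file format is one directive per line — either a bare URL (disavows just that page) or a `domain:` prefix (disavows every URL on that host), with `#` comments allowed. A minimal sketch of assembling such a file; the two example hosts are hypothetical placeholders, and only the typepad URL comes from the question.

```python
# Minimal sketch: assemble a Google disavow file. One directive per line;
# "domain:" disavows a whole host, a bare URL disavows a single page.
# The two .example hosts below are hypothetical placeholders.

def build_disavow(domains, urls, comment="cleanup after forum hack"):
    lines = [f"# {comment}"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    lines += sorted(set(urls))
    return "\n".join(lines) + "\n"

text = build_disavow(
    domains=["spam-directory.example", "hacked-blog.example"],
    urls=["http://about1.typepad.com/blog/2014/04/tweetdeck-to-launch-as-html5-web-app-now-accepting-beta-testers.html"],
)
print(text)
```

Disavowing at the `domain:` level is usually safer for hacked-forum cleanup than listing individual URLs, since new junk pages on the same host keep being discovered.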
Intermediate & Advanced SEO | Blink-SEO
Can Google Read Text in Carousel
So, what is the best practice for getting Google to read text that populates via jQuery in a carousel? If the text is originally display:none, is Google going to be able to crawl it? Are there any limits to what Google can crawl when it comes to JavaScript and text? Or is it always better just to hardcode the text in the page source?
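The usual rule of thumb is that text present in the initial HTML response is crawlable even when hidden with CSS (though Google has indicated hidden text may carry less weight), whereas text that only exists after jQuery injects it is seen only if the page is successfully rendered. A minimal sketch of that distinction, using Python's stdlib HTML parser as a stand-in for a non-rendering crawler; the carousel markup is hypothetical.

```python
# A non-rendering crawler sees text that is in the HTML source even when a
# slide is hidden with display:none -- but it never sees text that would only
# exist after a script runs. Markup below is hypothetical example HTML.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible-in-source text, skipping <script> bodies
    (a non-rendering crawler does not execute them)."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True
    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False
    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.chunks.append(data.strip())

html_doc = """
<div class="carousel">
  <div class="slide">Visible slide text</div>
  <div class="slide" style="display:none">Hidden but in the source</div>
</div>
<script>$('.carousel').append('<div class="slide">Injected by jQuery</div>');</script>
"""
parser = TextExtractor()
parser.feed(html_doc)
print(parser.chunks)  # the hidden slide's text is found; the injected text is not
```

So hardcoding the slide text in the source (and letting the script only handle the animation) is the safe option if the carousel copy matters for ranking.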
Intermediate & Advanced SEO | imageworks-261290
Why would sitelinks disappear
For the past month or so my site has received sitelinks from Google, and they have been really helping out. All of a sudden this morning they are gone. We 301 redirected another one of our sites (WBT) to this one on Monday. Could it be that Google is just trying to figure out what we are doing? Has anyone else seen sitelinks come and go on their site without any redirects involved? I am assuming it is because of my sites combining, but I wanted to hear if anyone else has seen sitelinks come and go on their own. Thanks
Intermediate & Advanced SEO | GeorgeLaRochelle
Corporate pages and SEO help
We own and operate more than two dozen education-related sites. The business team is attempting to standardize parts of our site hierarchy so that our sitemap.php, about.php, privacy.php, and contact.php are all at the root directory. Our sitemap.php is generated by our sitemap.xml files, which are generated from our URLlist.txt files. I need to provide some feedback on this initiative.

I'm worried about adding more stand-alone pages to our root directory, and as part of a separate optimization in the future I was planning to suggest we group the "privacy", "about", and "contact" pages in a separate folder. We generally try to put our most important pages/directories for SEO in the root, as our homepages pass a lot of link juice and have high authority. We do not invest SEO time into optimizing these pages, as they're not pages we're trying to rank for, and I've already been looking into no-following all links to them from our footer, sitemap, etc.

I know that adding these "corporate" pages to a site is usually a standard part of the design process, but is there any SEO benefit to having them at the root? And along the same lines, is there any SEO harm in having unimportant pages at the root? What do you guys think out there in Moz land?
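On the no-following idea: `rel="nofollow"` on the footer links is how that is usually marked up. Note that nofollow is only a hint about link equity; if the goal is keeping these pages out of the index entirely, a robots meta noindex on the pages themselves is the stronger tool. A minimal sketch of generating such a footer; the page list and helper name are hypothetical, not the poster's actual code.

```python
# Minimal sketch: render footer links to "corporate" pages with
# rel="nofollow". Page list and helper are hypothetical illustrations.
from html import escape

CORPORATE_PAGES = [
    ("/about.php", "About"),
    ("/privacy.php", "Privacy"),
    ("/contact.php", "Contact"),
    ("/sitemap.php", "Sitemap"),
]

def footer_links(pages):
    # escape() guards against stray markup in URLs or labels
    items = (
        f'<a href="{escape(url)}" rel="nofollow">{escape(label)}</a>'
        for url, label in pages
    )
    return "<footer>" + " | ".join(items) + "</footer>"

print(footer_links(CORPORATE_PAGES))
```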
Intermediate & Advanced SEO | Eric_edvisors