How is this site doing this?
-
It shows a splash / promotion page, yet when you check Google's cache it's the real homepage. They're presumably doing this so they don't lose rankings, but how are they showing users the splash page while Google caches the real homepage?
Is this approach search-engine friendly?
thanks!!
-
It's embedded CSS, correct.
Thanks!
-
View the source code while on the splash page and you will see that their entire site is there; this is what a search bot is seeing. View it in a browser and that triggers a splash-screen overlay. Really, the full website is there, you just can't see it.
Perfectly safe, perfectly legit.
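To picture how that works, here is a minimal sketch of that kind of overlay (the element IDs, copy, and styling are hypothetical, not taken from the site in question): the real homepage stays in the markup, and a fixed-position layer simply covers it in the browser.

    <!-- Real homepage content stays in the HTML for crawlers -->
    <div id="content">
      ... full homepage text and links here ...
    </div>

    <!-- Full-screen promo layer that visitors actually see -->
    <div id="splash-overlay">
      <a href="/promo">See our current promotion</a>
    </div>

    <style>
      #splash-overlay {
        position: fixed;   /* covers the viewport regardless of scroll */
        top: 0;
        left: 0;
        width: 100%;
        height: 100%;
        background: #fff;
        z-index: 9999;     /* sits above the normal page content */
      }
    </style>

A crawler parsing the HTML still finds all of the homepage text and links; only a rendering browser visually hides them behind the overlay.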
-
Hi,
I think the web server checks the user agent of the requesting client: if it's a web crawler like Googlebot, it shows the normal home page; if it's a normal browser user agent, like Firefox, it shows the splash screen.
In theory this should be OK for SEO, as Googlebot only sees the normal home page. But in practice I think it's an accident waiting to happen. If Google changed the user agent string of their crawler bots, the bots could be shown the splash screen instead. There may also be other methods the search engines use to check for things like this.
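For illustration only, a user-agent check like the one described might look something like the .htaccess sketch below (file names are hypothetical, and serving crawlers different content from users is essentially cloaking, which is exactly why it's an accident waiting to happen):

    # .htaccess at the document root (requires mod_rewrite)
    RewriteEngine On
    # Anything identifying itself as Googlebot gets the real homepage
    RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
    RewriteRule ^$ /index.html [L]
    # Everyone else gets the splash page
    RewriteRule ^$ /splash.html [L]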
Related Questions
-
Blocking subdomains without blocking sites...
So let's say I am working for bloggingplatform.com, and people can create free sites through my tools, and those sites show up as myblog.bloggingplatform.com. However, that site can also be accessed from myblog.com. Is there a way, separate from editing the myblog.com site code or files, for me to tell Google to stop indexing myblog.bloggingplatform.com while still letting them index myblog.com, without inserting any code into the page load? This is a simplification of a problem I am running across. Basically, Google is associating subdomains with my domain that it shouldn't even index, and it is adversely affecting my main domain. Other than contacting the offending sub-domain holders (which we do), I am looking for a way to stop Google from indexing those domains at all (they are used for technical purposes, and not for users to find the sites). Thoughts?
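One approach that gets suggested for situations like this (a sketch only, assuming you control the web server that answers for the *.bloggingplatform.com names and that the custom myblog.com domains are served by a separate vhost) is to send a noindex directive as an HTTP response header, so nothing has to be inserted into the pages themselves:

    # Hypothetical Apache vhost that answers only for *.bloggingplatform.com blogs,
    # kept separate from the vhost serving bloggingplatform.com itself
    <VirtualHost *:80>
        ServerName blogs.bloggingplatform.com
        ServerAlias *.bloggingplatform.com
        DocumentRoot /var/www/blogs
        # Requires mod_headers; tells crawlers not to index anything served here
        Header set X-Robots-Tag "noindex"
    </VirtualHost>

Google honors X-Robots-Tag the same way it honors a meta robots tag, and the header is only seen on the hostname it was served from, so the myblog.com version would stay indexable.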
Technical SEO | SL_SEM1
-
Multilingual blogs and site structure
Hi everyone, I have a question about multilingual blogs and site structure. Right now, we have the typical subfolder localization structure, e.g. domain.com/page (English site) and domain.com/ja/page (Japanese site). However, the blog is slightly more complicated. We'd like to have English posts available in other languages (as many of our users are bilingual). The current structure suggests we use a typical domain.com/blog or domain.com/ja/blog format, but we have issues if a Japanese (logged-in) user wants to view an English page: domain.com/blog/article would redirect them to domain.com/ja/blog/article, thus 404-ing the user if the post doesn't exist in the alternate language. One suggestion (that I have seen on sites such as Etsy/Spotify) is to add an /en/ subfolder to the blog area, e.g. domain.com/en/blog and domain.com/ja/blog. Would this be the correct way to avoid this issue? I know we could technically work around the 404 issue, but I don't want to create duplicate posts in /ja/ that are in English or vice versa. Would it affect the rest of the site if we use an /en/ subfolder just for the blog? Another option is to use domain.com/blog/en and domain.com/blog/ja, but I'm not sure if this alternative is better. Any help would be appreciated!
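Whichever folder scheme is chosen, the usual companion to it is hreflang annotations so Google treats the English and Japanese versions of a post as language alternates rather than duplicates. A sketch with hypothetical URLs:

    <link rel="alternate" hreflang="en" href="https://domain.com/en/blog/article" />
    <link rel="alternate" hreflang="ja" href="https://domain.com/ja/blog/article" />
    <link rel="alternate" hreflang="x-default" href="https://domain.com/en/blog/article" />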
Technical SEO | Seiyav0
-
Site Map Problems or Are They?
According to Webmaster Tools, my sitemap contains URLs which are blocked by robots.txt. Our sitemap is generically generated and encompasses all web pages, whether or not I have excluded them using the robots.txt file. As far as I am aware this has never been an issue until recently. Is this hurting my rankings, and how do I fix it? Secondly, Webmaster Tools says there are over 5,000 errors/warnings on my sitemap, but the sitemap only has around 1,400 pages submitted. How do I see what is going on?
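That warning usually just means the two files contradict each other: the sitemap says "index this" while robots.txt says "don't crawl this". A hypothetical example of the mismatch (paths invented for illustration):

    # robots.txt
    User-agent: *
    Disallow: /private/

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Listing a disallowed URL in the sitemap triggers the warning -->
      <url>
        <loc>https://example.com/private/page.html</loc>
      </url>
    </urlset>

The usual fix is to make the sitemap generator skip anything robots.txt disallows, so the two files agree.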
Technical SEO | Professor0
-
Site links show spam
Hi folks, I'm working on a website that runs on WordPress and was not kept updated by the owner. This resulted in a malware injection, and now when you search the company's name in Google, the sitelinks appear with words like Viagra, et al. I've seen this a number of times, so I went through the code and have removed all the malware. I presume I now have to wait for Google to recrawl the website and update the sitelinks? Is there anything else I should be doing to speed up the process? Thank you 🙂
Technical SEO | ChristopherM0
-
Should canonical be used if your site does not have any duplicate content?
Should canonical tags be used site-wide even if my site is solid and no duplicate content is generated? Please explain your answer.
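For reference, a site-wide canonical normally just means each page carries a self-referencing tag in its head, along these lines (URL hypothetical):

    <!-- On https://example.com/some-page/ -->
    <link rel="canonical" href="https://example.com/some-page/" />

It adds nothing when there genuinely are no duplicates, but it protects against accidental ones, such as tracking parameters or www/non-www variants resolving to the same content.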
Technical SEO | ciznerguy0
-
Replacing a site map
We are in the process of changing our folder/URL structure. Currently we have about 5 sitemaps submitted to Google. What is the best way to deal with these sitemaps in terms of (a) replacing the old URLs with the new ones in the sitemap, and (b) what effect would it have if we removed the sitemap submission from the Google Webmaster Tools console? Basically, we have in the region of 20,000 URLs to redirect to the new format and to update in the sitemap.
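As a sketch of the redirect side (folder names invented for illustration), pattern-based 301s keep 20,000 URLs manageable in something like Apache, and the regenerated sitemaps should then list only the new URLs:

    # .htaccess: permanently redirect the old folder structure to the new one
    RewriteEngine On
    RewriteRule ^old-folder/(.*)$ /new-folder/$1 [R=301,L]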
Technical SEO | NeilTompkins0
-
How to set up a Tumblr blog at blog.site.com to give juice to site.com
Is it possible to get a subdomain blog.site.com that is hosted on Tumblr to count toward site.com? I hoped I could point it in Webmaster Tools like we do with www, but alas, no. Any help would be greatly appreciated.
Technical SEO | oznappies0