Parallax, SEO, and Duplicate Content
-
We are working on a project that uses parallax scrolling to provide a great experience to the end user, and we are also trying to create the best-case scenario for SEO. We have multiple keywords we are trying to optimize for.
We have multiple pages with the parallax function built in. Basically, each member of the primary navigation is its own page, with all subpages built below it using the parallax function.
Our navigation currently uses the hashbang method to provide custom URLs for each subpage, and the user is directed to the appropriate section based on that hashbang.
www.example.com/about < This is its own page
www.example.com/about/#/history < This is a subpage that you scroll to on the About page
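For context, a minimal sketch of the hashbang navigation described above might look like this (this is an illustration, not our actual code; the section ids are assumptions):

```javascript
// Map a hashbang fragment like "#/history" or "#!/history" to the id of the
// parallax section it names, so the page can scroll there on navigation.
function hashbangToSection(hash) {
  // "#/history" or "#!/history" -> "history" (any query string is dropped)
  return hash.replace(/^#!?\//, '').split('?')[0];
}

// Browser-side wiring would look roughly like:
// window.addEventListener('hashchange', function () {
//   var section = document.getElementById(hashbangToSection(location.hash));
//   if (section) section.scrollIntoView({ behavior: 'smooth' });
// });
```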
We are trying to decide on the best method for optimizing each subpage, but my current concern is this: because each subpage is really a part of the primary page, will all those URLs be seen as duplicate content?
Currently the site can also serve each subpage as its own standalone page, without the parallax function. Should I include those in the sitemap? There's no way to navigate to them unless I include them in the sitemap, and I don't want Google to think I'm being disingenuous by providing links that don't exist in the navigation solely for the purpose of SEO, but truthfully all of the content exists and is available to the user.
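If the standalone versions do go in the sitemap, generating the entries is straightforward. Here is a minimal sketch (the URLs are illustrative, not the project's real ones):

```javascript
// Build a sitemap listing the standalone (non-parallax) version of each
// subpage, since nothing in the navigation links to them directly.
function buildSitemap(urls) {
  var entries = urls.map(function (u) {
    return '  <url><loc>' + u + '</loc></url>';
  }).join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    entries + '\n' +
    '</urlset>';
}

var xml = buildSitemap([
  'https://www.example.com/about',
  'https://www.example.com/about/history'
]);
```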
I know that a lot of people are asking these questions, and there really are no right answers yet, but I'm curious about everyone else's experience so far.
-
Hi Paul,
I totally agree with you. Development is outgrowing crawlers, but then again this has always been true. Designing and programming for crawlers is something SEOs do, but not programmers. The thing is that clients want traffic, conversions, and cool technology. However, if you only do cool technology without accomplishing business objectives, the project will not be considered successful in the client's eyes... just my 2 cents.
Regarding doing SEO with parallax scrolling, I think these two sites accomplish it nicely. I have not found any others. Both are responsive, which is also a must.
Kickpoint.ca accomplishes telling a story through its graphics, and the site is light and versatile. However, its on-site SEO could be improved with a little effort.
Posicionamiento Web accomplishes great on-site SEO but poor "storytelling parallax scrolling" effects. The site is heavy and not as versatile as Kickpoint's.
One option is to do parallax scrolling only on the home page and use regular internal pages. This keeps the site light.
Good Luck.
Carla
-
Well, that's no good. I guess we have to be really careful. Unfortunately, this is one area where I think development has outgrown crawlers' ability to determine what is happening. We aren't trying to do anything malicious; we just want to create a good, engaging site. I'll keep trying and will continue to post if I have more luck with our method.
-
Hi Paul,
I decided to run flowerbeauty.com through Moz's software, and here are the results: it does flag parts of the site as duplicate content. See http://imgur.com/YEb6bmZ
Flowerbeauty is not SEO-friendly.
-
Hi Paul,
Before you design your website, you might try creating a campaign for flowerbeauty.com to see if Moz reports any on-site errors. If it passes Moz's scrutiny, then there is a good chance Google will see it the same way. I have a slot for a new campaign in case you don't have one. I would really like to get to the bottom of this, as you can see.
Let me know if you want me to run it through Moz to see if it passes on-site optimization. I forgot to mention that my site did pass Moz's on-site analysis. I never tried running the other two SEO parallax scrolling websites through Moz's software.
-
Thanks, Carla, for the reference. So if I'm understanding the way this example site works correctly: you're pulling in the content on each "subpage" dynamically so that Google will never see the content twice when visiting the page?
I like the way this works, although I wish you didn't have to see a loading graphic when the page pulls in the content.
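For what it's worth, the pattern being described can be sketched like this, assuming (this is a guess at the approach, not flowerbeauty.com's actual code) that each subpage's markup is fetchable at its own URL:

```javascript
// Each subpage lives at its own crawlable URL; the parallax shell fetches a
// fragment of it on demand, so the combined page never re-serves content
// that already has a standalone URL. The "?fragment=1" convention is an
// assumption for illustration.
function fragmentUrlFor(sectionPath) {
  // e.g. "/about/history" -> "/about/history?fragment=1"
  return sectionPath + '?fragment=1';
}

// Browser-side usage (sketch):
// fetch(fragmentUrlFor('/about/history'))
//   .then(function (res) { return res.text(); })
//   .then(function (html) {
//     document.getElementById('stage').innerHTML = html;
//   });
```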
I agree that this method looks good, and I would love to get rid of the hashbangs, but that's just part of what makes everything work in our current use case.
Certainly I think this site is a move in the right direction.
Thanks for the links and pointers
-
Hi Paul,
First of all, congrats on the great new technique. I recently wrote an article about SEO, parallax scrolling, and responsive websites. There is a website using the hashbang method that is semi SEO-friendly here: http://flowerbeauty.com/.
That being said, to answer your concern ("my current concern is that because each subpage is really a part of the primary page, will all those URLs be seen as duplicate content"): I believe Google will see this as a multi-page site as long as you have different content on different URLs. Why don't you try adding parallax scrolling to each SEO URL and have the scroll function take you to each URL, like www.flowerbeauty.com does? Make sure to optimize your URLs as well.
For example, as you scroll it would take you to:
www.example.com/about/optimized-url
I would get rid of the hashbangs.
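A hedged sketch of this hashbang-free approach, using the History API to swap in each section's real URL as the user scrolls (the section offsets and URLs are illustrative assumptions, not anyone's actual implementation):

```javascript
// Given sections sorted by their pixel offset from the top of the page,
// return the one the user has currently scrolled into.
function sectionForOffset(sections, scrollY) {
  // sections: [{ top: pixelOffset, url: '/about/history' }, ...]
  var current = sections[0];
  for (var i = 0; i < sections.length; i++) {
    if (scrollY >= sections[i].top) current = sections[i];
  }
  return current;
}

// Browser-side usage (sketch): update the address bar without reloading.
// window.addEventListener('scroll', function () {
//   var s = sectionForOffset(SECTIONS, window.scrollY);
//   if (window.location.pathname !== s.url) {
//     history.replaceState(null, '', s.url);
//   }
// });
```

With this, each section's URL is a real, crawlable page, and the parallax effect is layered on top rather than hidden behind a fragment.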
Here are some other SEO-friendly parallax scrolling websites: http://www.pinterest.com/ecumbre/seo-and-parallax-scrolling/
Let me know if that helps.
Thanks Carla