Parallax, SEO, and Duplicate Content
-
We are working on a project that uses parallax to provide a great experience for the end user, and we are also trying to create a best-case scenario for SEO. We have multiple keywords we are trying to optimize for.
We have multiple pages with the parallax function built into them. Basically, each member of the primary navigation is its own page, with all subpages built below it using the parallax function.
Our navigation currently uses the hashbang method to provide custom URLs for each subpage, and the user is directed to the right section based on that hashbang.
www.example.com/About < This is its own page
www.example.com/about/#/history < This is a subpage that you scroll to on the About page
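For context, the hashbang routing described above might be sketched like this (`parseHashbang` is an assumed helper name for illustration, not code from our actual site):

```javascript
// Hedged sketch of hashbang routing: extract the section name from a
// hashbang URL, then scroll to the matching element on the page.
function parseHashbang(url) {
  // "www.example.com/about/#/history" -> "history"
  const match = url.match(/#\/(.+)$/);
  return match ? match[1] : null;
}

// On page load, the browser-side router would scroll to that section, e.g.:
// const section = parseHashbang(window.location.href);
// if (section) document.getElementById(section).scrollIntoView();
```

The catch for SEO is that everything after the `#` is traditionally ignored by crawlers, which is exactly why the duplicate-content question below comes up.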
We are trying to decide on the best method for optimizing each subpage, but my current concern is this: because each subpage is really part of the primary page, will all those URLs be seen as duplicate content?
Currently the site can also serve each subpage as its own page, without the parallax function. Should I include those in the sitemap? There's no way to navigate to them unless I include them in the sitemap, and I don't want Google to think I'm being disingenuous by providing links that don't exist in the navigation solely for SEO purposes, but truthfully all of that content exists and is available to the user.
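For reference, listing those standalone subpage URLs in a sitemap would just enumerate them like any other page; a minimal example (the example.com paths are placeholders for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/about/</loc></url>
  <url><loc>http://www.example.com/about/history/</loc></url>
</urlset>
```

Sitemap URLs don't have to be reachable from the navigation; they only have to return real, crawlable content, which these standalone pages do.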
I know that a lot of people are asking these questions, and there really are no right answers yet, but I'm curious about everyone else's experience so far.
-
Hi Paul,
I totally agree with you. Development is outgrowing crawlers, but then again this has always been true. Designing and programming for crawlers is something SEOs do, but not programmers. The thing is that clients want traffic, conversions, and cool technology. However, if you only do cool technology without accomplishing business objectives, the project will not be considered successful in the client's eyes... just my 2 cents.
Regarding doing SEO and Parallax Scrolling, I think these two sites accomplished it nicely. I have not found any others. Both are responsive which is also a must.
Kickpoint.ca accomplished telling a story through its graphics and the site is light and versatile. However its onsite SEO could be improved with a little effort.
Posicionamiento Web accomplished great onsite SEO but poor "story telling parallax scrolling" effects. The site is heavy and not as versatile as kickpoint's.
One option is to do parallax scrolling on the home and regular internal pages. This makes the site light.
Good Luck.
Carla
-
Well, that's no good. I guess we have to be really careful. Unfortunately, this is one area where I think development has outgrown crawlers' ability to determine what is happening. We aren't trying to do anything malicious; we just want to create a good, engaging site. I'll keep trying and continue to post if I have more luck with our method.
-
Hi Paul,
I decided to run flowerbeauty.com through Moz's software, and here are the results. It does pick up stuff as duplicate content. See http://imgur.com/YEb6bmZ
Flowerbeauty is not SEO friendly.
-
Hi Paul,
I think before you design your website, you might try creating a campaign for flowerbeauty.com to see if Moz says there are any onsite errors. If it passes Moz's scrutiny, then there is a good chance Google will see it the same way. I have a slot for a new campaign in case you don't have one. I would really like to get to the bottom of this as you can see.
Let me know if you want me to run it through Moz to see if it passes onsite optimization. I forgot to mention that my site did pass Moz's onsite analysis. I never tried running the other two SEO parallax scrolling websites through Moz's software.
-
Thanks, Carla, for the reference. So, if I'm understanding the way this example site works: you're pulling in the content on each "subpage" dynamically, so that Google will never see the content twice when visiting the page?
I like the way this works, although I wish you didn't have to see a loading graphic when the page pulls in the content.
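A minimal sketch of that dynamic-loading idea, as I understand it (`fragmentUrl` and the `/fragments/` route are assumptions for illustration, not flowerbeauty's real code):

```javascript
// Hedged sketch: fetch only the subpage's markup when the user navigates,
// so each piece of content is served at exactly one URL.
function fragmentUrl(path) {
  // "/about/history" -> "/fragments/about/history.html"
  return `/fragments${path}.html`; // assumed server route returning just that section's HTML
}

// Browser wiring might look like this (DOM/fetch parts left as comments):
// async function loadSection(path) {
//   const res = await fetch(fragmentUrl(path));
//   document.querySelector('#stage').innerHTML = await res.text();
//   history.pushState(null, '', path); // a clean URL instead of a hashbang
// }
```

The loading graphic appears because the fetch happens after the click; preloading adjacent sections would be one way to hide it.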
I agree that this method looks good, and I would love to get rid of the hashbangs, but that's just part of what makes everything work in our current use case.
Certainly I think this site is a move in the right direction.
Thanks for the links and pointers
-
Hi Paul,
First of all, congrats on the great new technique. I recently wrote an article about SEO, parallax scrolling, and responsive websites. There is a website using the hashtag method that is semi SEO friendly here: http://flowerbeauty.com/. That being said,
to answer your concern ("my current concern is that because each subpage is really a part of the primary page, will all those URLs be seen as duplicate content"): I believe Google will see this as a multiple-page site as long as you have different content on different URLs. Why don't you try adding parallax scrolling to each SEO URL and have the scroll function take you to each URL, like www.flowerbeauty.com did? Make sure to optimize your URLs as well.
For example
as you scroll, it would take you to
www.example.com/About/optimized-URL
I would get rid of the hashtags.
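One way to sketch the "scroll takes you to an optimized URL" idea (`sectionPath` is an assumed helper name; a real site would define its own slugs):

```javascript
// Hypothetical sketch: build a keyword-rich path for a section of a parent page.
function sectionPath(parentPage, sectionTitle) {
  const slug = sectionTitle
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')  // collapse spaces/punctuation into hyphens
    .replace(/^-+|-+$/g, '');     // trim stray leading/trailing hyphens
  return `/${parentPage}/${slug}`;
}

// As a section scrolls into view, you would update the address bar without a hashbang:
// history.replaceState(null, '', sectionPath('about', 'Our History'));
```

Updating the address with `history.replaceState` as the user scrolls gives each section a clean, shareable URL while the page itself never reloads.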
Here are some other SEO friendly parallax scrolling websites. http://www.pinterest.com/ecumbre/seo-and-parallax-scrolling/
Let me know if that helps.
Thanks Carla