Parallax, SEO, and Duplicate Content
-
We are working on a project that uses parallax scrolling to provide a great experience for the end user, while also trying to create a best-case scenario for SEO. We have multiple keywords we are trying to optimize for.
We have multiple pages with the parallax function built in. Basically, each item in the primary navigation is its own page, with all subpages built below it using the parallax function.
Our navigation currently uses the hashbang method to provide custom URLs for each subpage, and the user is directed to the right section based on that hashbang.
www.example.com/About < This is its own page
www.example.com/about/#/history < This is a subpage that you scroll to on the About page
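For anyone unfamiliar with the setup, the hashbang routing described above boils down to something like this (a minimal sketch; the pattern and fallback are my assumptions, not the actual implementation):

```javascript
// Minimal sketch of hashbang routing: extract the section name from the
// URL fragment so the page can scroll to it. The regex and the null
// fallback are illustrative assumptions, not the site's real code.
function sectionFromHashbang(hash) {
  // "#/history" -> "history"; anything else means "stay at the top"
  const match = /^#\/([\w-]+)/.exec(hash);
  return match ? match[1] : null;
}

// In the browser you would then do something like:
//   const id = sectionFromHashbang(window.location.hash);
//   if (id) document.getElementById(id).scrollIntoView();
```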
We are trying to decide on the best method for optimizing each subpage, but my current concern is this: because each subpage is really part of the primary page, will all those URLs be seen as duplicate content?
Currently the site can also serve each subpage as its own page, without the parallax function. Should I include those in the sitemap? There's no way to navigate to them unless I include them in the sitemap, and I don't want Google to think I'm being disingenuous by providing links that don't exist solely for the purpose of SEO, but truthfully all of the content exists and is available to the user.
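For reference, listing the standalone versions would just mean ordinary entries in sitemap.xml, something like this (the /about/history URL is hypothetical, extrapolated from the example above):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Standalone (non-parallax) version of a subpage; URL is hypothetical -->
  <url>
    <loc>http://www.example.com/about/history</loc>
  </url>
</urlset>
```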
I know that a lot of people are asking these questions, and there really are no right answers yet, but I'm curious about everyone else's experience so far.
-
Hi Paul,
I totally agree with you. Development is outgrowing crawlers, but then again this has always been true. Designing and programming for crawlers is something SEOs do, but not programmers. The thing is that clients want traffic, conversions, and cool technology. However, if you only do cool technology without accomplishing business objectives, the project will not be considered successful in the client's eyes... just my 2 cents.
Regarding doing SEO with parallax scrolling, I think these two sites accomplished it nicely. I have not found any others. Both are responsive, which is also a must.
Kickpoint.ca accomplished telling a story through its graphics, and the site is light and versatile. However, its onsite SEO could be improved with a little effort.
Posicionamiento Web accomplished great onsite SEO but poor "storytelling parallax scrolling" effects. The site is heavy and not as versatile as Kickpoint's.
One option is to do parallax scrolling on the home page and use regular internal pages. This keeps the site light.
Good Luck.
Carla
-
Well, that's no good. I guess we have to be really careful. Unfortunately, this is one area where I think development has outgrown crawlers' ability to determine what is happening. We aren't trying to do anything malicious; we just want to create a good, engaging site. I'll keep trying and continue to post if I have more luck with our method.
-
Hi Paul,
I decided to run flowerbeauty.com through Moz's software, and here are the results. It does flag pages as duplicate content. See http://imgur.com/YEb6bmZ
Flowerbeauty is not SEO-friendly.
-
Hi Paul,
I think before you design your website, you might try creating a campaign for flowerbeauty.com to see if Moz reports any onsite errors. If it passes Moz's scrutiny, then there is a good chance Google will see it the same way. I have a slot for a new campaign in case you don't have one. I would really like to get to the bottom of this, as you can see.
Let me know if you want me to run it through Moz to see if it passes onsite optimization. I forgot to mention that my site did pass Moz's onsite analysis. I never tried running the other two SEO parallax scrolling websites through Moz's software.
-
Thanks, Carla, for the reference. So, if I'm understanding the way this example site works: you're pulling in the content on each "subpage" dynamically, so that Google will never see the content twice when visiting the page?
I like the way this works, although I wish you didn't have to see a loading graphic when the page pulls in the content.
I agree that this method looks good, and I would love to get rid of the hashbangs, but that's just part of what makes everything work in our current use case.
Certainly I think this site is a move in the right direction.
Thanks for the links and pointers
-
Hi Paul,
First of all, congrats on the great new technique. I recently wrote an article about SEO, parallax scrolling, and responsive websites. There is a website using the hashbang method that is semi SEO-friendly here: http://flowerbeauty.com/.
That being said, to answer your concern ("my current concern is that because each subpage is really a part of the primary page, will all those URLs be seen as duplicate content?"): I believe Google will see this as a multi-page site as long as you have different content on different URLs. Why don't you try adding parallax scrolling to each SEO URL and have the scroll function take you to each URL, like www.flowerbeauty.com did? Make sure to optimize your URLs as well.
For example, as you scroll it would take you to www.example.com/About/optimized-URL.
I would get rid of the hashbangs.
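A rough sketch of how that scroll-to-URL idea could work without hashbangs, using the HTML5 History API (the slug rules below are purely illustrative assumptions, not anything from flowerbeauty.com):

```javascript
// Sketch: as the user scrolls to a section, swap the address bar to a
// clean, optimized URL with history.replaceState (no hashbang, no reload).
// The slug convention here is a made-up example.
function optimizedUrl(basePath, sectionTitle) {
  const slug = sectionTitle
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, '-')  // non-alphanumerics become hyphens
    .replace(/^-+|-+$/g, '');     // trim stray hyphens
  return basePath.replace(/\/$/, '') + '/' + slug;
}

// In the browser, a scroll handler might then do:
//   history.replaceState(null, '', optimizedUrl('/about', 'Our History'));
```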
Here are some other SEO-friendly parallax scrolling websites: http://www.pinterest.com/ecumbre/seo-and-parallax-scrolling/
Let me know if that helps.
Thanks Carla