Converse.com - flash and html version of site... bad idea?
-
I have a question regarding Converse.com. I realize this ecommerce site needs a lot of SEO help; there's plenty of obvious low-hanging SEO fruit. At a high level, though, I see a very large SEO issue with the site architecture.
The site is a full-page Flash experience that uses a # in the URL, so the search engines pretty much see every Flash page as the home page. To help with this issue, an HTML version of the site was created. Google crawls the
Home Page - Converse.com
Marimekko category page (flash version)
http://www.converse.com/#/products/featured/marimekko
Marimekko category page (HTML version; you need to have Flash disabled to see it)
http://www.converse.com/products/featured/marimekko
Here is an example of the issue. This site has a great post featuring the Helen Marimekko shoes:
http://www.coolmompicks.com/2011/03/finnish_foot_prints.php
The post links to the Flash Marimekko category page (http://www.converse.com/#/products/featured/marimekko), as I would expect (ninety-something percent of visitors to converse.com have the required Flash plug-in). So the Flash page is getting the backlink juice. But the Flash page is invisible to Google.
When I search for “converse marimekko” in Google, the Marimekko landing page is not in the top 500 results. So I then searched for “converse.com marimekko” and saw the HTML version of the landing page listed as the 4th organic result. When I click the link I get redirected to the Flash Marimekko category page, but if I do not have Flash I go to the HTML category page.
-----
Marimekko - Converse
All Star Marimekko Price: $85, Jack Purcell Helen Marimekko Price: $75 ...
www.converse.com/products/featured/marimekko - Cached
So my issues are…
Is Converse skating on thin SEO ice by having both an HTML and a Flash version of their site/product pages?
Do you think it's a huge drag on SEO rankings to have a large % of backlinks pointing to Flash pages when Google is crawling the HTML pages?
Any recommendations on what to do about this?
Thanks,
SEOsurfer
-
Tom,
Thank you for taking the time to look at the site and give a detailed response. I've been doing some research myself and my findings mirror your assessment. Thank you for the recommended action items, too. Converse uses http://www.asual.com/swfaddress/, which makes for a good site experience but, as you pointed out, isn't so hot for SEO.
--SEOsurfer
-
Great question!
Firstly, unfortunately, Steve's suggestion isn't going to be viable for you. The # portion of the URL is not available to your code server-side, so you won't be able to determine where the rel=canonical should point.
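To see why, note that the browser never sends the fragment to the server: the HTTP request line contains only the path and query string. A quick illustration with the standard URL parser (illustrative only, not Converse's actual code):

```typescript
// Parse one of the Flash URLs from the question.
const flashUrl = new URL("http://www.converse.com/#/products/featured/marimekko");

// What actually reaches the server in the request line:
const serverSees = flashUrl.pathname + flashUrl.search; // "/"

// What only client-side code can ever read:
const clientSees = flashUrl.hash; // "#/products/featured/marimekko"
```

Every deep Flash URL therefore looks like a request for "/" to the server, which is exactly why any logic keyed off these URLs has to run client-side.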
Furthermore, if they are committed to keeping the Flash site for now, all as a single unit under one URL (the homepage), then you are going to have to accept that some juice intended for subpages will go to the homepage. You cannot do anything about that aspect, so you need to focus on the rest of the problem. However, whilst far from ideal, at least the juice is hitting the site somehow.
So… what to do?
Firstly, I'd start getting into the mindset of thinking of the HTML site as the main/canonical site, and the Flash site as the 'enhanced experience' version. That way, the HTML version is the one that should be crawled by Google, and the one that should be linked to.
Actions:
- Set up detection for mobile user agents (out of preference I'd say all, but at least those known not to support Flash, such as iPhone/iPad) and search engine bots, and ensure they get served the HTML version. Currently your homepage requires a click-through on iPad offering an impossible Flash download; why not serve them the HTML page off the bat?
Is this cloaking? No! The HTML version is the main version, remember? It's no more cloaking than if you detected the user agent and then chose to serve the Flash version to Googlebot.
I actually discussed this with Jane Copeland at the fantastic Distilled link building event a couple of weeks back, and she agreed with me and said if it would stand up to a manual inspection then it is the right course of action.
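As a sketch of what that detection might look like (the user-agent token lists here are illustrative only; a real implementation should check against a maintained UA database):

```typescript
// User agents known not to support Flash (illustrative, not exhaustive).
const NO_FLASH_AGENTS = ["iphone", "ipad", "ipod"];

// Major search engine crawlers (illustrative, not exhaustive).
const BOT_AGENTS = ["googlebot", "bingbot", "slurp"];

// Decide whether a request should be served the HTML (canonical) version.
function shouldServeHtmlVersion(userAgent: string): boolean {
  const ua = userAgent.toLowerCase();
  return (
    NO_FLASH_AGENTS.some((token) => ua.includes(token)) ||
    BOT_AGENTS.some((token) => ua.includes(token))
  );
}
```

Everyone else can keep getting the Flash experience by default.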
-
Get all links in articles, press releases, directories, or anything else that point to specific pages and originate in-house (or from any source you have control over) to link to the HTML pages.
-
If the user arrives, has Flash and has arrived to an HTML link, you can now redirect to the Flash link for that page so they get the 'enhanced experience'. Don't use a 301 redirect -- remember the HTML version is the main version!
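A minimal sketch of that hand-off. `hasFlash()` is a hypothetical detection helper (in practice a library such as SWFObject would supply it), and the path-to-hash mapping mirrors the Converse URL scheme shown in the question:

```typescript
// Map an HTML URL path to the matching Flash URL, e.g.
// "/products/featured/marimekko" -> "/#/products/featured/marimekko"
function htmlPathToFlashUrl(path: string): string {
  return "/#" + path;
}

// In the page itself, a client-side redirect (NOT a 301, since the
// HTML page stays the canonical URL) would look something like:
//   if (hasFlash()) {
//     window.location.replace(htmlPathToFlashUrl(window.location.pathname));
//   }
```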
-
If the user arrives via a Flash link, but doesn't have Flash, but does have javascript you can detect the # variable and redirect them to the HTML page to help them along.
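A sketch of that fragment check; again, `hasFlash()` is a hypothetical detection helper, not Converse's actual code:

```typescript
// Map a Flash fragment to the matching HTML path, e.g.
// "#/products/featured/marimekko" -> "/products/featured/marimekko".
// Returns null when there is no usable fragment, so the caller can
// fall back to leaving the user on the homepage.
function flashHashToHtmlPath(hash: string): string | null {
  return hash.startsWith("#/") ? hash.slice(1) : null;
}

// In the page:
//   const htmlPath = flashHashToHtmlPath(window.location.hash);
//   if (!hasFlash() && htmlPath) window.location.replace(htmlPath);
```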
-
Educate the relevant stakeholders regarding point 2. I see you have a 'flashmode=0' option; tell them about this and how to use it to get the URLs they need.
So where does this leave us?
-
The search engines can crawl all your lovely content, and they can ignore the flash version completely.
-
You are getting inbound links to specific pages. These pages have their own titles and meta descriptions… and content! Because they are the real site!
-
Users with Flash arriving via these links are landing on the correct Flash page of the site and are experiencing the rich site that you want them to.
-
Users arriving without Flash are getting the correct page if they arrive via an HTML URL. If they arrive via a Flash URL, they get the correct page if they have JavaScript on (e.g. iPad users), or they get the fallback of the homepage (rare).
I had a client with an almost identical situation, and I rolled out an almost identical solution to this; they got crawled very quickly, shot up in Google, and have stayed there for months.
Hope it helps. Let us know how you get on!
-
It's definitely a drag to have your links diluted between two versions of the site. There are a few solutions you can use, but the easiest would probably be to start using the rel=canonical tag on the Flash version, pointing back to the same or similar page on the HTML site. That way, the engines know that the version you want indexed is the HTML version.