Converse.com - flash and html version of site... bad idea?
-
I have a question regarding Converse.com. I realize this ecommerce site needs a lot of SEO help; there’s plenty of obvious low-hanging SEO fruit. At a high level, though, I see a very large SEO issue with the site architecture.
The site is a full-page Flash experience that uses a # in the URL, so the search engines see pretty much every Flash page as the home page. To help with this issue, an HTML version of the site was created. Google crawls the HTML version.
Home page - Converse.com
Marimekko category page (Flash version):
http://www.converse.com/#/products/featured/marimekko
Marimekko category page (HTML version; you need to have Flash disabled to see it):
http://www.converse.com/products/featured/marimekko
Here is an example of the issue. This site has a great post featuring Helen Marimekko shoes:
http://www.coolmompicks.com/2011/03/finnish_foot_prints.php
The post links to the Flash Marimekko category page (http://www.converse.com/#/products/featured/marimekko), as I would expect (ninety-something percent of visitors to converse.com have the required Flash plug-in). So the Flash page is getting the link juice, but the Flash page is invisible to Google.
When I search for “converse marimekko” in Google, the Marimekko landing page is not in the top 500 results. I then searched for “converse.com marimekko” and saw the HTML version of the landing page listed as the 4th organic result. When I click the link I get redirected to the Flash Marimekko category page, but if I do not have Flash I go to the HTML category page.
-----
Marimekko - Converse
All Star Marimekko Price: $85, Jack Purcell Helen Marimekko Price: $75 ...
www.converse.com/products/featured/marimekko - Cached
So my issues are…
Is Converse skating on thin SEO ice by having both an HTML and a Flash version of their site/product pages?
Do you think it’s a huge drag on SEO rankings to have a large percentage of backlinks pointing at Flash pages when Google is crawling the HTML pages?
Any recommendations on what to do about this?
Thanks,
SEOsurfer
-
Tom,
Thank you for taking the time to look at the site and give a detailed response. I’ve been doing some research myself and my findings mirror your assessment. Thank you for the recommended action items, too. Converse uses http://www.asual.com/swfaddress/, which makes for a good site experience but, as you pointed out, is not so hot for SEO.
--SEOsurfer
-
Great question!
Firstly - unfortunately, Steve's suggestion isn't going to be viable for you. The # portion of the URL is not available to your code server-side, so you won't be able to determine where the rel=canonical should point.
Furthermore, if they are committed to keeping the Flash site for now, all as a single unit behind one URL (the homepage), then you are going to have to accept that some juice intended for subpages is going to go to the homepage. You cannot do anything about that aspect, so you need to focus on the rest of the problem. However, whilst far from ideal, at least the juice is hitting the site somehow.
So… what to do?
Firstly, I'd start getting into the mindset of treating the HTML site as the main/canonical site, and the Flash site as the 'enhanced experience' version. That way, the HTML version is the version that should be crawled by Google, and the version that should be linked to.
Actions:
- Set up detection for mobile user-agents (out of preference I'd say all of them, but at least those known not to support Flash, such as iPhone/iPad) and search engine bots, and ensure they get served the HTML version. Currently your homepage asks iPad users to click through to an impossible Flash download; why not serve them the HTML page off the bat?
Is this cloaking? No! The HTML version is the main version, remember? It's no more cloaking than if you detected the user agent and then chose to serve the Flash version to Googlebot.
I actually discussed this with Jane Copeland at the fantastic Distilled link building event a couple of weeks back, and she agreed with me and said if it would stand up to a manual inspection then it is the right course of action.
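A minimal sketch of that detection step might look like the following. The user-agent tokens here are illustrative, not exhaustive, and the function name is made up for this example; a real deployment would maintain a fuller list of crawlers and Flash-less devices.

```javascript
// Sketch: decide whether a request should be served the HTML version.
// Matches known search engine crawlers and devices known not to
// support Flash; everything else falls through to the Flash site.
function shouldServeHtmlVersion(userAgent) {
  var ua = (userAgent || '').toLowerCase();
  var bots = ['googlebot', 'bingbot', 'slurp'];        // search engine crawlers
  var flashless = ['iphone', 'ipad', 'ipod'];          // no Flash support
  return bots.concat(flashless).some(function (token) {
    return ua.indexOf(token) !== -1;
  });
}
```

Server-side, this would gate which template is rendered for the request; the same check could back the client-side fallbacks described below.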
-
Get all links in articles, press releases, directories or anything else that point to specific pages and originate in house (or from any source you have control over) changed so they link to the HTML pages.
-
If the user has Flash and arrives via an HTML link, you can redirect them to the Flash URL for that page so they get the 'enhanced experience'. Don't use a 301 redirect -- remember, the HTML version is the main version!
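On this site the mapping between the two URL schemes is mechanical, so the hop can be a one-liner. A sketch, assuming the HTML path and Flash fragment mirror each other exactly (as they do for the Marimekko page):

```javascript
// Sketch: map an HTML path to its Flash (hash-fragment) equivalent so a
// Flash-capable visitor landing on an HTML URL can be bounced client-side.
// This is a browser-side hop, not a 301 -- the HTML URL stays canonical.
function toFlashUrl(htmlPath) {
  // '/products/featured/marimekko' -> '/#/products/featured/marimekko'
  return '/#' + htmlPath;
}

// In the browser this would be used roughly like:
//   if (hasFlash()) { window.location.replace(toFlashUrl(window.location.pathname)); }
// where hasFlash() is whatever plug-in detection the site already uses.
```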
-
If the user arrives via a Flash link but doesn't have Flash, yet does have JavaScript, you can detect the # fragment and redirect them to the HTML page to help them along.
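The reverse direction has to happen in the browser, because the # fragment is never sent to the server. A sketch, again assuming the two URL schemes mirror each other:

```javascript
// Sketch: recover the HTML path from a Flash-style URL fragment. In the
// browser the fragment comes from window.location.hash, e.g.
// '#/products/featured/marimekko'; it never reaches the server, so this
// check can only run client-side.
function fromFlashFragment(hash) {
  if (hash && hash.indexOf('#/') === 0) {
    return hash.slice(1); // '#/products/...' -> '/products/...'
  }
  return '/'; // no usable fragment: fall back to the homepage
}
```

A visitor without Flash but with JavaScript would then be sent to `fromFlashFragment(window.location.hash)`; a visitor with neither gets the homepage fallback mentioned later.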
-
Educate the relevant stakeholders about point 2. I see you have a 'flashmode=0' option; tell them about it and how to use it to get the URLs they need.
So where does this leave us?
-
The search engines can crawl all your lovely content, and they can ignore the Flash version completely.
-
You are getting inbound links to specific pages. These pages have their own titles and meta descriptions… and content! Because they are the real site!
-
Users with Flash arriving via these links are landing on the correct Flash page of the site and are experiencing the rich site that you want them to.
-
Users arriving without Flash get the correct page if they arrive via an HTML URL. If they arrive via a Flash URL, they get the correct page if they have JavaScript on (e.g. iPad users), or they fall back to the homepage (rare).
I had a client in an almost identical situation, and I rolled out an almost identical solution; they got crawled very quickly, shot up in Google, and have stayed there for months.
Hope it helps. Let us know how you get on!
-
It's definitely a drag to have your links diluted between two versions of the site. There are a few solutions you can use, but the easiest would probably be to start using the rel=canonical tag on the Flash version, pointing back to the same or a similar page on the HTML site. That way, the engines know that the version you want indexed is the HTML version.
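One caveat from the other answer: the # portion of a Flash URL never reaches the server, so in practice the canonical tag can only be emitted reliably on the HTML pages themselves, self-referencing the HTML URL. A sketch of building that tag (the helper name is made up; the domain is the real site):

```javascript
// Sketch: build a self-referencing canonical tag for an HTML page.
// It always points at the plain HTML URL, never the #-fragment Flash URL,
// since the server cannot see the fragment and the engines would otherwise
// collapse every Flash page into the homepage.
function canonicalTag(htmlPath) {
  return '<link rel="canonical" href="http://www.converse.com' + htmlPath + '">';
}
```

The server-side template would drop this string into the `<head>` of each HTML page it renders.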