Given the latest Google update, should I rewrite my Flash site or try to present an alternative HTML/CSS site?
-
I have a site that was created using Flash. The reasoning at the time was that I didn't care whether the site ranked or not (it's a portfolio site). Now I would like to drive traffic to the site from search engines. Given the Penguin update, should I rewrite my Flash site in HTML/CSS, or present an alternative site for bots and browsers that don't support Flash?
My concern is that by presenting an alternative site to bots and non-Flash-supporting browsers, the search engines will potentially see this as cloaking. Thoughts and advice would be much appreciated.
-
From a designer's point of view, the amount of time and effort that would go into creating and presenting an alternative site for Google's bots and non-Flash browsers would be about the same as just creating a new, functional website that is search-engine friendly.
I would rather create a new website using XHTML with CSS, which gets picked up well by search engines because the content and the styling are handled in separate files. It's important to keep the styling and the markup separate: bots read the content, and styling code sitting on the page just acts as a barrier.
Don't forget to use H1 tags for your targeted keywords, and if you decide to use a Flash movie, you can add additional CSS or fallback markup to bring up a still image in browsers that don't support Flash.
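A rough sketch of what that could look like (the file names, keyword and movie are placeholders, and the still-image fallback here uses the object element's built-in fallback content rather than CSS):

```html
<!-- Hypothetical portfolio page: content and headings stay in the markup,
     styling lives in a separate stylesheet. -->
<!DOCTYPE html>
<html>
<head>
  <title>Graphic Design Portfolio | Example Studio</title>
  <link rel="stylesheet" href="styles.css" />
</head>
<body>
  <h1>Graphic Design Portfolio</h1>

  <!-- Flash movie with a fallback: browsers and bots without Flash
       see the still image and text instead of the movie. -->
  <object type="application/x-shockwave-flash" data="portfolio.swf" width="800" height="600">
    <img src="portfolio-still.jpg" alt="Selected portfolio work" />
    <p>Selected branding, print and web design projects.</p>
  </object>
</body>
</html>
```

Because every visitor and bot is served the same markup, a fallback like this shouldn't raise the cloaking concern from the original question.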
-
Hi,
These links may answer some of your questions about Flash. I am in no way a Flash expert, but the first page does cite its sources (and one of them is directly from Google on the matter):
http://uxmyths.com/post/717781129/myth-18-flash-is-evil
http://googlewebmastercentral.blogspot.com/2008/06/improved-flash-indexing.html
Presenting the site differently is not, by itself, an example of malicious or manipulative cloaking, but that does not mean you won't get "lumped" into that category, since technically Flash in its current form has been indexable since 2008 (per Google's blog).
In my personal opinion, from a UX and search engine standpoint I prefer HTML/PHP/ASP; I also feel that overall they are more forward-compatible.
Shane
-
Related Questions
-
Links not visible in "Google cache text version" but visible in "Fetch as Google" in Google Webmaster Tools
Hi guys, there seems to be some issue with the coding due to which Google is not indexing half of our menu bar links. The cached text version of http://www.99acres.com/ is not showing the links in the "All India" dropdown, the "Advice" dropdown, and the "Hot Projects" tab in the blue bar of the top menu, whereas these links are visible in "Fetch as Google" in Google Webmaster Tools. Any clue as to why there is a difference between the links shown in Google Webmaster Tools and the Google cache text version? Thanks in advance 🙂
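One possible explanation, and it is only an assumption since the actual markup isn't shown here, is that the dropdown links are built or rewritten client-side, so different tools see different versions of the menu. A minimal sketch of a top menu whose links live in the static HTML (the URLs are hypothetical):

```html
<!-- Hypothetical top menu: the dropdown links exist in the HTML itself,
     so crawlers can see them even without JavaScript; CSS or JS only
     controls whether the dropdown is displayed. -->
<nav>
  <ul class="top-menu">
    <li>
      <a href="/all-india/">All India</a>
      <ul class="dropdown">
        <li><a href="/delhi-real-estate/">Delhi</a></li>
        <li><a href="/mumbai-real-estate/">Mumbai</a></li>
      </ul>
    </li>
    <li><a href="/advice/">Advice</a></li>
    <li><a href="/hot-projects/">Hot Projects</a></li>
  </ul>
</nav>
```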
Web Design | vivekrathore0
-
Redirects Not Working / Issue with Duplicate Page Titles
Hi all, we are being penalised in Webmaster Tools and Crawl Diagnostics for duplicate page titles and I'm not sure how to fix it. We recently switched from HTTP to HTTPS, but when we first switched over, we accidentally set a permanent redirect from HTTPS to HTTP for a week or so(!). We now have a permanent redirect going the other way, HTTP to HTTPS, and we also have canonical tags in place pointing to HTTPS. Unfortunately, it seems that because of this short time with the permanent redirect the wrong way round, Google is confused and sees our HTTP and HTTPS sites as duplicate content. Is there any way to get Google to recognise this new (correct) permanent redirect and completely forget the old (incorrect) one? Any ideas welcome!
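As a rough illustration of the consistent-signal setup described above (the domain and path are placeholders): the server-side 301 points HTTP to HTTPS, and each page's canonical tag agrees with it, so the two versions consolidate as pages are recrawled.

```html
<!-- Hypothetical page served at https://www.example.com/services/
     (and 301-redirected to from the old http:// URL). The canonical
     tag agrees with the redirect and points at the HTTPS version. -->
<head>
  <title>Our Services | Example</title>
  <link rel="canonical" href="https://www.example.com/services/" />
</head>
```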
Web Design | HireSpace0
-
Blog vs. News/Editorial Layout?
We're in the coupon blogging space and saw that one of the larger coupon sites moved away from the more traditional blog layout: http://thekrazycouponlady.com now has more of an editorial-type layout. Here is another site which is more similar to our layout: http://hip2save.com. So here are my questions: Which layout type do you feel better serves their visitors, and why? How does this affect the SEO of the site? How does it affect advertising revenues? Which layout do you prefer? Is there strategy in this move for the coupon blog, or is this just a preference for how they now display their content? We're making some updates to our design soon and I wanted to get some feedback on the overall direction we take.
Web Design | seointern0
-
Avoiding duplicate content with a multi-language site
Hi, we have a client in China that is looking to create three versions of the same website: English, Chinese and Korean. They do not want to use a translation plugin like Google Translate, preferring to have the pages duplicated. What is the best way to do this, bearing in mind that the site needs to be found in all three languages? Would also appreciate it if anyone knows of a good hosting company with English support on the Chinese mainland. Thanks, Fraser
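For reference, the usual way to signal that the duplicated pages are language versions of one another is hreflang annotations in each page's head; a minimal sketch, assuming hypothetical URLs on a single domain:

```html
<!-- Hypothetical URLs: each language version of a page lists itself and
     its siblings, so searchers in each language get the right version. -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
<link rel="alternate" hreflang="zh" href="https://www.example.com/zh/" />
<link rel="alternate" hreflang="ko" href="https://www.example.com/ko/" />
```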
Web Design | fraserhannah0
-
International SEO issues for multiple sites
We currently have 3 websites: oursite.co.uk, oursite.fr and oursite.ch. We also own Oursite.com, and that URL currently redirects to Oursite.fr. We are considering a complete site redesign and a possible merge of the 3 sites. Assumptions:
** the 3 sites currently receive organic search traffic to varying degrees
** Oursite.ch is almost identical to Oursite.fr in terms of site content
** our target market is NOT the USA for English-language searches; it is the UK
With a redesign, we see our options as follows: 1) merge the 3 sites and make Oursite.com the "main site", with subfolders /uk, /fr and /ch; or 2) keep the 3 sites as they are. We see Option 1 as the best in terms of saving time when updating the site and saving money paid to the site developers (1 site vs. 3 sites). We see Option 2 as the best in terms of the site's ability to rank, as well as the confidence of searchers seeing our site in the search results (in other words, a person searching in France would be more likely to buy and/or submit a form on our site if they saw Oursite.fr rather than Oursite.com/fr). I guess we're looking for some suggestions/guidance here. Are we missing any big issues? Does anyone have experience with an issue such as this? Thank you in advance...
-Shawn
Web Design | darkgreenguy
-
Mobile Sitemap for Site with Media Queries
I'm doing SEO for a site. It uses media queries in the CSS to automatically resize the site for the screen size in use, i.e. the site detects the screen size of, say, an iPhone, and the CSS knows which elements to hide for that screen size while still making it look good. This is great because it automatically cuts the content down to display nicely on small screens, obviating the need for a separate mobile site. What kind of sitemap should be generated, since the URLs serve both desktop and mobile use? Yoast (sweet SEO) said it should have both a regular and a mobile-style sitemap to get both the regular and the mobile bots to visit, but didn't elaborate on how that sitemap should look. Do you have a recommendation for how exactly the sitemap should look? Should the sitemap have all the URLs twice, i.e. once regular and once with the mobile indicator?
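For context, a minimal sketch of the single-URL setup the question describes (the class names and the 480px breakpoint are assumptions): the same HTML is served to every device, and the stylesheet hides secondary elements on small screens.

```html
<!-- Hypothetical responsive page: one URL for desktop and mobile. -->
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1" />
  <style>
    /* Secondary elements are shown by default... */
    .sidebar, .secondary-nav { display: block; }

    /* ...and hidden on phone-sized screens via a media query. */
    @media only screen and (max-width: 480px) {
      .sidebar, .secondary-nav { display: none; }
    }
  </style>
</head>
```

Because desktop and mobile visitors share the same URLs in this setup, there are no separate mobile URLs to list in a second sitemap.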
Web Design | GregoryHaze1
-
Landing Page/Home Page issues
Hi. I was speaking with my designer last night (we are setting up a new website) and we were discussing the design of our homepage. The designer said he wanted the first page of the website to be a sort of landing page where the visitor has to click to enter; I'm sure everyone has come across these before. However, I am concerned about the SEO implications of this. Any help, guys?
Web Design | CompleteOffice0
-
IP block in Google
Our office has a number of people performing analysis and research on keyword positions, volume, competition, etc. We have one external static IP address. We installed the static IP so we can filter out our own visits in Google Analytics. However, by 10 AM we get impossible CAPTCHAs or even get blocked by Google. Do you have any experience with such an issue? Any solutions you can recommend? Any help would be appreciated!
Web Design | Partouter0