Indexing of Flash files
-
When Google indexes a Flash file, does it use a library for that purpose? What set me thinking was this blog post (although old), which states:
"we expanded our SWF indexing capabilities thanks to our continued collaboration with Adobe and a new library that is more robust and compatible with features supported by Flash Player 10.1."
-
Seriously, you should look forward to my blog post (it currently needs to be reviewed for technical accuracy, which may take some time); it is about how to make your Flash website as SEO-friendly as possible. Until then, I can only tell you that even good old HTML needs a lot of optimization to make it perfect for indexing. I have tried many ways to make my full-Flash website SEO-friendly, and let me tell you, none of them is as powerful as an optimized HTML page.
That is why in my blog post I am going to explain how to make your Flash website as SEO-friendly as a normal optimized HTML page (and it really works).
Hopefully the blog post is approved in time.
-
It's not something for the faint of heart: you need to know how to program in some server-side language and JavaScript. But armed with that knowledge (or a good front-end developer), you should be able to do it. For more technical detail on making a snapshot, see this page (note that it's talking about AJAX rather than what I mentioned, but the technique is the same):
http://code.google.com/web/ajaxcrawling/docs/html-snapshot.html
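To give a concrete sense of what the linked AJAX-crawling scheme does, here is a small sketch (my own illustration, not code from that page or from this poster's setup): the crawler takes a "pretty" URL containing a #! fragment and requests an equivalent "ugly" URL with the fragment moved into an _escaped_fragment_ query parameter, which your server answers with the static HTML snapshot.

```javascript
// Sketch of the URL mapping in Google's AJAX crawling scheme:
// http://example.com/page#!state  becomes
// http://example.com/page?_escaped_fragment_=state
function toEscapedFragmentUrl(prettyUrl) {
  var bangIndex = prettyUrl.indexOf('#!');
  if (bangIndex === -1) {
    return prettyUrl; // no hash-bang state, nothing to rewrite
  }
  var base = prettyUrl.slice(0, bangIndex);
  var state = prettyUrl.slice(bangIndex + 2);
  // append with ? or &, depending on whether a query string already exists
  var separator = base.indexOf('?') === -1 ? '?' : '&';
  return base + separator + '_escaped_fragment_=' + encodeURIComponent(state);
}

console.log(toEscapedFragmentUrl('http://example.com/ajax.html#!key=value'));
// → http://example.com/ajax.html?_escaped_fragment_=key%3Dvalue
```

Your server would recognize the _escaped_fragment_ request and respond with the snapshot instead of the Flash/AJAX page.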
I spoke about this subject in a previous thread, but basically you need to make an HTML5 website and check the browser for HTML5 support; if support is present, show that site, and if not, show a Flash version.
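To make the "check the browser for HTML5 support" step concrete, here is a minimal sketch (my own illustration; the canvas/video checks are a common feature-detection trick, not the only possible one):

```javascript
// Feature-detect HTML5 by creating elements and checking whether the
// browser attached the HTML5 APIs to them. The document object is passed
// in as a parameter so the check is easy to test outside a browser.
function supportsHtml5(doc) {
  var canvas = doc.createElement('canvas');
  var video = doc.createElement('video');
  return typeof canvas.getContext === 'function' &&
         typeof video.canPlayType === 'function';
}

// In the page itself you would branch on the result, e.g.:
// if (supportsHtml5(document)) { showHtml5Site(); } else { showFlashSite(); }
```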
-
Thanks for the clarification.
You mention an "HTML5 snapshot of the site, and serve this to google". How can we do this? Can you please elaborate?
-
What they are saying is that they are working with Adobe to make Googlebot better at crawling Flash sites, but I would refer to Google's help pages on this subject. Beyond that, I would make an HTML5 snapshot of the site and serve it to Google and to iPad/iPhone users. That way you'll make Google happy and not lose the iPhone/iPad people.
But yes, there are technologies out there that you can use to help Google crawl your site. My guess is that they are doing exactly what I mentioned above (just only for Google).
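The "serve the HTML5 snapshot to Google and to iPad/iPhone users" decision comes down to a user-agent check on the server. A minimal sketch (the substrings below are common identifiers and an assumption on my part, not an exhaustive or authoritative list):

```javascript
// Decide whether to serve the HTML5 snapshot instead of the Flash site,
// based on the User-Agent header of the incoming request.
function shouldServeHtml5Snapshot(userAgent) {
  // Googlebot gets the snapshot so the content is crawlable;
  // iPad/iPhone get it because they cannot run Flash.
  return /Googlebot|iPad|iPhone/i.test(userAgent || '');
}

// In a server-side handler you would branch on the header, e.g.:
// if (shouldServeHtml5Snapshot(request.headers['user-agent'])) {
//   serveSnapshot(response);
// } else {
//   serveFlashSite(response);
// }
```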
Related Questions
-
De-indexing and SSL question
A few days ago Google indexed hundreds of my directories by mistake (an error with plugins/host), and my traffic dropped as a consequence. Anyway, I fixed that and submitted a URL removal request; now I'm just waiting for things to go back to normal. In the meantime, I was supposed to move my website to HTTPS this week. Question: should I wait until this indexing error has been fixed, or may I as well go ahead with the SSL move?
Technical SEO | fabx0 -
Page missing from Google index
Hi all, One of our most important pages seems to be missing from the Google index. A number of our collections pages (e.g., http://perfectlinens.com/collections/size-king) are thin, so we've included a canonical reference in all of them to the main collection page (http://perfectlinens.com/collections/all). However, I don't see the main collection page in any Google search result. When I search using "info:http://perfectlinens.com/collections/all", the page displayed is our homepage. Why is this happening? The main collection page has a rel=canonical reference to itself (auto-generated by Shopify, so I can't control that). Thanks!
Technical SEO | leo920 -
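For reference, the canonical reference described in this question is a single tag in each thin page's head; a minimal sketch using the URLs from the question:

```html
<!-- in the <head> of each thin collections page, e.g. /collections/size-king,
     pointing Google at the main collection page -->
<link rel="canonical" href="http://perfectlinens.com/collections/all" />
```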
How can I index several systems used for my website?
My site is built on PHP, but has a help.website.com page based on a helpdesk platform. I also have a wordpress blog. So, these are three "different systems" under the same domain. When I crawl my site, neither the blog nor the help page show up. How can I make them show up? Thanks!
Technical SEO | rodelmo880 -
AJAX and High Number Of URLS Indexed
I recently took over as the SEO for a large ecommerce site. Every month or so our Webmaster Tools account is hit with a warning for a high number of URLs. In each message they send there is a sample of problematic URLs. 98% of each sample is not an actual URL on our site but an AJAX request URL that users are making. This is a server-side request, so the URL does not change when users make narrowing selections for items like size, color, etc. Here is an example of what one of those looks like: Tire?0-1.IBehaviorListener.0-border-border_body-VehicleFilter-VehicleSelectPanel-VehicleAttrsForm-Makes. We have over 3 million indexed URLs according to Google because of this. We are not submitting these URLs in our sitemaps; Googlebot is making lots of AJAX selections according to our server data. I have used the URL Parameter Handling tool to target some of the parameters that were set to "let Google decide" and set them to "no URLs" so that URLs with those parameters are not indexed. I still need more time to see how effective that will be, but it does seem to have slowed the number of URLs being indexed. Other notes: 1. Overall traffic to the site has been steady and even increasing. 2. Googlebot crawls an average of 241,000 URLs each day according to our crawl stats. We are a large ecommerce site that sells parts, accessories, and apparel in the powersports industry. 3. We are using the Wicket framework for our website. Thanks for your time.
Technical SEO | RMATVMC0 -
Skip indexing the search pages
Hi, I want all such search pages (www.somesite.com/search/node/) skipped from indexing, so I have this in robots.txt: (Disallow: /search/). Now any posts that start with "search" are being blocked, and in Google I see this message: "A description for this result is not available because of this site's robots.txt – learn more." How can I handle this, and also how can I find all the URLs that Google is blocking from showing? Thanks
Technical SEO | mtthompsons0 -
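A narrower Disallow rule is one way to approach the problem described in this question. A sketch (the exact path is an assumption based on the URL in the question, so check it against your own site structure before using it):

```
# Block only the search-results pages under /search/node/,
# rather than a broad /search/ prefix that can catch other content.
User-agent: *
Disallow: /search/node/
```

Because robots.txt rules are URL-prefix matches, the more specific the path in the Disallow line, the less likely it is to catch pages you actually want indexed.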
Does server affect indexing speeds?
A bit of a strange question, this one: I have a domain which, when on my Dutch server, can get new blog posts indexed and ranking in less than 10 minutes using the PubSubHubbub plugin. However, I moved the blog and domain to a UK dedicated server and continued to post. Days later, none of these posts were indexed. I then moved the domain back to the Dutch server to test this; I posted on the blog and, once again, it was indexed and ranking in 20 minutes or so. To cut a long and tedious story short: in a bid to be closer to my customers, I moved the domain to a UK VPS three days back. I posted, but no posts are indexed. Has anyone else experienced anything like this? Generally I don't move domains back and forward so much, but I wanted to test this out. The Dutch server is a 16-core, 24 GB DirectAdmin dedicated server; the two UK servers were both running cPanel. I understand that it would be best to host as close as possible to the customers, but the hardship of getting posts indexed in the UK is becoming a problem. Thanks, Carl
Technical SEO | Grumpy_Carl1 -
Same image file with different alt text?
I have an image that represents 'widgets'. The image works for more than one kind of widget. I have two pages, one optimized for 'blue widgets' and one optimized for 'red widgets'. I would like to use the same 'widgets' image on both pages but change the alt text to be 'blue widgets' or 'red widgets' depending on the page it is used on. Should I: (1) use the same image on different pages with different alt text. (2) duplicate the image file and have two copies 'red_widgets.jpg' and 'blue_widgets.jpg' and then use each copy on the page optimized for the corresponding phrase. (3) create distinct, unique image files (where the pixels are different, not just the file names) for each kind of widget. This is a simplified example of a larger SEO problem where I have 1 image that can be useful on 20 pages that are each optimized for 20 different phrases. Should I use the same image with 20 different alt tags, or create 20 identical (but renamed) copies of the image, or create 20 slightly different image files (with different pixels in each image)? Thanks.
Technical SEO | scanlin0 -
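Option (1) in this question is just a per-page attribute change on the same image file. A minimal sketch (the file path is hypothetical):

```html
<!-- on the page optimized for blue widgets -->
<img src="/images/widgets.jpg" alt="blue widgets" />

<!-- on the page optimized for red widgets, same file, different alt text -->
<img src="/images/widgets.jpg" alt="red widgets" />
```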
Getting Google to index new pages
I have a site, called SiteB, that has 200 pages of new, unique content. I made a table of contents (TOC) page on SiteB that points to about 50 pages of SiteB content. I would like to get SiteB's TOC page crawled and indexed by Google, as well as all the pages it points to. I submitted the TOC to Pingler 24 hours ago, and from the logs I see that Googlebot visited the TOC page, but it did not crawl any of the 50 pages that are linked to from the TOC. I do not have a robots.txt file on SiteB. There are no robots meta tags (nofollow, noindex). There are no 'rel=nofollow' attributes on the links. Why would Google crawl the TOC (when I Pinglered it) but not crawl any of the links on that page? One other fact, and I don't know if this matters, but SiteB lives on a subdomain and the URLs contain numbers, like this: http://subdomain.domain.com/category/34404. Yes, I know that the number part is suboptimal from an SEO point of view. I'm working on that, too. But first I wanted to figure out why Google isn't crawling the links on the TOC. The site is new and so hasn't been penalized by Google. Thanks for any ideas...
Technical SEO | scanlin0