Indexing of Flash files
-
When Google indexes a Flash file, do they use a library for this purpose? What set me thinking was this blog post (although old), which states:
"we expanded our SWF indexing capabilities thanks to our continued collaboration with Adobe and a new library that is more robust and compatible with features supported by Flash Player 10.1."
-
Seriously, you should look out for my blog post (it currently needs to be reviewed for technical accuracy, which may take some time). It is a post about how to make your Flash website as SEO friendly as possible. Until then I can only tell you this: even good old HTML needs A LOT of optimization to make it PERFECT for indexing. I have tried so many ways to make my full-Flash website SEO friendly, and let me tell you, none of them are as powerful as an optimized HTML page.
That is why, in my blog post here, I am going to explain how to make your Flash website as SEO friendly as a normal optimized HTML page (and it really works).
Hopefully the blog post is approved in time.
-
It's not something for the faint of heart; you need to know how to program in some server-side language and JavaScript. But armed with that knowledge (or a good front-end developer) you should be able to do it. For more technical babble on making a snapshot, see this page (note that it's talking about AJAX and not what I mentioned, but the technique is the same):
http://code.google.com/web/ajaxcrawling/docs/html-snapshot.html
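To make the snapshot technique concrete, here is a minimal sketch of the serving side, assuming Node.js with Express; renderSnapshot() is a hypothetical placeholder for however you generate the static HTML (a headless-browser render, a server-side template, and so on):

```javascript
const express = require('express');
const app = express();

// Hypothetical stand-in for whatever produces the static HTML snapshot.
function renderSnapshot(state) {
  return '<html><body><h1>Snapshot for state: ' + state + '</h1></body></html>';
}

app.get('/', (req, res) => {
  // Google's AJAX crawling scheme rewrites "#!state" URLs into a
  // "?_escaped_fragment_=state" query parameter before crawling.
  const fragment = req.query._escaped_fragment_;

  if (fragment !== undefined) {
    // A crawler is asking for this state: serve the pre-rendered
    // HTML snapshot instead of the Flash/AJAX shell.
    res.send(renderSnapshot(fragment));
  } else {
    // A normal visitor gets the regular page.
    res.sendFile(__dirname + '/index.html');
  }
});

app.listen(3000);
```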
I spoke about this subject in a previous thread, but basically you need to make an HTML5 website and check the browser for HTML5 support: if support is present, show the HTML5 site; if not, show a Flash version.
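A minimal sketch of that browser check, assuming plain JavaScript in the page; the /html5/ and /flash/ paths are hypothetical, and canvas support stands in as a rough proxy for "HTML5 capable" (feature-detection libraries like Modernizr do this more thoroughly):

```javascript
function supportsHtml5() {
  // Canvas support is a rough proxy for overall HTML5 capability.
  var canvas = document.createElement('canvas');
  return !!(canvas.getContext && canvas.getContext('2d'));
}

if (supportsHtml5()) {
  // Modern browser: load the HTML5 version of the site.
  window.location.replace('/html5/'); // hypothetical path
} else {
  // No HTML5 support: fall back to the Flash version.
  window.location.replace('/flash/'); // hypothetical path
}
```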
-
Thanks for the clarification.
"HTML5 snapshot of the site, and serve this to google" How can we do this ? Can you please elaborate
-
What they are saying is that they are working with Adobe to make Googlebot better at crawling Flash sites, but I would refer to Google's help pages on this subject. Beyond that, I would make an HTML5 snapshot of the site, and serve this to Google and to iPad/iPhone users; that way you'll make Google happy and not lose the iPhone/iPad people.
But yes, there are technologies out there that you can use to help Google crawl your site. My guess is that they are doing exactly what I mentioned above (just only for Google).
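For what that kind of serving could look like in practice, here is a rough sketch, assuming Node.js with Express; the file paths are hypothetical, and the point is simply branching on the User-Agent header:

```javascript
const express = require('express');
const path = require('path');
const app = express();

app.get('/', (req, res) => {
  const ua = req.get('User-Agent') || '';

  // Googlebot and iOS devices (which cannot run Flash) get the HTML5
  // snapshot; everyone else gets the regular Flash site.
  if (/Googlebot|iPhone|iPad/i.test(ua)) {
    res.sendFile(path.join(__dirname, 'html5', 'index.html')); // hypothetical
  } else {
    res.sendFile(path.join(__dirname, 'flash', 'index.html')); // hypothetical
  }
});

app.listen(3000);
```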
Related Questions
-
Google indexing .com and .co.uk site
Hi, I am working on a site that is experiencing indexation problems. To give you an idea: the website should be www.example.com; however, Google seems to index www.example.co.uk as well, and it doesn't seem to honour the 301 redirect that is on the .co.uk site. This is causing quite a few reporting and tracking issues.
This happened the first time in November 2016, when an issue was identified in the DDoS protection which meant we had to point www.example.co.uk to the same DNS as www.example.com. This was implemented and made no difference. I cleaned up the .htaccess file and this made no difference either. In June 2017, Google finally indexed the correct URL, but I can't be sure what changed it.
I have now migrated the site onto HTTPS, and www.example.co.uk has been reindexed in Google alongside www.example.com. I have been advised that the HTTP version needs to be removed from the DDoS protection, which is in motion. I have also redirected http://www.example.co.uk straight to https://www.example.com to prevent chained redirects. I can't block the site via robots.txt unless I take the redirects off, which could mean that I lose my rankings. I should also mention that I haven't actually lost any rankings; some URLs have just been replaced with .co.uk versions while others have remained the same.
Could you please advise what further steps I should take to ensure the correct URLs are indexed in Google?
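For illustration only, a sketch of what the single-hop redirect described above might look like in the .htaccess file, assuming Apache with mod_rewrite and the question's example.com placeholder domains:

```apache
RewriteEngine On

# Any request on the .co.uk host (HTTP or HTTPS) goes straight to the
# .com equivalent in a single 301 hop, avoiding chained redirects.
RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.uk$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

# Requests already on example.com but over plain HTTP also go to HTTPS.
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```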
Technical SEO | | Niki_10 -
Get List Of All Indexed Google Pages
I know how to run site:domain.com but I am looking for software that will put these results into a list and return server status (200, 404, etc). Anyone have any tips?
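One hedged do-it-yourself sketch, assuming Node.js 18+ (for the built-in fetch) and that the site: results have already been exported into a plain text file of URLs (indexed-urls.txt is a hypothetical name):

```javascript
const fs = require('fs');

// Read a newline-separated list of indexed URLs.
const urls = fs.readFileSync('indexed-urls.txt', 'utf8')
  .split('\n')
  .filter(Boolean);

(async () => {
  for (const url of urls) {
    try {
      // HEAD is enough to get the status; 'manual' keeps 301/302 visible
      // instead of silently following redirects.
      const res = await fetch(url, { method: 'HEAD', redirect: 'manual' });
      console.log(res.status + '\t' + url); // e.g. 200, 301, 404
    } catch (err) {
      console.log('ERROR\t' + url + '\t' + err.message);
    }
  }
})();
```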
Technical SEO | | InfinityTechnologySolutions0 -
Why is Google not indexing my site?
I'm a bit confused as to why my site just isn't indexing on Google. Even if I type in my brand name, my social channels rank and there's no evidence of my website. I've followed all of the advice I've read, gone into Webmaster Tools, and installed the WordPress Yoast plugin, but nothing seems to be making a difference! One thing I've noticed: in Google Webmaster Tools it says "Couldn't communicate with the DNS server." in site errors. I've called GoDaddy and they said that everything is fine. A bit frustrating. Trying to work out what my next steps should be but feeling a bit lost, to be honest! Any help GREATLY appreciated!
Technical SEO | | j1066s0 -
Odd scenario: subdomain not indexed nor cached, reason?
Hi all, hopefully somebody can help me with this issue 🙂 Six months ago a number of pages hosted at the domain level were moved to a subdomain level with 301 redirects, and some others were created from scratch (at the subdomain level too). What happens is that not only are the new URLs at the subdomain level not indexed nor cached, but the old URLs are still indexed in Google, although clicking on them brings you to the new URLs via the 301 redirect. The question is: with 301 redirects to the new URLs and no issues with robots.txt, meta robots, etc., why are the new URLs still de-indexed? I might remind you that a few pages (100 or so) were created from scratch, but they are also not indexed. The only issue found across the pages is the no-cache code, set as follows:
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
I am not familiar with cache-control lines. Could this be an issue for correct indexing? Thanks in advance, Dario
Technical SEO | | Mrlocicero0 -
Should I put meta descriptions on pages that are not indexed?
I have multiple pages that I do not want to be indexed (and they are currently not indexed, so that's great). They don't have meta descriptions on them, and I'm wondering if it's worth my time to go in and insert them, since they should hypothetically never be shown. Does anyone have any experience with this? Thanks! The reason I ask is that one member of our team linked to one of these pages on Facebook and noticed random text from the page being pulled in as the description.
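As an aside, the Facebook behaviour described at the end is usually driven by Open Graph tags rather than by search indexing; a hedged sketch of head markup that controls both (the content text is placeholder):

```html
<head>
  <!-- Keeps the page out of search results. -->
  <meta name="robots" content="noindex">
  <!-- What a search engine would show if the URL ever surfaced anyway. -->
  <meta name="description" content="A human-written summary of this page.">
  <!-- Facebook reads Open Graph tags first, falling back to page text. -->
  <meta property="og:description" content="Summary shown when the page is shared.">
</head>
```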
Technical SEO | | Viewpoints0 -
Does Google's index have an expiration?
Hi, I have this in mind and I think you can help me. Suppose that I have a page, something like this: www.mysite.com/politics, where I have a list of the current month's news. Great: every time the bot checks this URL, it indexes the links that are there. What happens next month, when all those links are no longer visible to the user unless they search in a search box or on Google? Does Google keep those links? In the current month Google sees that those links are there; next month it doesn't, but the links are still alive. So my question is: does Google keep these links forever if they are alive but no longer linked anywhere on the site (the bot can't find them anymore, but they still work)? Thanks
Technical SEO | | informatica8100 -
Image Sitemap Indexing Issue
Hello Folks, I've been running into some strange issues with our XML sitemaps.
1) The XML sitemaps won't open in a browser; they throw the following error instead. Sample XML sitemap: www.veer.com/sitemap/images/Sitemap0.xml.gz. Error: "XML Parsing Error: no element found. Location: http://www.veer.com/sitemap/images/Sitemap0.xml, Line Number 1, Column 1".
2) Image files are not getting indexed. For instance, the sitemap www.veer.com/sitemap/images/Sitemap0.xml.gz has 6,000 URLs and 6,000 images; however, only 3,481 URLs and 25 images are getting indexed. The sitemap formatting seems good, but I can't figure out why Google is de-indexing the images and why only 50-60% of the URLs are getting indexed. Thank you for your help!
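For comparison, a hedged sketch of a single entry in Google's image-sitemap format; the URLs below are placeholders, not the real veer.com paths:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.veer.com/some-page/</loc>
    <image:image>
      <image:loc>http://www.veer.com/images/example-photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```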
Technical SEO | | CorbisVeer0 -
I have a site that has both http:// and https:// versions indexed, e.g. https://www.homepage.com/ and http://www.homepage.com/. How do I de-index the https:// versions without losing the link juice that is going to the https://homepage.com/ pages?
I can't 301 https:// to http:// across the board, since there are some form pages that need to be https://. The site has 20,000+ pages, so individually 301ing each page would be a nightmare. Any suggestions would be greatly appreciated.
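One hedged way to express "redirect everything except the form pages" in code, assuming Node.js with Express behind a proxy that sets X-Forwarded-Proto; the /checkout and /contact paths are hypothetical placeholders for the form pages:

```javascript
const express = require('express');
const app = express();
app.set('trust proxy', true); // so req.protocol reflects the original scheme

const SECURE_PATHS = ['/checkout', '/contact']; // hypothetical form pages

app.use((req, res, next) => {
  const mustStaySecure = SECURE_PATHS.some(p => req.path.startsWith(p));
  // 301 HTTPS traffic back to HTTP except on pages that must stay secure.
  if (req.protocol === 'https' && !mustStaySecure) {
    return res.redirect(301, 'http://' + req.hostname + req.originalUrl);
  }
  next();
});

app.listen(3000);
```

A rel="canonical" tag on each https page pointing at its http counterpart is the usual alternative when a blanket redirect is off the table.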
Technical SEO | | fthead90