Are Google now indexing iFrames?
-
A client is pulling content through an iframe, and when searching for a snippet of that exact content, the page pulling the data in is being indexed rather than the framed page. Has anyone seen this before?
-
Yeah, I use iframes, and if I want to be sure they are NOT indexed, I just add a "noindex" tag. You may also want to add a "nofollow" tag to keep spiders from following links inside the frame. Using iframes can be a good way to reduce the number of links on a page (a Bruce Clay suggestion).
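If you want to verify that a framed page actually carries that noindex directive, you can check its HTML directly. This is a minimal sketch using Python's standard-library parser; the function name is mine, not from any SEO tool, and you would feed it the page source fetched however you like:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def has_noindex(html):
    """Return True if the page's robots meta tag includes 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)
```

Run it against the framed page's source and it tells you whether that page has opted out of indexing, which is the first thing to rule out in a situation like the one described above.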
-
I've never seen it before, but like everyone here said, it's not a good idea.
This makes me wonder though:
1. Can you find the original page using a snippet? And if not:
2. Is the page contained in the iframe indexed? (Or better-phrased, is the page that is being framed "noindex"?)
It makes sense to me that if the framed page is noindex, that Google would index the content and attribute it to the page framing it.
One perfect example:
I embed videos using an iframe and then make the video unlisted on YouTube. My embedded content is indexed and even displayed as a rich snippet...
-
I have noticed content within iframes being indexed by Google, and text within those iframes being attributed to the page/URL hosting the iframe. I'm not sure how often this applies; I avoid iframes.
Merchant Circle uses them, and their pages get credit for the content in them.
-
It might have been covered already, but Google does seem to ignore iframes when it comes to comment code posted on sites. For instance, here is our text-only cached version: http://webcache.googleusercontent.com/search?q=cache:8IZ95GICp7AJ:gaveltek.com/seoblog/&hl=en&gl=us&strip=1
Compare that to the live page at www.gaveltek.com/seoblog (viewing the page headers makes this easier). The page lists "comments", and despite there being some, they are not shown in the cache. However, I do believe regular WordPress comments hold some weight. That is not to say that Facebook comments do not; they are just measured via different metrics, like social signals, trust, and engagement.
Cheers
TODD
-
A good way to check is to go to google.com and search for your site like this:
site:www.domain.com
That will list your site's pages. Each result has a "Cached" link where you can see exactly what Google cached.
I think they may be getting better at knowing what's in an iframe. Look at how many sites use Facebook comments on their blogs; how do you think that's run? Iframes. Remember Google and Adobe working together on reading PDFs and Flash?
The little magnifying glass (Instant Preview) has some cool technology behind it that I'm sure has helped them learn what's really on a site. Without getting too far off track, I do feel they are getting better at reading iframes. Just my $0.02 in this thread.
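One way to run that kind of check yourself: pull the iframe sources out of a page, then put each one through a site: or cache: lookup. A rough sketch using Python's standard library (the class and function names are mine, and no real URLs are assumed):

```python
from html.parser import HTMLParser

class IframeSrcParser(HTMLParser):
    """Collects the src attribute of every <iframe> on a page."""
    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "iframe":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)

def iframe_sources(html):
    """Return the list of URLs a page pulls in via iframes."""
    parser = IframeSrcParser()
    parser.feed(html)
    return parser.sources
```

Each URL it returns is a candidate for the "is the framed page itself indexed?" question raised earlier in the thread.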
-
Last thought: I've only ever used iframes in the aforementioned example. It's not an ideal way to display your original content if you want it indexed.
-
It is very typical for Google to ignore iframes. I don't know the precise details of your situation but there are several reasons for iframing that might make sense - this is situational - so no hating!
1) You're an affiliate using another party's offer (e.g., a conversion form) that you have to iframe to generate leads.
2) You want to hide duplicate content that appears elsewhere on the site (although there are far more elegant ways to do this).
3) You're pulling video or other syndicated content from a publisher who wants to maintain control (i.e., not let you outrank them with their own content).
*** Remember that the iframed content can certainly be indexed, but usually only at the destination URL's originating source. For example: you are www.insuranceaffiliate.com running an offer from www.insurance.com/form_1011, so you will most likely use insurance.com's form via iframe on your landing page. That form, unless it uses a NOINDEX meta tag, will likely be picked up by the search engines from www.insurance.com but ignored on your site, www.insuranceaffiliate.com.
Hope this helps.
-
I have to agree with Julich in that you should move the content to be truly located on www.domain.com instead of iframe.domain.com.
-
I totally agree that they shouldn't be using iFrames and it is part of my recommendations to them, but we need to work with what we have at the moment.
So just to clarify, you would say that www.domain.com, which is pulling the data through from iframe.domain.com, would rank?
Even though all the content except the navigation, footer, etc., is on iframe.domain.com.
-
Normally, it would be www.domain.com (unless it doesn't provide any content outside the iFrame).
But it is not abnormal to also see iframe.domain.com in the SERPS, since it may have some backlinks pointing to it.
Anyway, using iframes like this is an odd technique, and I recommend you merge that content into www.domain.com if possible (and don't forget to set up 301 redirects to tell Google your pages have permanently moved to www.domain.com).
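The merge-and-redirect advice boils down to a one-to-one URL mapping: every old iframe.domain.com URL should 301 to its counterpart on www.domain.com, preserving path and query string. In practice this lives in server config (an Apache Redirect or nginx return 301), but the mapping rule can be sketched in Python; the hostnames here are the hypothetical ones from this thread:

```python
from urllib.parse import urlsplit, urlunsplit

def redirect_target(old_url, old_host="iframe.domain.com", new_host="www.domain.com"):
    """Map a URL on the old iframe subdomain to its new home on the
    main domain, preserving path, query, and fragment. A 301 redirect
    should send each old URL to exactly this target."""
    parts = urlsplit(old_url)
    if parts.netloc != old_host:
        return old_url  # not on the old subdomain; leave it alone
    return urlunsplit((parts.scheme, new_host, parts.path, parts.query, parts.fragment))
```

Keeping the mapping one-to-one (rather than redirecting everything to the homepage) is what lets Google transfer the old pages' signals to the new locations.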
-
OK, so if www.domain.com was pulling through content from iframe.domain.com, which domain would you expect to rank?
I would personally expect iframe.domain.com to rank, as that is actually where the content is; www.domain.com just provides the frame pointing to that page. I am currently seeing both domains rank, which has led me to ask the question.