Can Google read the content of an iframe and use it for PageRank?
-
Beginner's question:
When a website has its content inside iframes, will Google read it and count it toward PageRank?
-
Content has no "direct" role in affecting PageRank; PR is calculated via backlinks. Further, if you want your content to be crawled, indexed, and ranked in search engines, stay away from frames/iframes.
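To make the crawling point concrete: content inside an iframe lives at its own URL, and search engines generally attribute it there rather than to the embedding page. A minimal sketch with hypothetical URLs:

```html
<!-- Parent page. Google generally indexes the iframed document
     against its own URL (the src), not against this page, so the
     embedded article contributes little to the embedding page. -->
<iframe src="https://content-host.example/article.html"
        title="Embedded article">
  <!-- Fallback link: gives crawlers and non-iframe clients a
       crawlable path to the content. -->
  <a href="https://content-host.example/article.html">Read the article</a>
</iframe>
```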
-
Sorry, I didn't see it...
-
Related Questions
-
Penalized By Google
My site's name is bestmedtour. It's in English. I also want an Arabic version of the site. If I translate it with Google Translate, is it possible that the Arabic version of the site will be penalized?
Intermediate & Advanced SEO | aalinlandacc
-
Our subdomain hosts content that cannot be optimized (static videos) - should I de-index it?
We host static tours on a subdomain that points to the video tour host. I cannot add meta tags or optimize any of these video pages, and there are thousands. Should I de-index the entire subdomain? I am seeing errors for missing meta, duplicate content, etc. in Moz reporting. If yes, do I add the Disallow: /subdomain.root.nl/ to the primary domain's website/CMS or in DNS records? Our web company is saying the nofollow needs to be added in DNS, but I feel like it should be added to the robots.txt file if SERPs are going to acknowledge that the sub is no longer to be crawled and the primary is no longer to be penalized. Thank you so much in advance!
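For what it's worth, robots.txt is resolved per host, so a block for the subdomain belongs in a robots.txt served from the subdomain's own root, not in DNS and not in the primary domain's file. A minimal sketch, using the hostname from the question:

```
# Served at https://subdomain.root.nl/robots.txt
# Applies only to this host; the primary domain's robots.txt
# is never consulted for the subdomain.
User-agent: *
Disallow: /
```

Note that Disallow only stops crawling; pages that are already indexed usually need a noindex (meta tag or X-Robots-Tag header) served while crawling is still allowed before they drop out.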
Intermediate & Advanced SEO | masonmorse
-
Content Strategy/Duplicate Content Issue, rel=canonical question
Hi Mozzers: We have a client who regularly pays to have high-quality content produced for their company blog. When I say 'high quality' I mean 1000 - 2000 word posts written to a technical audience by a lawyer. We recently found out that, prior to the content going on their blog, they're shipping it off to two syndication sites, both of which slap rel=canonical on them. By the time the content makes it to the blog, it has probably appeared in two other places. What are some thoughts about how 'awful' a practice this is? Of course, I'm arguing to them that the ranking of the content on their blog is bound to be suffering and that, at least, they should post to their own site first and, if at all, only post to other sites several weeks out. Does anyone have deeper thinking about this?
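For reference, this is the tag at issue; whichever URL it points to is the version search engines are asked to treat as the original. A minimal sketch with hypothetical URLs:

```html
<!-- In the <head> of each syndicated copy. If the syndication
     partner points this at the client's blog post, the blog
     version keeps the ranking credit: -->
<link rel="canonical" href="https://client-blog.example/original-post/" />
```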
Intermediate & Advanced SEO | Daaveey
-
Fetch as Google
I have an odd scenario; I don't know if anyone can help. I've done some serious speed optimisation on a website, amongst other things a CDN and caching. However, when I do a Search Console Fetch as Google, it is still showing a 1.7-second download time, even though the cached content seems to be delivered in less than 200 ms. The site is using SSL, which obviously creams off a bit of speed, but I still don't understand the huge discrepancy. Could it be that Google is somehow forcing the server to deliver fresh content despite settings to deliver from cache? Thanks in advance
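One way to narrow this down is to compare response headers for a plain request versus one sent with a Googlebot user agent; many caches vary on User-Agent or cookies and may be sending crawler requests straight to the origin, which would explain the gap. A hedged sketch (header names differ by CDN, and the URL is hypothetical):

```sh
# Fetch headers only; look for cache indicators such as Age,
# X-Cache, or CF-Cache-Status (exact names depend on the CDN/host).
curl -I https://www.example.com/
curl -I -A "Googlebot/2.1 (+http://www.google.com/bot.html)" https://www.example.com/
```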
Intermediate & Advanced SEO | seoman10
-
Are there any issues with search engines (other than Google/Bing) reading Protocol-Relative URLs?
Specifically with Baidu and Yandex?
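For anyone reading along, a protocol-relative URL simply omits the scheme and inherits it from the containing page, so the question is whether a given engine resolves it correctly:

```html
<!-- Resolves to https://cdn.example.com/app.js on an HTTPS page
     and http://cdn.example.com/app.js on an HTTP page. -->
<script src="//cdn.example.com/app.js"></script>
```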
Intermediate & Advanced SEO | WikiaSEO
-
How to make an AJAX site crawlable when PushState and #! can't be used?
Dear Mozzers, Does anyone know a solution to make an AJAX site crawlable if:
1. You can't make use of #! (with HTML snapshots) due to tracking in Analytics
2. PushState can't be implemented
Could it be a solution to create two versions of each page (one without #!, so campaigns can be tracked in Analytics, and one with #!, which will be presented to Google)? Or is there another magical solution that works as well? Any input or advice is highly appreciated! Kind regards, Peter
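One avenue that fit this exact situation was the meta fragment tag from Google's AJAX crawling scheme (now deprecated), which let a page without #! in its visible URL advertise that an HTML snapshot exists. A sketch, assuming the server can render snapshots, with a hypothetical URL:

```html
<!-- On the AJAX page (no #! required in the visible URL): -->
<meta name="fragment" content="!">
<!-- The crawler then requests the snapshot at
     https://www.example.com/page?_escaped_fragment_=
     and the server must return pre-rendered HTML there. -->
```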
Intermediate & Advanced SEO | ConversionMob
-
Can you be penalized by a development server with duplicate content?
I developed a site for another company late last year, and after a few months of SEO done by them they were getting good rankings for hundreds of keywords. When Penguin hit they seemed to benefit and had many top-3 rankings. Then their rankings dropped one day in early May. The site is still indexed and they still rank for their domain. After some digging they found the development server had a copy of the site (not a 100% duplicate). We neglected to hide the site from crawlers, although no links had been built and we hadn't done any optimization like meta descriptions, etc. The company was justifiably upset. We contacted Google, let them know the site should not have been indexed, and asked that they reconsider any penalties that may have been placed on the original site. We have not heard back from them as yet. I am wondering if this really was the cause of the penalty, though. Here are a few more facts:
1. Rankings were built during late March/April on an aged domain with a site that went live in December.
2. Between April 14-16 they lost about 250 links, mostly from one domain. They had acquired those links about a month before.
3. They went from 0 to 1,130 links between December and April, then back to around 870 currently.
4. According to ahrefs.com they went from 5 ranked keywords in March to 200 in April to 800 in May, and are now down to 500 and dropping (I believe their data lags by at least a couple of weeks).
So the bottom line is that this site appeared to rank well suddenly for about a month, then got hit with a penalty, and is no longer in the top 10 pages for most keywords. I would love to hear any opinions on whether a duplicate site that had no links could be the cause of this penalty. I have read there is no such thing as a duplicate content penalty per se. I am of the (amateur) opinion that it may have had more to do with the quick, sudden rise in rankings triggering something. Thanks in advance.
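Whatever caused the drop here, the usual belt-and-braces way to keep a development copy out of the index is an indexing directive at the server level, ideally combined with HTTP authentication. A hedged sketch for an Apache dev vhost; the hostname is hypothetical and mod_headers must be enabled:

```apache
# Dev-server vhost: instruct crawlers not to index anything served here.
<VirtualHost *:80>
    ServerName dev.example.com
    # Requires mod_headers
    Header set X-Robots-Tag "noindex, nofollow"
</VirtualHost>
```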
Intermediate & Advanced SEO | rmsmall
-
How can we get a site reconsidered for Google indexing?
We recently completed a re-design for a site and are having trouble getting it indexed. This site may have been penalized previously. They were having issues getting it ranked, and the design was horrible. Any advice on how to get the new site reconsidered so the rank gets where it should be? (Yes, Webmaster Tools is all set up with the sitemap linked.) Many thanks for any help with this one!
Intermediate & Advanced SEO | d25kart