Do Noindexed PDFs Contribute To Page Rank?
-
I have a question I'm hoping you can help me with. If I upload a PDF and apply a noindex robots directive so that the PDF doesn't appear in search results, and I then send people the link to that PDF, does it still contribute to my site's traffic, rankings, etc.?
Basically, we are deciding whether to put some PDFs with pricing options onto our website or onto Google Drive. We will be sending the links to potential clients. If visitors clicking on the link would still help increase our traffic and our Google rankings (without the PDF showing in results), we thought this might be the best solution.
-
Thanks for your insight there Martijn, super helpful.
Yes, Google Drive was what I thought we'd use if there was no SEO advantage to having people view/download the PDFs on our site.
-
Hi,
There are a few things in your post that I want to help clarify:
- Simply sending traffic and links to something doesn't always mean that it'll drive your search rankings.
- Posting a PDF on Google Drive and linking to it won't help your site rank better; in the end the file is hosted on a Google-owned domain, not yours, so as far as I can see it won't help you.
- A PDF hosted on your own site (even with a noindex directive) could help you, but only if people link to it and that link is public for anybody to visit, not just something that you send out in an email.
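One practical note on mechanics: a PDF can't carry an HTML meta robots tag, so a noindex directive for a PDF is set with the X-Robots-Tag HTTP response header instead. A minimal sketch for checking a response's headers, assuming they arrive as a plain dict of names to values (the `is_noindexed` helper is ours, for illustration):

```python
def is_noindexed(headers: dict) -> bool:
    """Return True if the response headers ask crawlers not to index the file.

    Checks the X-Robots-Tag header, which is how a noindex directive is
    applied to non-HTML files such as PDFs (they cannot carry a meta tag).
    """
    # Header names are case-insensitive per HTTP, so normalize the keys.
    value = next((v for k, v in headers.items()
                  if k.lower() == "x-robots-tag"), "")
    # The header holds comma-separated directives, e.g. "noindex, nofollow".
    directives = {d.strip().lower() for d in value.split(",")}
    # "none" is shorthand for "noindex, nofollow".
    return "noindex" in directives or "none" in directives

# Example: headers as a server might send them for a pricing PDF.
print(is_noindexed({"Content-Type": "application/pdf",
                    "X-Robots-Tag": "noindex, nofollow"}))  # True
print(is_noindexed({"Content-Type": "application/pdf"}))    # False
```

Either way, the noindex only keeps the file out of search results; visits from people you email the link to still happen normally.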
Martijn
Related Questions
-
Have you ever seen a page get indexed from a website that is blocked by robots.txt?
Hi all, We use the robots.txt file and meta robots tags to block bots from crawling a website or individual pages. Mostly robots.txt is used site-wide, with the expectation that none of the pages get indexed. But there is a catch: any page from the site can still be indexed by Google even when the site is blocked by robots.txt, because the crawler may find a link to the page somewhere else on the internet, as stated here in the last paragraph. I wonder if this is really the case and some such pages have actually been indexed. And if we use meta tags at the page level, do we still need to block via robots.txt? Can we use both techniques at the same time? Thanks
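On the question's central point: robots.txt blocks crawling, not indexing, so a blocked URL that is linked from elsewhere can still be indexed (without its content), and a page-level noindex tag can't help there, because the blocked crawler never fetches the page to read the tag. Python's standard library can show what a given robots.txt actually blocks; a small sketch with made-up rules and URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, for illustration only.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot may not crawl this page...
print(parser.can_fetch("Googlebot", "https://example.com/private/pricing"))  # False
# ...but crawling and indexing are separate concerns: if another site links
# to the blocked URL, Google can still index the URL itself despite the rule.
print(parser.can_fetch("Googlebot", "https://example.com/public/about"))  # True
```

So the two techniques don't combine the way one might hope: to guarantee a page stays out of the index via its noindex tag, the crawler must be *allowed* to fetch it.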
Algorithm Updates | vtmoz
-
Does Google consider the cached content of a page if it's redirected to a new page?
Hi all, If we redirect an old page to some new page, we know that content relevancy between the source page and the new page matters to Google. I just wonder whether Google also looks at the relevancy between the old page's content (from its cache) and the new page. Thanks
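How Google actually measures relevancy between a redirect's source and target isn't public, but the intuition can be illustrated with simple token overlap. A toy sketch (Jaccard similarity over word sets; purely illustrative, not Google's method, and the example texts are made up):

```python
def jaccard_similarity(text_a: str, text_b: str) -> float:
    """Overlap of the two texts' word sets: |A ∩ B| / |A ∪ B|."""
    a = set(text_a.lower().split())
    b = set(text_b.lower().split())
    if not (a or b):
        return 0.0
    return len(a & b) / len(a | b)

old_page = "construction site fencing rental prices and options"
new_page = "construction site fencing rental options and pricing"
unrelated = "weight loss camp schedule for summer"

# A redirect target covering the same topic scores high...
print(jaccard_similarity(old_page, new_page))   # 0.75
# ...while an unrelated target scores near zero.
print(jaccard_similarity(old_page, unrelated))  # 0.0
```

The practical takeaway matches the question's premise: the closer the new page's content is to what the old page covered, the more naturally the redirect is treated as a true replacement rather than a soft 404.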
Algorithm Updates | vtmoz
-
Absolutely gigantic drop in Ranking overnight
The site in question has been operating for around three months, and for the first two months it was nowhere to be seen in Google results (literally page 10 or 11 for all but branded searches). Then, through a lot of hard work and the help of Moz Pro, I managed to build it to a respectable 6th-10th place for the main keyphrases. Overnight on Wednesday we suddenly dropped half a dozen places or so and were rooted firmly mid-table on page 2 for both these phrases. Last night, however, we suddenly dropped from 14th to the very last result on page 10 for the biggest keyphrase, and through the day it has slowly slid further to currently sit 8th on page 11. Nothing is showing in the dashboard, and there's no reason I know of to be penalised; even if we were penalised, surely that would affect the site as a whole and not just one keyphrase? The weird thing is, when you look at the first two pages of the search, a number of items have changed: a competitor which closed down is suddenly sitting mid-table, other newcomers that were doing well have dropped, and some of the older faces have suddenly gone up. If I didn't know better, I would say that Google has suffered a timewarp and is now serving results from a month ago for this keyphrase. I've now noticed that MozCast reported very stormy weather last night, but surely an algorithm change couldn't produce such a massive drop for a site that has followed the rules (I refuse to link with a site that could even remotely be thought of as spam). Someone please tell me that Google has been glitching and all will be well when I wake in the morning? Peter
Algorithm Updates | Pwhitfield
-
Should plural keyword variations get their own targeted pages?
I am in the middle of changing a website from targeting just a single keyword on all pages to having each page target its own keyword/phrase. However, I'm a little conflicted about whether plural forms and other suffix (-ing) variations are different enough to get their own pages. SERPs show different results for each keyword searched. Relevancy reports for the keywords also score some differently and some the same. Is it better to instead use these as secondary and tertiary keywords on the same page as the main keyword? See the example below:
OPTION A (use each for a different page):
- Page 1 - Construction Fence
- Page 2 - Construction Fences
- Page 3 - Construction Fencing
- Page 4 - Construction Site Fence
- Page 5 - Construction Site Fences
- Page 6 - Construction Site Fencing
OPTION B (use as variations on the same page):
- Page 1 - Construction Fence, Construction Fences, Construction Fencing
- Page 2 - Construction Site Fence, Construction Site Fences, Site Construction Fencing
Any help is greatly appreciated. Thanks!
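One way to sanity-check Option B groupings is to cluster variants that reduce to the same stem and give each cluster a single page. A deliberately naive sketch (the suffix rules below are crude and ours alone; a real keyword analysis would also weigh search volume and SERP overlap):

```python
def crude_stem(word: str) -> str:
    """Strip a few common English suffixes; deliberately naive, not a real stemmer."""
    for suffix in ("ing", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            word = word[: -len(suffix)]
            break
    if word.endswith("e") and len(word) > 3:  # so fence/fences/fencing all -> "fenc"
        word = word[:-1]
    return word

def group_variants(phrases):
    """Map each stemmed phrase to the list of surface variants it covers."""
    groups = {}
    for phrase in phrases:
        key = " ".join(crude_stem(w) for w in phrase.lower().split())
        groups.setdefault(key, []).append(phrase)
    return groups

keywords = [
    "construction fence", "construction fences", "construction fencing",
    "construction site fence", "construction site fences",
]
for stem, variants in group_variants(keywords).items():
    # Each group is a candidate for one page targeting all of its variants.
    print(stem, "->", variants)
```

Here the five phrases collapse into two clusters, which maps onto Option B's two pages rather than Option A's six.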
Algorithm Updates | pac-cooper
-
Do we take an SEO hit for condensing content onto an infinite-scroll page vs. a site with many pages/URLs? If we do take a hit, how big would it be?
We are redesigning a preschool website which has over 100 pages. We are looking at two options and want to deliver the best user experience and SEO. Option 1 is to condense the site into perhaps 10 pages and window-shade the content: for instance, the curriculum page would have an overview, and each age-group program would open via a window shade. Option 2 is to have an overview page, with each age program linking to its own page. Do we lose out on SEO if there are no unique URLs? Or is there a way, using meta tags or other programming, to achieve the same effect?
Algorithm Updates | jgodwin
-
Sudden Page Rank Drop for Weight Loss Camp
My site BalanceME.com has been performing really well and had been climbing continually, but in the last seven days we have had a drop. We went from #2 for "weight loss camp" to #6, and even further on other terms. It is mid-May and I cannot find any explanation for the sudden drop. I am hoping someone could give me some idea of what the answer could be. Recently we have had two press releases go out, targeted for keywords, and we also installed images that navigate to new pages on the site. There have been few other additions to the site, and nothing that can explain the drop. Other companies in our group have not had issues with recent drops, and we use the same practices with the other organizations. Thank you for your time.
Algorithm Updates | FVdBeuken
-
Puzzled by recent SERP results - what ranking factors cause this?
Hi mozzers, I have been using Moz tools for a long while now to assess SEO metrics, with great success, but since the recent Google algorithm updates I am seeing more and more SERPs that simply don't make sense to me whatsoever. The most startling recently was an assessment of one of the keywords my own site competes for: "link building services". The current #1 position in Google.co.uk is held by http://www.napalit.org/. I invite all you SEO experts out there to take a look at this site, look at its metrics in OSE compared with the "lower" competition, and explain to me why it is #1. I would really like to know what ranking factors make this site "higher quality" and of "better value" than the competition. I thought I understood what Google was trying to do with recent updates: get rid of non-value-adding spammers and improve the quality of the search results. But now I am becoming more sceptical. Are they just making it impossible for us to make a difference by following good SEO practices, so we all resort to paying for AdWords? I hope you guys out there can help me with this one and restore my faith. Thanks
Algorithm Updates | websearchseo
-
Why does Google say they have more URLs indexed for my site than they really do?
When I do a site: search with Google (i.e. site:www.mysite.com), Google reports "About 7,500 results", but when I click through to the end of the results and choose to include omitted results, Google really has only 210 results for my site. I had an issue months back with a large number of URLs being indexed because of query strings and some other non-optimized technicalities; at that time I could see that Google really had indexed all of those URLs. I've since implemented canonical URLs and fixed most (if not all) of my technical issues in order to get our index count down. At first I thought it would just be a matter of time for Google to reconcile this, perhaps because they were looking at cached data, but it's been months and the "About 7,500 results" figure just won't change, even though the actual number of pages indexed keeps dropping! Does anyone know why Google would still report a high index count that doesn't reflect what is currently indexed? Thanks!
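The mismatch described here is typical of duplicate URLs (query strings and the like) collapsing onto canonical URLs over time, while the "About N results" figure is only a rough estimate. A toy sketch of how a canonical policy shrinks an index footprint (the URLs and the helper are made up for illustration; real pages declare canonicals with a `<link rel="canonical">` tag rather than a rule like this):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Drop the query string and fragment, a common canonical-URL policy."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

# Hypothetical crawled URLs, including query-string duplicates.
crawled = [
    "https://www.mysite.com/products?sort=price",
    "https://www.mysite.com/products?sort=name",
    "https://www.mysite.com/products",
    "https://www.mysite.com/about",
]

# De-duplicating through the canonical rule leaves far fewer real pages.
unique = {canonicalize(u) for u in crawled}
print(len(crawled), "crawled URLs ->", len(unique), "canonical pages")
```

The same collapse on a larger scale would explain seeing 210 real results behind a stale "About 7,500" estimate.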
Algorithm Updates | CassisGroup