Login webpage blocked by robots
-
Hi, the SEOmoz crawl diagnostics show that this page:
www.tarifakitesurfcamp.com/wp-login.php is blocked (noindex, nofollow).
Is there any problem with that?
-
thanks!
-
Unless the login page holds information that's relevant to your users (i.e. it's more than just for your private use), it's probably a good idea not to index it!
-
Nope, that's perfectly fine since that's your login page for WordPress.
If you're linking to the page from anywhere on your site (which you really shouldn't be), you could update the meta robots tag to (noindex, FOLLOW), but since it looks like nothing links to the page, that shouldn't be necessary.
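For reference, here's what these tags look like in the page's head section. The first is the kind of tag Moz is reporting on wp-login.php; the second is the (noindex, FOLLOW) variant mentioned above:

<!-- What the crawl report is seeing: keep the page out of the index, don't follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- The variant that still lets crawlers follow the page's links -->
<meta name="robots" content="noindex, follow">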
Related Questions
-
What is the best way to block http://www.site.com/members/...
How do I block http://www.site.com/members/....name/activity/3202 and many more URLs like this from getting spidered and showing up as duplicates in Moz? Regards, Tai
On-Page Optimization | Taiger
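If all of those URLs live under /members/ (an assumption based on the example above), one common approach is a directory rule in robots.txt, sketched here:

# robots.txt: keep compliant crawlers out of all member profile and activity pages
User-agent: *
Disallow: /members/
-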
Magento Canonical & Default Robots Settings
Hello! I'm working with Magento 1.9 for an eCommerce site with several hundred products. Currently I understand it is best practice to use the canonical tag; however, I also have my default robots set to "Index, Follow". Will it cause an issue to have product pages set to index, follow while also including a canonical tag? What are some best practices regarding Magento's default robots and canonical tags? Any help is appreciated.
On-Page Optimization | BretDarby
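For what it's worth, the two tags routinely sit side by side in a product page's head. A sketch, with a hypothetical product URL:

<head>
  <!-- Names the preferred URL for this product, consolidating any duplicate versions -->
  <link rel="canonical" href="https://www.example.com/widget.html">
  <!-- Explicitly allows indexing and link following (this is also the default behavior) -->
  <meta name="robots" content="index, follow">
</head>
-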
One Webpage per Topic or splitting up for better reading...?
What is better from an SEO point of view? I am currently building a website whose main topic is renewable energies. There will be a menu listing all kinds of energy types: biogas, CSP, biomass, etc. And now my question: each topic has about 800-1,000 words of unique content with sub-topics. I think it's certainly good to have one separate page for each energy type, but I don't think it's a good idea to also split the sub-topics up into further sub-pages like:
www.energy.com/renewable-energies-biomass.html
www.energy.com/renewable-energies-biomass-eficiency.html
www.energy.com/renewable-energies-biomass-market.html
www.energy.com/renewable-energies-biomass-industries.html
as 1,000 words on one page may look like higher-quality content than 3-4 pages with just 200 words each, all talking about biomass but from several points of view. So I think it's better to put everything about biomass on one single page and use a menu to jump to the sub-topics via anchor tags. Right? 🙂 Thanks Kate and Charles! Meanwhile I found out the right term for my question: "pagination". I read about using the rel="next" and rel="prev" attributes when paginating an article over several pages.
MY DOUBT: sometimes you see a single page paginated with JavaScript that hides text for better reading, although all of it is in the page source. Does Google like that, or might it treat it as hidden text with a spamming purpose? So I think using old-school named anchors to divide a text of about 1,000 words into topics is better than using JavaScript that reveals text via pagination or expand/collapse.
On-Page Optimization | inlinear
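A minimal sketch of that named-anchor setup, with section IDs invented for illustration:

<!-- In-page menu that jumps to each sub-topic on the same URL -->
<nav>
  <a href="#efficiency">Efficiency</a>
  <a href="#market">Market</a>
  <a href="#industries">Industries</a>
</nav>

<h2 id="efficiency">Biomass Efficiency</h2>
<p>...</p>
<h2 id="market">Biomass Market</h2>
<p>...</p>
<h2 id="industries">Biomass Industries</h2>
<p>...</p>
-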
Blocking Pages on an E-Commerce Site
Hello, I am working on a site with thousands of product pages, some of which have no inventory in them. Should we be blocking these pages in order to reduce bounce rate? How could I manage so many pages efficiently? It would take weeks to comb through the pages and determine which have inventory and which do not. They are also time-sensitive, as they are live events, so dates are always changing. Thanks!
On-Page Optimization | TP_Marketing
-
Disallow a spammed sub-page in robots.txt
Hi, I have a sub-page on my website with a lot of spam links pointing at it. I was wondering whether Google will ignore those spam links if I hide the page using robots.txt. Will that get the page off Google's radar, or is it useless?
On-Page Optimization | Lakiscy
-
Right way to block Google robots from PPC landing pages
What is the right way to completely block SEO robots from my AdWords landing pages? Robots.txt does not work very well for that, as far as I know. Adding noindex, nofollow meta tags, on the other hand, will block the AdWords robot as well, right? Thank you very much, Serge
On-Page Optimization | Kotkov
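One relevant detail: Google's AdsBot crawlers only obey robots.txt rules that name them explicitly, so a generic Disallow keeps organic crawlers out while ad checks keep working. A sketch, assuming the landing pages live under a hypothetical /landing/ directory:

# robots.txt: organic-search crawlers stay out of the PPC landing pages
User-agent: *
Disallow: /landing/

# AdsBot-Google ignores the generic rule above, so this just makes its access explicit
User-agent: AdsBot-Google
Allow: /
-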
Can duplicate content issues be solved with a noindex robots meta tag?
Hi all, I have a number of duplicate content issues arising from a recent crawl diagnostics report. Would using a robots meta tag (like the one below) on the pages I don't mind keeping out of the index be an effective way to solve the problem? Thanks for any / all replies.
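Presumably a standard robots meta directive along these lines (a reconstruction; the exact content value is an assumption):

<!-- Asks engines not to index this page; its links may still be followed -->
<meta name="robots" content="noindex">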
On-Page Optimization | joeprice
-
Does Google respect User-agent rules in robots.txt?
We want to use an inline linking tool (LinkSmart) to cross-link between a few key content types on our online news site. LinkSmart uses a bot to establish the linking. The issue: there are millions of pages on our site that we don't want LinkSmart to spider and process for cross-linking. LinkSmart suggested setting a noindex tag on the pages we don't want them to process, and targeting the rule to their specific user agent. I have concerns. We don't want to inadvertently block search engine access to those millions of pages. I've seen Googlebot ignore nofollow rules set at the page level. Does it ever arbitrarily obey rules that it's been directed to ignore? Can you quantify the level of risk in setting user-agent-specific noindex tags on pages we want search engines to crawl, but that we want LinkSmart to ignore?
On-Page Optimization | lzhao
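For reference, the user-agent-targeted form of the robots meta tag puts the crawler's token in the name attribute instead of "robots". The linksmart-bot token below is a made-up placeholder, not LinkSmart's real user agent (their documentation would have the actual token):

<!-- Applies only to the named crawler; all other bots ignore this tag -->
<meta name="linksmart-bot" content="noindex">

<!-- The generic form applies to every crawler, including Googlebot,
     which is exactly the risk being asked about -->
<meta name="robots" content="noindex">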