Duplicate errors from WordPress login redirects
-
I have some duplicate-content issues showing up in Moz Analytics, caused by a Q&A plugin on a WordPress website that prompts the user to log in. There are a number of links like the one shown below, all leading to the login page:
www.website.com/wp-login.php?redirect_to=http%3A%2F%2Fwww.website.com%question%2....
What's the best way to deal with this?
--
extra info:
-
This is only showing up in Moz Analytics; Google Webmaster Tools reports no duplicates. I'm guessing this is down to the 'redirect_to' parameter being effective in grouping the URLs for Googlebot.
-
Currently, wp-login.php and the consequent redirects are 'noindex, follow'. I can't see where this is being generated in wp-login.php in order to change it to nofollow (if that would even solve it).
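For reference: depending on the WordPress version, that tag is most likely output by the wp_no_robots() function, which login_header() in wp-login.php hooks to the login_head action. The tag it prints is roughly:

```html
<meta name='robots' content='noindex,follow' />
```

Rather than editing core (any change would be lost on the next upgrade), a plugin could hook login_head at a later priority and print its own robots meta tag; but blocking crawling in robots.txt, as suggested elsewhere in this thread, is the cleaner fix here.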
-
Yeah, I'd already blocked some duplicates from a BuddyPress issue, so I didn't want to jump in and block straight away without further investigation. Good to know that blocking is the best solution to keep things clean. Cheers for answering, Dan.
-
Greg
That's right, the best way is to block crawling with robots.txt - it makes sense to keep crawling clean and efficient. If you're using Yoast you can edit robots.txt right in there, or you can do it via FTP.
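As a sketch (assuming a standard install with wp-login.php at the site root), the robots.txt rule would look like:

```
User-agent: *
Disallow: /wp-login.php
```

Because Disallow works as a prefix match on the URL path, this also covers all the ?redirect_to=... variants of the login URL.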
-
Thanks Dan, that's really helpful...
Webmaster Tools reports no crawl issues or anything strange, and the number of crawled pages matches the site size. I've performed a Screaming Frog crawl with the suggested settings, and it IS seeing those redirected pages - 48 in total - which matches the number Moz Analytics is reporting.
The actual page these redirected URLs end up at is a CAPTCHA page. This is an unneeded layer of 'extra' security put in place by the hosting company after the spate of WordPress hacks last year. (Users with a cookie showing they'd recently passed the CAPTCHA would arrive at the WordPress login page.) As such, we don't have any control over the code on that page.
So I guess that even though WMT isn't complaining about these duplicates, blocking with robots.txt is still the solution to keep things clean and tidy, huh?
-
Greg
Generally, if you're not seeing this in Webmaster Tools or Screaming Frog (have you tried a crawl there yet?), then it's probably not an issue. Crawl the site with Screaming Frog; if you keep the default settings (honor robots.txt and don't follow nofollows) and set the user-agent to Googlebot, it will give a pretty accurate representation of what Google is doing. If the pages don't pop up, you should be fine.
Also, check Webmaster Tools' "Crawl Stats": on average, is Google crawling an abnormal number of pages compared to the "normal" site size?
If it is a problem, you can always block the URLs with robots.txt.
-Dan
-
For login and logout pages, always use noindex, nofollow so you won't face problems like this again.
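A minimal sketch of one way to apply this without editing WordPress core, assuming Apache with mod_headers enabled, is an X-Robots-Tag response header in .htaccess:

```apache
# Send a noindex, nofollow robots header for the login page
<Files "wp-login.php">
  Header set X-Robots-Tag "noindex, nofollow"
</Files>
```

Note that a URL blocked in robots.txt is never fetched, so search engines would not see this header for it; pick one approach or the other deliberately.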