Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Lost with canonical, nofollow, noindex. Not sure how to use them on a dynamic PHP site with multiple region select options
-
I have a site with multiple regions. The main page after a region is selected is login.php, but the regions are defined by ?rid=11, 12, etc. These are being picked up as duplicate content, but they are all different regions. As I hired external PHP coders to develop most of the site, I am scared to start meddling with any of the raw code, and would like some advice on how to stop these showing as duplicate content. Should I use noindex, nofollow, or canonical? If canonical, how do I set it up on the main login.php page? P.S. I am an extreme newbie to SEO.
-
Great! Looking forward to seeing your results.
-
Hi mememax,
Thank you for the simple yet detailed explanation; it really helped me out a lot.
I will try the canonical and let you know how it goes.
-
Hi Moby, first of all, here's the difference between the tags you mention (markup for each is sketched below):
- canonical: a link tag placed in the page's head that indicates which page, among several containing the same content, is the original one. It can point from one page to another to mark the main version, or point at the very page it sits on (a self-canonical), which helps you avoid duplicates caused by session IDs and other parameters (tracking tags and the like).
- noindex: a meta tag that tells bots not to include the page in their index. It's used to keep low-quality pages from showing up in the SERPs.
- nofollow: works two ways. As a meta tag, it tells search engines not to follow any of the links on a page; attached to an individual link, it tells bots not to pass value through that link.
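For reference, here's roughly what each of these looks like in markup. This is purely an illustration; the URLs are placeholders, not your real ones.

```php
<head>
    <!-- canonical: a <link> element in the head pointing at the "master" URL -->
    <link rel="canonical" href="https://www.example.com/login.php?rid=11" />

    <!-- noindex: keeps this page out of the index; change the content to
         "noindex, nofollow" if you also want bots to ignore every link here -->
    <meta name="robots" content="noindex" />
</head>
<body>
    <!-- nofollow as a link attribute: stops value passing through this one link -->
    <a href="https://www.example.com/some-page" rel="nofollow">a link</a>
</body>
```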
So if you've got pages that are identical, you want to use the canonical on both the original page (as a self-canonical) and on the others, pointing to the original one, whether they're fully or almost identical (a sketch for your login.php follows this list). If they're not, you may want to:
- noindex those duplicates
- add quality content to the other region pages so Google finds that they're not identical at all.
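Since you asked how to set the canonical up on login.php itself, here is a minimal sketch. It assumes your region IDs are a known list and uses example.com as a stand-in domain; I can't see your code, so treat the IDs, domain, and variable names as placeholders rather than a drop-in.

```php
<?php
// Minimal sketch for the top of login.php.
// Assumptions: region IDs 11, 12, 13 and example.com are placeholders.
$validRegions = array(11, 12, 13);

$rid = isset($_GET['rid']) ? (int) $_GET['rid'] : 0;

if (in_array($rid, $validRegions, true)) {
    // Self-canonical: each region page points at its own clean URL, so any
    // session IDs or tracking parameters collapse back to this version.
    $canonicalUrl = 'https://www.example.com/login.php?rid=' . $rid;
} else {
    // No region (or an unknown one): point at the plain login page.
    $canonicalUrl = 'https://www.example.com/login.php';
}
?>
<head>
    <title>Login</title>
    <link rel="canonical" href="<?php echo htmlspecialchars($canonicalUrl); ?>" />
</head>
```

That way each region URL declares itself as the original, so Google should stop folding them together, provided the on-page content actually differs per region.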
Maybe your region pages are seen as identical because of:
- low content: an image and slight changes to the title tag are not enough to mark a page as different. Try to add good content, or combine content from other pages.
- login walls: is the content accessible when not logged in? If not, Google is seeing identical pages because its bots cannot crawl the content (bots cannot log in).
As for the login page, you could noindex it, but I don't see any hurry to make changes there.
Hope that helps!
Related Questions
-
Good to use disallow or noindex for these?
Hello everyone, I am reaching out to seek your expert advice on a few technical SEO aspects related to my website. I highly value your expertise in this field and would greatly appreciate your insights.
Technical SEO | williamhuynh
Below are the specific areas I would like to discuss:

a. Double and triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page. These URLs are as follows: https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric
Considering the need to optimize my crawl budget, I would like to seek your advice on whether it would be advisable to disallow or noindex these pages. My understanding is that by disallowing or noindexing these URLs, search engines can avoid wasting resources on crawling and indexing duplicate or filtered content. I would greatly appreciate your guidance on this matter.

b. Page URLs with parameters: I have noticed that some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, I would like to understand whether it is still recommended to disallow or noindex them to further conserve crawl budget. My understanding is that by doing so, search engines can prevent the unnecessary expenditure of resources on indexing redundant variations of the same content. I would be grateful for your expert opinion on this matter.

Additionally, I would be delighted if you could provide any suggestions regarding internal linking strategies tailored to my website's structure and content. Any insights or recommendations you can offer would be highly valuable to me. Thank you in advance for your time and expertise in addressing these concerns. I genuinely appreciate your assistance. If you require any further information or clarification, please let me know. I look forward to hearing from you. Cheers!
-
SEO for Forum Sites
I have a forum site that I opened 2 months ago, but there is a problem. Even though my content is unique, my site's keyword rankings are constantly changing. Sometimes my site drops out of the first 500, then comes back up to the 70s. I haven't done any off-page SEO for the site. What is the problem?
Technical SEO | tutarmi
-
How does Google Crawl Multi-Regional Sites?
I've been reading up on this in Webmaster Tools but just wanted to see if anyone could explain it a bit better. I have a website going live soon which will be set up to redirect to a localised URL based on the IP address, i.e. NZ IP ranges will go to .co.nz, Australian IP addresses will go to .com.au, and US or other non-specified IP addresses will go to the .com address. There is a single CMS installation for the website. Does this impact the way in which Google is able to crawl the site? Will all domains be crawled or just one? Any help would be great - thanks!
Technical SEO | lemonz
-
Does posting an article on multiple sites hurt SEO?
A client of mine creates thought leadership articles and pitches multiple sites to host the article to reach different audiences. The sites that pick it up are places such as AdAge and MarketingProfs, and we do get link juice from these sources most of the time. Does having the same article on these sites as well as your own hurt your SEO efforts in any way? Could it be recognized as duplicate content? I know the links are great; I'm just wondering if there are any other side effects, especially when no links are provided! Thank you!
Technical SEO | Scratch_MM
-
Same Video on Multiple Pages and Sites... Duplicate Issues?
We're rolling out quite a bit of pro video and hosting on a third-party platform/player (likely BrightCove) that also allows us to have the URL reside on our domain. Here is a scenario for a particular video asset: A. It's on a product page that the video is relevant for. B. We have an entry on our blog with the video. C. We have a separate section of our site, "Video Library", that provides a centralized view of all videos. It's there too. D. We eventually give the video to other sites (bloggers, industry educational sites, etc.) for outreach and link building. A through C on our domain are all for user experience, as every page is very relevant, but are there any duplicate video issues here? We would likely only have the transcript on the product page (though we're open to suggestions). Any related feedback would be appreciated. We want to make this scalable and done properly from the beginning (will be rolling out 1000+ videos in 2010).
Technical SEO | SEOPA
-
Does Google use the Wayback Machine to determine the age of a site?
I have a site that I had removed from the Wayback Machine because I didn't want old versions to show. However, I've noticed that many SEO tools now show a domain age of zero instead of the 6 years since I registered it. My question is: what do the actual search engines use to determine age when they factor it into the ranking algorithm? By having the site removed from the Wayback Machine, have I made the search engines think it is brand new? Thanks
Technical SEO | FastLearner
-
Google.ca is showing our US site instead of our Canadian site
When our Canadian users search on google.ca for our brand (e.g. Travelocity, Travelocity hotels, etc.), the first few results are from our US site (travelocity.com) rather than our Canadian site (travelocity.ca). In Google Webmaster Tools, we've adjusted the geotargeting settings to focus on the appropriate locale, but the wrong country TLD still comes up at the top on google.ca. What's the best way to ensure our Canadian site comes up instead of the US site on google.ca? Thanks, Tory Smith, Travelocity
Technical SEO | travelocitysearch
-
Using a third-party server to host site elements
Hi guys - I have a client who has recently been experiencing a great deal more traffic to their site. As a result, their web development agency has given them a server upgrade to cope with the new demand. One thing they have also done is put all website scripts, CSS files, images, and downloadable content (such as PDFs) onto a third-party server (Amazon S3). Apparently this was done so that my client's server just handles the page requests now, and all other elements are then grabbed from the Amazon S3 server. So basically, this means any HTML content and web pages are still hosted through my client's domain, but all other content is accessible through an Amazon S3 server URL. I'm wondering what SEO implications this will have for my client's domain? While all pages and HTML content are still accessible through their domain name, each page is of course now making many server calls to the Amazon S3 server through external URLs (s3.amazonaws.com). I imagine this will mean any elements sitting on the Amazon S3 server can no longer contribute value to the client's SEO profile, because that actual content is not physically part of their domain anymore. However, what I am more concerned about is whether all of these external server calls are going to have a negative effect on the web pages' value overall. Should I be advising my client to ensure all site elements are hosted on their own server, and therefore all elements are accessible through their domain? Hope this makes sense (I'm not the best at explaining things!)
Technical SEO | zealmedia