Parameter Handling - NoURLs Question
-
We're trying to make sense of Google's new parameter handling options, and I can't seem to find a good answer to an issue regarding the NoURLs option.
For example, we have two URLs pointing to the same content:
Ideally, I would want Google to index only the main URL without any parameters, so http://www.propertyshark.com/mason/ny/New-York-City/Maps/Manhattan-Apartment-Sales-Map
To do this, I would set the value to "No URLs" for the zoom, x and y parameters. By doing this, do we still get any SEO value from backlinks that point to the URLs with the parameters, or will Google just ignore them?
-
I think canonicalization is the best option for your case. Add a canonical tag on http://www.propertyshark.com/mason/ny/New-York-City/Maps/Manhattan-Apartment-Sales-Map?zoom=2&x=0.518&y=0.3965 pointing to http://www.propertyshark.com/mason/ny/New-York-City/Maps/Manhattan-Apartment-Sales-Map. Google will then show http://www.propertyshark.com/mason/ny/New-York-City/Maps/Manhattan-Apartment-Sales-Map in search results, and the problem will be solved.
Google will still count backlinks to those URLs, but if you choose the NoURLs option, Google will not index them again.
If you use the canonical tag, the link juice from all incoming links related to that page is consolidated on the main page.
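As a concrete sketch of that answer (using the poster's URLs; this is the standard rel=canonical pattern, not markup taken from their actual pages), the tag would sit in the <head> of the parameterized URL:

```html
<!-- In the <head> of the parameterized page, e.g.
     .../Manhattan-Apartment-Sales-Map?zoom=2&x=0.518&y=0.3965 -->
<link rel="canonical"
      href="http://www.propertyshark.com/mason/ny/New-York-City/Maps/Manhattan-Apartment-Sales-Map">
```

Every parameter variant (any zoom/x/y combination) should carry the same tag, so link equity from all of them consolidates on the clean URL.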
Related Questions
-
Best way to handle 301 redirects on a business directory
We work with quite a few sites that promote retail traders and feature a traders' directory with pages for each of the shops (around 500 listings in most cases). As these are retail strips, shops come and go all the time, so a lot of pages get removed when a business is no longer present. Currently I've been doing 301 redirects to the home page of the directory if you try to access a deleted trader page, but this means an ever-growing htaccess file with thousands of 301 redirects. Are we handling this the best way, or is there a better way to tackle this situation?
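For what it's worth, a single pattern-based rule can replace thousands of one-off lines. This is only a sketch, assuming Apache with mod_rewrite and a hypothetical /traders/ path for the directory; if the pages are generated by a CMS rather than stored as files, the "page no longer exists" check would need to happen in the application instead:

```apache
# .htaccess sketch (hypothetical paths): 301 any request for a
# trader page that no longer exists on disk to the directory home,
# instead of keeping one Redirect line per deleted listing.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^traders/. /traders/ [R=301,L]
```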
Technical SEO | Assemblo
-
URL structuring / redirect question
Hi there, I have a URL structuring / redirect question. I have many pages on my site, but I set each page up to fall under one of two folders, as I serve two unique markets and want each side to be indexed properly. I have SIDE A: www.domain/FOLDER-A.com and SIDE B: www.domain/FOLDER-B. The problem is that I have a page for www.domain.com and www.domain/FOLDER-A/page1.com, but I do NOT have a page for www.domain/FOLDER-A. The reason for this is that I've opted to make what would be www.domain/FOLDER-A be www.domain.com and act as the primary landing page of the site. As a result, there is no page located at www.domain/FOLDER-A. My WordPress template (Divi by Elegant Themes) forced me to create a blank page to be able to build off the FOLDER-A framework. My question is: given I am forced to have this blank page, do I leave it be, or create a 302 or 307 redirect to www.domain.com? I fear using a 301 redirect, given I may want to utilize this page for content at some point in the future. This isn't the easiest post to follow, so please let me know if I need to restate the question. Many thanks in advance!
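A minimal sketch of the temporary-redirect option the poster describes, assuming Apache (the folder name here stands in for the poster's FOLDER-A):

```apache
# 302 the blank /folder-a/ framework page to the homepage for now;
# remove this (or swap 302 for 301) once the page has real content.
Redirect 302 /folder-a/ /
```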
Technical SEO | KurtWSEO
-
Sharing/hosting of content questions...
I just wanted to get opinions on some of the fundamentals and semantics of optimisation and content generation/distribution - your thoughts and opinions are welcome. OK, for example, let's assume (for illustration purposes) that I have a site - www.examplegolfer.com - aimed at golfers, with golf-related content. The keywords I would like to optimise for are: golf balls, golf tees, lowering your golf handicap, drive a golf ball further. Now, I'm going to be creating informative, useful content (infographics, articles, how-to guides, video demonstrations etc.) centred around these topics/keywords, which hopefully our audience/prospects will find useful and bookmark, share, and mention our site/brand on the web, increasing (over time) our position for these terms/keywords in the SERPs. Now, once I've researched and created my content piece, where should I place it? Let's assume it's an infographic - should this be hosted on an infographic-sharing site (such as Visually), on my site, or both? If it's hosted or embedded on my site, should this be in a blog or on the page I'm optimising (and have built my keyword around)? For example, if my infographic is about golf balls, should it be embedded on the page www.examplegolfer.com/golf-balls (the page I'm trying to optimise)? And if it's also placed elsewhere around the internet (i.e. on Visually, for example), could this technically be seen as duplicate content, since the infographic is both on my site and on Visually? How does everyone else share/distribute/host their created content in various locations whilst avoiding the duplicate content issue? Or have I missed something? Also, how important is it to include my keyword (golf balls) in the piece's title or anchor text? Or indeed within the piece itself? One final question - should the content be authored/shared as the brand/company, or by an individual (a spokesperson, if you like) on behalf of the company (i.e. John Smith)?
I'm all for creating great, interesting, useful content for my audience; however, I want to ensure we're getting the most out of it, as researching influencers, researching the piece, creating it and distributing it isn't a quick or easy job (as we all know!). Thoughts and comments welcome. Thanks!
Technical SEO | Carl287
-
How to handle city-based product selection and duplicate content?
Hi everyone, I've been searching the interwebs for a solution to my problem, but haven't really found anything conclusive. I've got a client with duplicate content issues; they not only have a nation-wide website, but also 10 different sub-categories for different cities, with each sub-category having the same content as the main website. The reason they wanted city-based sites was the changing product offerings in each city, so City 1 may not have all the products available that City 2 does. Needless to say, this has caused some duplicate content issues, as most sections of the website have been multiplied by 10. When a visitor lands on any page of the website, they are greeted by a pop-up asking for their location, which then redirects them to their selected version of the website. As the copy cannot really be changed enough for each city to make it unique, I've been looking into canonical tags, but this would mean the localised versions will not be indexed by Google. Has anyone had experience of a similar situation, where the product range changes according to location, but it doesn't hurt SEO? Thanks in advance for any advice!
Technical SEO | Nimbus3000
-
Robots.txt questions...
All, my site is rather complicated, but I will try to break my question down as simply as possible. I have a robots.txt file at the root level of my site to disallow robot access to /_system/, my CMS. It looks like this:

# /robots.txt file for http://webcrawler.com/
# mail webmaster@webcrawler.com for constructive criticism
User-agent: *
Disallow: /_system/

I have another robots.txt file one level down, in my holiday database - www.mysite.com/holiday-database/ - to disallow access to /holiday-database/ControlPanel/, my database CMS. It looks like this:

User-agent: *
Disallow: /ControlPanel/

Am I correct in thinking that this file must also be at the root level, and not in the /holiday-database/ level? If so, should my new robots.txt file look like this:

# /robots.txt file for http://webcrawler.com/
# mail webmaster@webcrawler.com for constructive criticism
User-agent: *
Disallow: /_system/
Disallow: /holiday-database/ControlPanel/

Or like this:

# /robots.txt file for http://webcrawler.com/
# mail webmaster@webcrawler.com for constructive criticism
User-agent: *
Disallow: /_system/
Disallow: /ControlPanel/

Thanks in advance. Matt
Technical SEO | Horizon
-
Very Quick Joomla Question
Hi, a client's site was previously built in Joomla and he wants us to reproduce content that was in there, but the Joomla site is no longer live and has come to me as an archive containing all the files and folders that were included. So, I am looking at the files and folders without Joomla installed. Can someone tell me quickly how to find where the actual page content is stored? I started looking, but there are some folders I cannot open, and nothing that looks as I expected. I would appreciate a hint or two from someone who knows Joomla well. Life is too short! Thanks, Sha
Technical SEO | ShaMenz
-
Duplicate content handling.
Hi all, I have a site with a great deal of duplicate content, because my clients list the same content on a few of my competitors' sites. You can see an example page here: http://tinyurl.com/62wghs5 As you can see, the search results are on the right. A majority of these results will also appear on my competitors' sites. My homepage does not seem to want to pass link juice to these pages. Is it because of the high level of duplicate content, or because of the large number of links on the page? Would it be better to hide the content from the results in a nofollowed iframe to reduce the duplicate content's visibility, while at the same time increasing unique content with articles, guides etc.? Or can the two exist together on a page and still allow link juice to be passed to the site? My PR is 3, but I can't seem to get any of my internal pages (except a couple that appear in my navigation menu) to budge off the PR0 mark, even if they are only one click from the homepage.
Technical SEO | Mulith
-
Weird Indexing Question
Google has indexed mysite.com/ and mysite.com/%5C (no idea why). If you click on the /%5C URL, it takes you to mysite.com//. I have a rel=canonical tag on it that points to mysite.com/, but I was wondering if there was another way to correct the issue.
Technical SEO | BryanPhelps-BigLeapWeb