Is there a problem with putting encoding into the subdomain of a URL?
-
We are looking at changing our URL structure for tracking various affiliates from:
https://sub.domain.com/quote/?affiliate_id=xxx
to https://aff_xxx_affname.domain.com/quote/
Both would allow us to track affiliates, but the second would also allow us to use cookies for tracking. Does anyone know if this could cause SEO concerns?
Also, for the site we want to rank, we will use a reverse proxy to change the URL from https://aff_xxx.maindomain.com/quote/ to https://www.maindomain.com/quote/. Would that cause any SEO issues?
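For illustration, a reverse-proxy rewrite along those lines might look like the sketch below in nginx (hostnames and the upstream name are hypothetical; the real setup depends on your stack):

```nginx
# Hypothetical sketch: serve affiliate subdomains such as
# aff_123.maindomain.com while proxying to the main application.
server {
    listen 443 ssl;
    # Capture the affiliate id from the subdomain, e.g. "123".
    server_name ~^aff_(?<affiliate_id>\w+)\.maindomain\.com$;

    location / {
        # Pass the affiliate id to the backend as a header so the
        # application can customize the page for that affiliate.
        proxy_set_header X-Affiliate-Id $affiliate_id;
        proxy_set_header Host www.maindomain.com;
        proxy_pass https://backend_upstream;
    }
}
```

One caveat worth noting: underscores are not strictly valid in hostnames (RFC 1123 allows only letters, digits, and hyphens), so a hyphenated form like aff-xxx.maindomain.com would be the safer choice.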
Thank you.
-
Hello Rox,
I hate the idea of you going away unsatisfied with unanswered questions. Let's try to work through this. Let me approach it from a different angle, as I may have misunderstood what you were asking.
https://sub.domain.com/quote/?affiliate_id=xxx
https://aff_xxx_affname.domain.com/quote/
The first URL is the one I'd go with, because it's easy to add a rel=canonical back to the base URL and you're keeping everything on one subdomain. The second version creates a new subdomain for every affiliate, which I don't think would be a good thing.
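To make that concrete, the canonicalization described above amounts to stripping the tracking parameter from the URL. A minimal Python sketch (the parameter name is taken from the question; this is an illustration, not a specific Moz recommendation):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url, tracking_params=("affiliate_id",)):
    """Return the URL with known tracking parameters removed,
    i.e. the value you'd point rel=canonical at."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in tracking_params]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(canonical_url("https://sub.domain.com/quote/?affiliate_id=xxx"))
# → https://sub.domain.com/quote/
```

Every affiliate landing URL then maps to one canonical page, so link equity consolidates instead of being split across per-affiliate subdomains.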
Please let me know if I have understood your question this time.
Thanks!
-
I don't feel this question has been understood, and I'm not sure how to explain it any further, so I would like to close it without marking any answers good or bad. I'm not sure how to close it, so I am posting this response. Thank you.
-
Not if that something has to do with affiliate landing pages.
-
Isn't it good to let Google see that our site does something, as opposed to just being a glorified blog?
-
Hi RoxBrock! Did Everett and Dmitrii answer your question? If so, make sure to mark one or both of their responses "Good Answers."
-
All of those subdomains should have a robots.txt file that disallows the search engine from accessing them in the first place. In that sense, it doesn't matter which you choose for SEO. Or am I missing something?
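As a sketch of that suggestion (hypothetical hostnames), each affiliate subdomain could serve a blanket-disallow robots.txt, which you can sanity-check with Python's standard-library parser:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt each affiliate subdomain would serve:
robots_txt = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# No crawler that obeys robots.txt should fetch anything on the subdomain.
print(rp.can_fetch("Googlebot", "https://aff_123.domain.com/quote/"))  # → False
```

Bear in mind a disallow only blocks crawling; pages already indexed may linger in the index until removed by other means.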
-
The subdomain would not be redirected. It would be forwarded to an application server that parses the subdomain and displays a page customized for that affiliate (the HTML, CSS, and verbiage can all change).
As for why the first URL structure won't work: the site must throw an error if no affiliate ID is specified. It cannot simply assume the one in the cookie is correct.
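The parsing step described above might look something like this Python sketch (the aff_<id>_<name> subdomain format is assumed from the question, and the function names are hypothetical):

```python
import re

# Matches subdomains like "aff_123" or "aff_123_acme".
SUBDOMAIN_RE = re.compile(r"^aff_(?P<id>\d+)(?:_(?P<name>[\w-]+))?$")

def affiliate_from_host(host):
    """Extract the affiliate id (and optional name) from a host like
    aff_123_acme.domain.com; raise if none is present, rather than
    silently falling back to whatever is in the cookie."""
    subdomain = host.split(".", 1)[0]
    match = SUBDOMAIN_RE.match(subdomain)
    if not match:
        raise ValueError("no affiliate ID in host: %s" % host)
    return match.group("id"), match.group("name")

print(affiliate_from_host("aff_123_acme.domain.com"))  # → ('123', 'acme')
```

Raising on a missing ID, as above, matches the requirement that the site error out rather than trust the cookie.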
-
Hi there.
So, are those affiliate domains actually going to exist and be redirected via a proxy, or is it going to be a simple .htaccess rewrite rule?
In the first case I can see a lot of trouble with UX (do you like it when clicking a link takes you to a page that refreshes several times before showing the end result? It feels spammy and unsettling, doesn't it?).
In the second case there shouldn't be any problems.
P.S. Why doesn't the parameter URL structure let you use cookies? You can set a cookie based on the parameter with a simple PHP GET read. Or am I missing something?
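The point in the P.S. — that a cookie can be set from the query parameter just as easily as from a subdomain — can be sketched as follows. The original suggests PHP; this Python version is framework-agnostic, and the function and cookie names are hypothetical:

```python
from urllib.parse import parse_qs

def affiliate_cookie_header(query_string, max_age=30 * 24 * 3600):
    """Build a Set-Cookie header value from ?affiliate_id=... so the
    parameter-based URL structure still supports cookie tracking."""
    params = parse_qs(query_string)
    ids = params.get("affiliate_id")
    if not ids:
        return None  # no affiliate in the URL; set nothing
    return "affiliate_id=%s; Max-Age=%d; Path=/" % (ids[0], max_age)

print(affiliate_cookie_header("affiliate_id=xxx"))
# → affiliate_id=xxx; Max-Age=2592000; Path=/
```

So cookie tracking does not by itself require the per-affiliate subdomain scheme.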