Session IDs and crawlers
-
Hello there.
When we set up our e-commerce website virtualsheetmusic.com to assign session IDs to users back in 2001, we decided not to assign them when a bot requested the page. Since bots officially can't store cookies, we wanted to be sure they wouldn't find links containing a different session ID on every crawl. To clarify, session IDs are generated on our system in the standard way: if a user has cookies enabled, a cookie called PHPSESSID is created to store the session ID. If cookies are not enabled, the system automatically appends the session ID to every link URL on the page, which could cause bots to find different link URLs each time they visit.
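In PHP terms, the setup looks roughly like this simplified sketch (the user-agent patterns below are illustrative examples, not our exact list):

```php
<?php
// Illustrative sketch of the approach described above; the bot
// patterns are examples, not our actual detection list.
$botPatterns = '/bot|crawl|slurp|spider/i';

$isBot = isset($_SERVER['HTTP_USER_AGENT'])
    && preg_match($botPatterns, $_SERVER['HTTP_USER_AGENT']);

if ($isBot) {
    // Crawlers: no session at all, and make sure PHP never
    // rewrites link URLs with a session ID appended.
    ini_set('session.use_trans_sid', '0');
} else {
    // Regular visitors: start a session (sets the PHPSESSID cookie,
    // or falls back to URL rewriting if cookies are disabled).
    session_start();
}
```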
Now, after 12 years, we are wondering whether this is still a worthwhile solution, or whether it could be detrimental in some way. What are your thoughts on this?
Thank you in advance for any thoughts.
Fab.
-
Thank you, Kurt, that's exactly what I thought, but I wanted confirmation from the expert community.
Thank you again!
-
You're not giving the search engines different content, so it's not deceptive. I can't think of any way it would harm you.
-
Thank you guys for your replies and insights. I lean toward keeping what we have lived with so far, which means leaving in place the system that disables session IDs when bots request our pages, unless you tell me there are downsides to doing that... that's really what I am trying to find out here. Is there any downside to serving pages without session IDs to search engines while users get them?
Thank you again.
-
I can't speak to the technical side of setting up session IDs; however, you can deal with the URL issue with canonical tags and by setting up URL parameters in Google and Bing Webmaster Tools. That should prevent the search engines from indexing every URL with a different session ID and keep all the page authority on the main URL.
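For example, a server-side canonical tag on a PHP site might look something like this sketch (the HTTPS scheme and the parameter handling are assumptions; details simplified):

```php
<?php
// Hypothetical example: emit a canonical tag pointing at the URL with
// the PHPSESSID parameter stripped, so all session-ID variants
// consolidate onto one indexable URL.
$path = strtok($_SERVER['REQUEST_URI'], '?');     // path without the query string
parse_str($_SERVER['QUERY_STRING'] ?? '', $params);
unset($params['PHPSESSID']);                      // drop only the session parameter
$query = http_build_query($params);

$canonical = 'https://' . $_SERVER['HTTP_HOST'] . $path
    . ($query !== '' ? '?' . $query : '');
echo '<link rel="canonical" href="' . htmlspecialchars($canonical) . '">';
```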
Kurt Steinbrueck
OurChurch.Com
-
Hi Fabrizio,
Nowadays I would say there are better solutions to fix this issue, but I'm really not sure you could convince me to rebuild this feature on the site, as the SEO impact would probably not be very big. I think the best approach is to not set any session IDs in the URL at all, so you have plain URLs. You could then use those pages as the basis of your URL structure and SEO strategy, and canonicalize the session-ID'ed URLs back to the plain ones.
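For reference, here is a minimal sketch of what "no session IDs in URLs at all" could look like, assuming the site uses PHP's built-in session handling:

```php
<?php
// A minimal sketch, assuming standard PHP sessions: with these two
// settings PHP never appends PHPSESSID to links, so every URL on the
// page stays "plain" regardless of who requests it.
ini_set('session.use_only_cookies', '1');  // sessions travel via cookie only
ini_set('session.use_trans_sid', '0');     // disable URL rewriting entirely
session_start();
```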
Hope this helps! Makes sense?