Existing Pages in Google Index and Changing URLs
-
Hi!!
I am launching a newly recoded site this week and had another newbie question.
The URL structure has changed slightly and I have installed a 301 redirect to take care of that. I am wondering how Google will handle my "old" pages. Will they just fall out of the index? Or does the 301 redirect tell Google to rewrite the URLs in the index?
I am just concerned I may see an "old" page and a "new" page with the same content in the index. Just want to make sure I have covered all my bases.
Thanks!!
Lynn
-
Hi!! Thanks Mike! I didn't realize I was passing the SIDs (since they're not in the URL), but it makes sense that I am. I will take this to a private question and let you know what I hear back.
Thanks for your help!
Lynn
-
I would be happy to help if I knew the answer, but I don't. I don't have session IDs in my URLs (I use cookie-based session management instead, mostly because I wanted clean URLs for bookmarking and SEO). Perhaps someone else who uses session IDs in URLs can answer, or else Google "session IDs in URLs" and see what comes up. I found this one: http://www.searchengineguide.com/stoney-degeyter/why-session-ids-and-search-engines-dont.php
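The general fix that articles like that one describe is to keep session IDs out of any URL a crawler can reach. As a rough sketch only (it assumes the cart's parameter is literally named SID and that it appears as the whole query string, which may not match your setup), an .htaccess rule can 301 those URLs back to their clean versions:
# Sketch: 301 any request whose query string is only a session ID
# back to the same path with no query string at all.
# Assumes the parameter is named "SID"; adjust for your cart software.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^SID=[^&]+$ [NC]
RewriteRule ^(.*)$ /$1? [R=301,L]
If other parameters can ride along with the SID, the rule needs to preserve them, so treat this strictly as a starting point rather than something to paste in as-is.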
-
Hi! I am in Google Webmaster Tools but haven't played with it extensively since I set it up and added my domain.
Looking at it, I'm seeing some crawl errors. Most of them have session IDs (SIDs) in them. Why would Google be trying to crawl a session ID?
That brings up another question. The shopper is able to narrow down a category by manufacturer and price. These links will be crawled and indexed as well. Do I want them to be???
Anything you can offer would be appreciated. If it's too in-depth (meaning it will take you too much time), I can take this to a private question.
Thank you!
Lynn
-
Hi!! The only thing that has changed is the removal of /shop/ from the product page URLs. Here is the 301 redirect I installed. I was told all was well with it, but I would love another set of eyeballs if you can confirm it looks good. I am actually ranking for some things, so I am paranoid about messing up the site move. Thanks for the info. I really appreciate it.
############################################
# Enable rewrites
Options +FollowSymLinks
RewriteEngine on
#RedirectMatch 301 ^/shop?/$ http://hiphound.com/
RedirectMatch 301 ^/shop?/$ http://hiphound.com
###########################################
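One thing worth double-checking: as written, the pattern ^/shop?/$ only matches the bare /shop/ (or /sho/) directory and sends it to the homepage; it will not touch an individual product URL such as /shop/some-product. If the goal is to drop /shop/ from every product URL while keeping the rest of the path, rules along these lines may be closer to what is needed (a sketch only, assuming products sat directly under /shop/ on the old site):
# Sketch: strip the /shop/ prefix but keep whatever followed it ($1).
RedirectMatch 301 ^/shop/(.+)$ http://hiphound.com/$1
# Send the bare /shop/ directory itself to the homepage.
RedirectMatch 301 ^/shop/?$ http://hiphound.com/
Either way, spot-checking a handful of old product URLs with a header checker after launch will confirm the 301s resolve where you expect.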
-
Crawl rate depends on your site size, your site's rate of change, how fast you serve pages, and I'm sure a couple of other factors. If you're not yet on Google Webmaster Tools then you should be (it's free). It will show you how many pages per day Googlebot is crawling on your site.
-
Thank you!! Great article!
Follow-up: how long does it take for the URLs to be rewritten in the Google index? Is that done on the next crawl?
Thanks! I really appreciate the help.
Lynn
-
If you have set up the 301 correctly, then a user who tries to visit the old page, whether by typing the old URL or by clicking a search result, will be directed to the new content. When the site is reindexed, the old results should fall out of the index.
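For anyone setting this up from scratch, the simplest form of a one-to-one 301 in an Apache .htaccess file looks something like the following (the paths and domain are placeholders, not the actual URLs discussed above):
# Permanently (301) redirect one old path to its new home.
# Redirect is prefix-based, so /old-page/foo also maps to /new-page/foo.
Redirect 301 /old-page/ http://www.example.com/new-page/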
-
You should be okay with 301s. See http://www.atlantaanalytics.com/practicing-web-analytics/how-does-google-analytics-handle-301-and-302-redirects/
Related Questions
-
Escort directory page indexing issues
Re: escortdirectory-uk.com, escortdirectory-usa.com, escortdirectory-oz.com.au. Hi, we are an escort directory with 10 years' history. We have multiple locations within the following countries: UK, USA, AUS. Although many of our locations (towns and cities) index on page one of Google, just as many do not. Can anyone give us a clue as to why this may be?
Technical SEO | ZuricoDrexia
-
Content change within the same URL/Page (UX vs SEO)
Context: I'm asking my client to create city pages so he can present all of his apartments in a specific sector, so I can have a page that ranks for "appartement for rent in +sector". The page will present a map with all the sectors so the user can navigate and choose the sector he wants after landing on the page. Question: The UX team is asking whether we absolutely need to reload the sector page when the user clicks a location on the map, or whether they can switch the content within the same page/URL once the user is on the landing page. My concern: 1. Can this be analysed as duplicate content if Google can crawl within the JavaScript app, or does Google only analyse its "first view" of the page? 2. Do you consider it preferable to keep the "page change" so I'm increasing the number of pages viewed?
Technical SEO | alexrbrg
-
Google Not Indexing Submitted Images
Hi Guys! My question isn't too dissimilar to one asked a couple of years ago, regarding Google and image indexing, but having put my web address into a Google image search, I get a return of 15 images, so something isn't right. 5 months ago I submitted our 'new' site to Google Webmaster Tools. We have just moved it onto a Shopify platform. They (Shopify) are good at providing places to add titles and Alt tags, and likewise we fill them in (so that box ticked!). However, I have noticed over the last couple of months that despite 161 images being submitted, only 51 have been indexed. Furthermore, as I said earlier, when you put our site, site:http://www.hartnackandco.com, into Google Images, it only returns a total of 15 images. Any suggestions and help would be wonderful! Cheers Nick
Technical SEO | nick_HandCo
-
Some URLs in the sitemap not indexed
Our company site has hundreds of thousands of pages. Yet no matter how big or small the total page count, I have found that the "URLs Indexed" in GWMT has never matched "URLs in Sitemap". Both when we were small and now that we have a LOT more pages, there is always a discrepancy of ~10% or so missing from the index. It's difficult to know which pages are not indexed, but I have found some that I can verify are in the Sitemap.xml file but not at all in the index. When I go to GWMT I can "Fetch and Render" missing pages fine, so it's not as though they are blocked or inaccessible. Any ideas on why this is? Is this type of discrepancy typical?
Technical SEO | Mase
-
Changing the order of items on page against Google Terms & Conditions?
Good day, I am wondering if anybody here has done something like this before. I have a page on one of my sites that contains a number of different, but related, free resources. The resources can be sorted in different ways once the user is on the page. Now I am starting an outreach campaign and want to be able to send out custom URLs (which pretty much means they have different query strings after them, like '?id=123'), so that when a person clicks on the link to the page it brings up the stuff they are more likely to be interested in at the top. I expect (hope) that some of these people will put links back to this page as a result. Now all the links may be slightly different, but they will come to the same page and the content will look slightly different. I will make sure to have the rel=canonical tag in place. Does anybody know if this would be in violation of Google's Terms and Conditions? I can't see how, but I wanted to see what the experts here on Moz think before moving forward. Thanks in advance.
Technical SEO | rayvensoft
-
URL Changes And Site Map Redirects
We are working on a site redesign which will change/shorten our URL structure. The primary domain will remain the same; however, most of the other URLs on the site are getting much simpler. My question is how best to handle this when it comes to sitemaps, because there are massive numbers of URLs that will be redirected to new, shorter URLs. Should a new sitemap be submitted right at launch and the old sitemap removed later? I know that Google does not like having redirects in sitemaps. Has anyone done this on a large scale (60k URLs or more) and have any advice?
Technical SEO | RMATVMC
-
De-indexing millions of pages - would this work?
Hi all, We run an e-commerce site with a catalogue of around 5 million products. Unfortunately, we have let Googlebot crawl and index tens of millions of search URLs, the majority of which are very thin on content or duplicates of other URLs. In short: we are in deep. Our bloated Google index is hampering our real content's ability to rank; Googlebot does not bother crawling our real content (product pages specifically) and hammers the life out of our servers. Since having Googlebot crawl and de-index tens of millions of old URLs would probably take years (?), my plan is this: 1. 301 redirect all old SERP URLs to a new SERP URL. 2. If the new URL should not be indexed, add a meta robots noindex tag on the new URL. 3. When it is evident that Google has indexed most "high quality" new URLs, disallow crawling of the old SERP URLs in robots.txt. 4. Then remove all old SERP URLs directory-style in the GWT URL Removal Tool. This would be an example of an old URL:
www.site.com/cgi-bin/weirdapplicationname.cgi?word=bmw&what=1.2&how=2
This would be an example of a new URL:
www.site.com/search?q=bmw&category=cars&color=blue
I have two specific questions: Would Google both de-index the old URL and not index the new URL after 301 redirecting the old URL to the new URL (which is noindexed), as described in point 2 above? What risks are associated with removing tens of millions of URLs directory-style in the GWT URL Removal Tool? I have done this before, but then I removed "only" some useless 50,000 "add to cart" URLs. Google says themselves that you should not remove duplicate/thin content this way and that using the tool this way "may cause problems for your site". And yes, these tens of millions of SERP URLs are the result of a faceted navigation/search function let loose all too long. And no, we cannot wait for Googlebot to crawl all these millions of URLs in order to discover the 301. By then we would be out of business. Best regards, TalkInThePark
Technical SEO | TalkInThePark
-
How can I get a listing of just the URLs that are indexed in Google
I know I can use the site: query to see all the pages I have indexed in Google, but I need a listing of just the URLs. We are doing a site re-platform and I want to make sure every URL in Google has a 301. Is there an easy way to just see the URLs that Google has indexed for a domain?
Technical SEO | EvergladesDirect