Domain restructure, sitemaps and indexing
-
I've got a hand-coded site with around 1,500 unique articles and a hand-coded sitemap. Very old school.
The URL structure is a bit of a mess, so to make things easier for a developer who'll be making the site database-driven, I thought I'd recategorise the content. Same content, but with a new URL structure. (I thought I'd juice up the URLs for SEO purposes while I was at it.)
To this end, I took categories like:
/body/amazing-big-shoes/
/style/red-boots/
/technology/cyber-boots/

And rehoused all the content like so, doing it all manually with FTP:
/boots/amazing-boots/
/boots/red-boots/
/boots/cyber-boots/

I placed 301 redirects in the .htaccess file like so:
redirect 301 /body/amazing-boots/ http://www.site.co.uk/boots/amazing-boots/
(I didn't do redirects for each individual article, just for the categories, which seemed to make the articles redirect nicely.)
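As far as I can tell, that works because Apache's Redirect directive matches by path prefix and carries the rest of the requested path across to the target. So the .htaccess ended up looking something like this (the article filename in the comment is just a hypothetical example):

# One prefix redirect per old category. The article path is carried
# across automatically, e.g. a request for /body/amazing-boots/some-article.php
# is forwarded to /boots/amazing-boots/some-article.php.
Redirect 301 /body/amazing-boots/ http://www.site.co.uk/boots/amazing-boots/
Redirect 301 /style/red-boots/ http://www.site.co.uk/boots/red-boots/
Redirect 301 /technology/cyber-boots/ http://www.site.co.uk/boots/cyber-boots/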
Then I went into sitemap.xml and manually overwrote all the entries to reflect the new url structure, but keeping the old dates of the original entries, like so:
<url>
  <loc>http://www.site.co.uk/boots/amazing-boots/index.php</loc>
  <lastmod>2008-07-08</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.5</priority>
</url>

And resubmitted the sitemap to Google Webmaster Tools.
This was done four days ago. Webmaster Tools showed that the 1,400 (of 1,500) articles indexed had dropped to 860, and today it's climbed to 939.
Did I adopt the correct procedure? Am I going about things the right way? Given a little time, can I expect Google to re-index the new pages nicely?
I appreciate I've made a lot of changes in one fell swoop, which could be a bit of a no-no...?
PS Apologies if this question appears twice on Q&A - hopefully I haven't double-posted
-
If your developer will be making the website dynamic via a system like WordPress, there will be automated ways to keep your sitemap up to date every time you publish a new page, and it will even ping the search engines to say the sitemap has been updated. It will be a "set it and forget it" type of thing with sitemaps if you are moving in that direction.
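Under the hood, a sitemap ping is just an HTTP GET against the search engine's ping endpoint with your sitemap URL as a parameter. A rough sketch in Python, assuming the classic ping endpoint and a hypothetical sitemap location:

# Rough sketch: notify Google that the sitemap has changed.
# Assumes the classic /ping endpoint; a CMS plugin handles this for you.
import urllib.parse
import urllib.request

sitemap_url = "http://www.site.co.uk/sitemap.xml"  # hypothetical location
ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")
urllib.request.urlopen(ping)  # an HTTP 200 means the ping was received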
Good luck!
-
Oh no, what you did is perfect! I guess I meant the site architecture/navigation, but you answered it in your original post when you said "Same content", so disregard that question. Congrats.
-
Sadly I did change the internal linking structure, so that internal links now point to the new URLs, not the old ones. The good news is that even with the internal linking changes, Google seems to be keeping abreast of it all. The number of URLs indexed has now jumped, in a day, from 939 to 1,024, so good old Google is clearly keeping up with the changes. Looks like my fears were unfounded. Yay!
-
Looks perfect to me too. Did the internal linking structure change at all, or is that still the same? If it's all the same, you should be right back where you were in no time. And you should see some benefits from having a more common-sense, easy-to-understand URL structure. Cheers!
-
That's fair. I get that you're not recommending it personally, but it does seem popular with consistently good feedback from people, so I'll give it a go.
-
Just to clarify: I know the sitemap tool I mentioned is very popular. Many small sites use it because it is online, fast and free. I have used it a few times myself. I can't necessarily say I recommend it, because I have never personally purchased the software. I would say that if I were looking to obtain a sitemap for your site, I would start with that tool but might take a look at some others.
-
Thanks Ryan, that's a weight off my mind. I'll definitely take your advice on the sitemap generator, too. I'd seen a few around but wasn't sure, so it's great to be pointed in the right direction!
-
Did I adopt the correct procedure? Am I going about things the right way? Given a little time, can I expect Google to re-index the new pages nicely?
I would say yes to all three, but I'll clarify with some details below.
When you submit a sitemap to Google with 1,500 pages, there is no guarantee they will index all of them. It sounds like you have done a lot of intensive, manual work. Fortunately, you have done things the correct way by properly redirecting each page to its new URL. If Google indexed 1,400 pages before, they should index around that same number once again. It may take several weeks, depending on factors such as your site's Domain Authority, its navigation, and how many links each page has received.
With respect to the sitemap, I would highly recommend using sitemap generation software; it is simply not reasonable to maintain a 1,500-entry sitemap by hand. I would have updated the lastmod dates in the sitemap, but it may not make any difference.
A popular sitemap tool: http://www.xml-sitemaps.com/. The free version only generates sitemaps of up to 500 pages, but for $20 you can buy the full version and automate it.
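Alternatively, if your developer would rather script it than buy a tool, generating the file is straightforward once the article URLs live in a database. A rough sketch in Python (the URLs, dates and values below are hypothetical placeholders, not pulled from your site):

# Rough sitemap generator sketch; a real version would pull the
# (URL, lastmod) pairs from the site's database.
from xml.sax.saxutils import escape

pages = [
    ("http://www.site.co.uk/boots/amazing-boots/index.php", "2008-07-08"),
    ("http://www.site.co.uk/boots/red-boots/index.php", "2008-07-08"),
]

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for loc, lastmod in pages:
        f.write("  <url>\n")
        f.write("    <loc>%s</loc>\n" % escape(loc))
        f.write("    <lastmod>%s</lastmod>\n" % lastmod)
        f.write("    <changefreq>monthly</changefreq>\n")
        f.write("    <priority>0.5</priority>\n")
        f.write("  </url>\n")
    f.write("</urlset>\n")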