Moved a site and changed URL structures: Looking for help with pay
-
Hi Gents and Ladies
Before I get started, here is the website in question: www.moldinspectiontesting.ca. I apologize in advance if I miss any important or necessary details; this might read like several disjointed thoughts, as it is very late where I am and I am very exhausted. Now, on to this monster of a post.
**The background story:**
My programmer and I recently moved the website from a standalone CMS to WordPress. The owners of the site/company were having major issues with their old SEO/designer at the time. They felt very abused and taken advantage of by this person (which I agree they were: financially, emotionally and more), and wanted to wash their hands of the old SEO/designer completely. They sought someone out to do a minor redesign (the old site did look very dated) and transfer all of their copy as affordably as possible. We took the job on. I have my own strengths with SEO, but on this one I am a little out of my element. Read on to find out why.
**Here are some of the issues, what we did and a little more history:**
The old site had a terribly unclean URL structure, as most of it was machine-written. The owners would make changes to one central location/page and the old CMS would then generate hundreds of service-area pages that used long, parameter-heavy URLs (along with duplicate content). We could not duplicate this URL structure during the transfer, so we went with a simple, clean structure. Here is an example of how we modified the URLs:
Old: http://www.moldinspectiontesting.ca/service_area/index.cfm?for=Greater Toronto Area
New: http://www.moldinspectiontesting.ca/toronto
My programmer took to writing 301 redirects and URL rewrites (.htaccess) for all their service area pages (which tally in the hundreds).
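For anyone curious what those rewrites look like, here is a hypothetical sketch of one such .htaccess rule (not the site's actual file; it assumes Apache with mod_rewrite enabled, and that the old query value "Greater Toronto Area" arrives URL-encoded as `%20` or `+`):

```apache
# Hypothetical example of redirecting one old parameter-heavy
# service-area URL to its new clean equivalent.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^for=Greater(%20|\+)Toronto(%20|\+)Area$ [NC]
RewriteRule ^service_area/index\.cfm$ /toronto? [R=301,L]
```

The trailing `?` on the substitution strips the old query string so it is not carried over to the clean URL; with hundreds of service areas, rules like this are typically generated from a city list rather than written by hand.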
As I hinted above, the site also suffers from an overwhelming amount of duplicate copy, which we are slowly rewriting to make unique. It is also suffering from a tremendous amount of keyword cannibalization. Both are a result of the old SEO's work, which we had to transfer without fixing first (a hosting renewal deadline with the old SEO/designer forced us to get the site up and running in a very short window). We are working on both of these issues now.
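One way to triage which service-area pages to rewrite first is a pairwise similarity pass over the scraped copy. This is not from the original post, just a minimal sketch in Python assuming the page text has already been collected into a slug-to-text dict:

```python
# Minimal sketch: flag near-duplicate page copy with difflib.
# The page texts used here are placeholders, not the site's real copy.
import difflib

def similarity(a: str, b: str) -> float:
    """Ratio of matching characters between two texts (0.0 to 1.0)."""
    return difflib.SequenceMatcher(None, a, b).ratio()

def flag_duplicates(pages: dict, threshold: float = 0.9) -> list:
    """Return slug pairs whose body copy is near-identical."""
    slugs = list(pages)
    return [
        (s1, s2)
        for i, s1 in enumerate(slugs)
        for s2 in slugs[i + 1:]
        if similarity(pages[s1], pages[s2]) >= threshold
    ]
```

Pairs that come back flagged are the likeliest candidates for the duplicate-content problems described above, so they are the ones to rewrite first.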
SERPs have been swinging violently since the transfer, and understandably so: changes have cause and effect. I am a bit perplexed, though. Pages are indexed one day and ranking very well locally, then apparently de-indexed the next. It might be worth noting that they had some de-indexing problems in the months prior to meeting us, which I suspect was due in large part to the duplicate copy. The ranking pages (on a URL basis) are also changing up. We will see a clean URL rank and then drop one week, and then an unclean version rank and drop off the next (for the same city, same web search). Sometimes they rank alongside each other.
The terms they want to rank for are very easy to rank on because they are so geographically targeted. The competition is slim in many cases. This time last year, they were having one of the best years in the company's 20+ year history (prior to being de-indexed).
**On to the questions:**
**What should we do to reduce the loss in these ranked pages? With the actions we took, can I expect the old unclean URLs to drop off over time and the clean URLs to pick up the ranks? Where would you start in helping this site? Is there anything obvious we have missed? I planned on starting with new keyword research to diversify what they rank on, then following that up with fresh copy across the board.**
If you are well versed in this type of problem/situation (URL changes, index/de-index status, analyzing these things, etc.), I would love to pick your brain or even bring you on board to work with us (paid).
-
Hello Mathew,
I did a site:domain.com search and do still see some of the old URLs indexed, so I checked the URLs using an HTTP header status code checker and they are returning the correct 301 response. I also checked the rel=canonical tag on the new URLs, and they reference themselves, not the old URLs. Therefore, I see no reason to be concerned about this issue. It takes time for Google to revisit those old URLs, see the redirect, and update its index. In time, the old URLs will drop off, and any links pointing to them should begin counting toward the PageRank of your new URLs.
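If you want to script this spot-check across hundreds of old URLs rather than paste them into an online header checker one at a time, the decision logic can be factored into a small helper that works on the output of any status-code checker. A minimal sketch in Python (the status/Location pairs here are illustrative, not live results from the site):

```python
# Minimal sketch: interpret the (status code, Location header) pair
# returned when fetching an old URL without following redirects.
def classify_redirect(status: int, location: str, expected_target: str) -> str:
    """Classify how an old URL responds, given the expected clean URL."""
    if status == 301 and location == expected_target:
        return "ok"  # permanent redirect to the clean URL: the healthy case
    if status in (301, 302, 303, 307):
        if location != expected_target:
            return "wrong-target"  # redirects, but not where it should
        return "temporary"  # right target, but not a permanent (301) redirect
    return "no-redirect"  # old URL still resolves directly, or errors out
```

Anything other than "ok" for an old service-area URL is worth investigating, since temporary redirects and wrong targets can keep the old URLs lingering in the index.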
However...
You have dozens of geotargeted doorway pages that Google probably doesn't like, or that at least violate their guidelines. If there were an office in each location, these pages would be the right thing to do, since each would include a geo-specific address and phone number. Since every page has the same phone number and presumably there is only one office, you are running into the same problem many other "local" businesses have had to deal with over the years. Unfortunately, there still isn't any real solution, and you will have real trouble ranking in the local/maps area on Google.
What to do about this is beyond the scope of this question, but if you're going to work with another SEO on this, I'd recommend one who has experience with service-oriented businesses with multiple locations. This page would be a good place to start, and I have pre-filtered it to show only "local search" experts.
Good luck!