Parked former company's URL on top of my existing URL, and that URL is showing in SERPs for my top keywords
-
I have the URL from my former company parked on top of my existing URL. My top keywords are showing up with the old URL attached to the meta description of my existing URL. It was supposed to be 301 redirected instead of parked, but my web developer insists this was the right way to do it and that it will work itself out after Google indexes the old URL out of existence. Are there any other options?
-
Thanks again. I'll try these options today. It'll be nice going in more knowledgeable, so it's a very good thing you do, Mr. Kley.
-
Nothing he can do? Lmao, what a terrible answer. On the old site, you should still have FTP set up. In that account, go into your .htaccess file and add a rule that redirects all traffic to your existing domain, or whichever one you want to get indexed. Also add a robots.txt rule denying crawlers any access to the old domain.
Option 2 is to delete any and all old site files in the FTP account for the domain you want to get rid of, have those site URLs return a 404 error, and file a URL removal request in Webmaster Tools. Option 1 would be safer IMO, but option 2 will get rid of the old domain for good.
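For anyone following along, here is a minimal sketch of the Option 1 redirect rule, assuming an Apache server and using the two domains given elsewhere in this thread. The hostname condition matters because a parked domain usually shares the new site's document root, so the rule must fire only for requests arriving via the old hostname:

```apache
# .htaccess in the shared document root: 301 only the requests
# that come in through the old (parked) hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?aceystowing\.com$ [NC]
RewriteRule ^(.*)$ http://www.jonnystowingnow.com/$1 [R=301,L]
```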
-
Thank you both for your responses.
@DavidKley They do both show up, and the developer says there's nothing more he can do since the old site no longer exists. Everything I've read online seems to contradict this, though.
The domains in question are:
old - www.aceystowing.com
new - www.jonnystowingnow.com
Any further insight would again be greatly appreciated.
-
Just wanted to add:
Do both URLs show up for a page? Meaning, if you had a page about dog treats, could that page be accessed through both URLs on the web (manually or in SERP results)? If so, you need to redirect the domain you don't want to use immediately to prevent duplicate content. Just parking one on top of the other usually will not take care of replacing the other URL. You don't want both indexed at the same time.
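A quick way to check this yourself (a sketch in Python; the page path is hypothetical, the hostnames are the ones posted above):

```python
# Check whether the same page answers on both hostnames.
# http.client does not follow redirects, so a 301 on the old
# host shows up directly in the status code.
import http.client

for host in ("www.aceystowing.com", "www.jonnystowingnow.com"):
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("HEAD", "/dog-treats")  # hypothetical page path
    print(host, conn.getresponse().status)
    conn.close()

# 200 on both hosts means both versions are live and can be
# indexed as duplicates; the old host should answer 301 instead.
```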
-
In addition to parking the domain, did you add a parked-domain .htaccess rule? Beyond the search engines, make sure your visitors are getting to the right place, without duplicate content.
After a while, all the new URLs should replace the old ones, but I have seen this process take up to 6-8 months.
-
The definition of words like "parked" can vary from the FAQ documents of one hosting company to another. When I have moved domains, I have "parked" them on my hosting and then 301 redirected specific old URLs on the old domain to specific URLs on the new domain.
There are a lot of really competent people out there, but sometimes web developers have a "mechanical knowledge" of how things work, and for search engines to treat your domain properly something else is required.
If this were my site, I would have a technical SEO look at it. I've done this stuff for myself but always paid someone else to review my plan and check that it is working properly.
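The page-by-page mapping described above might look like this in the old domain's .htaccess (a sketch: the page paths are hypothetical, the domains are the ones from this thread):

```apache
# .htaccess on the old domain: hypothetical page-level 301 mappings
Redirect 301 /light-duty-towing.html http://www.jonnystowingnow.com/light-duty-towing
Redirect 301 /contact-us.html http://www.jonnystowingnow.com/contact
```

Page-level redirects like these preserve each old URL's relevance to its matching new page better than sending everything to the new homepage.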