We recently transitioned a site to our server, but Google is still showing the old server's urls. Is there a way to stop Google from showing urls?
-
Deep Crawl is great for large sites
-
I would recommend running deepcrawl.com against your old domain so you can map the old URLs to their new locations; if the old URLs are rewritten (301 redirected) to the new site, it will help your new website, or at least minimize the damage.
To answer your question directly: yes, without 301 redirects you are going to lose any authority your old domain has, and yes, that's bad.
Use archive.org; it might have a copy of your entire site structure, so start from there.
Do you have backups?
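If the old URLs are gone, the Wayback Machine's CDX API can often reconstruct the list archive.org kept. A minimal Python sketch, assuming the public archive.org CDX endpoint; the domain below is a placeholder:

```python
import json
import urllib.parse
import urllib.request

CDX_ENDPOINT = "https://web.archive.org/cdx/search/cdx"

def cdx_query(domain):
    """Build a CDX query listing every archived URL under a domain."""
    params = {
        "url": f"{domain}/*",   # match all paths under the domain
        "output": "json",       # JSON rows instead of plain text
        "fl": "original",       # return only the original URL field
        "collapse": "urlkey",   # deduplicate repeated captures of a URL
    }
    return f"{CDX_ENDPOINT}?{urllib.parse.urlencode(params)}"

def fetch_old_urls(domain):
    """Fetch the archived URL list; the first JSON row is a header."""
    with urllib.request.urlopen(cdx_query(domain)) as resp:
        rows = json.load(resp)
    return [row[0] for row in rows[1:]]  # skip the header row

# Usage (makes a network request): fetch_old_urls("your-old-domain.com")
```

The resulting list is a starting point for rebuilding your 301 redirect map.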
-
Unfortunately, we did not do 301 redirects for the entire site, and now we don't have the old URLs to create the 301 redirects. Is this going to cause serious problems with Google by not having 301 redirects?
-
I agree that keeping the sitemap is definitely going to lead Googlebot to your site much faster, and you should use Fetch as Googlebot on the entire site.
Be certain that you have done page-by-page 301 redirects for the entire site. After that, you can look into this method of removing data from Google's index and cache.
I recommend not removing anything unless it is doing damage to your site:
https://support.google.com/webmasters/answer/1663691?hl=en
How to remove outdated content
- Remove a page that was already deleted from a site from search results
- Remove an outdated page description or cache
Follow the instructions below if the short description of the page in search results (the snippet) or the cached version of the page is out of date.
- Go to the Remove outdated content page.
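Once you have the list of old URLs, the page-by-page 301 rules can be generated mechanically. A hypothetical Python sketch that emits Apache `Redirect 301` directives from an old-to-new mapping; every URL below is a made-up example:

```python
from urllib.parse import urlparse

def redirect_rules(url_map):
    """Turn {old_url: new_url} into Apache 'Redirect 301' directives.

    Each old URL is reduced to its path, so the rules can live in the
    old domain's virtual-host config or .htaccess file.
    """
    rules = []
    for old, new in sorted(url_map.items()):
        old_path = urlparse(old).path or "/"
        rules.append(f"Redirect 301 {old_path} {new}")
    return "\n".join(rules)

# Hypothetical mapping from old-server URLs to their new locations.
mapping = {
    "http://old.example.com/about.html": "https://www.example.com/about/",
    "http://old.example.com/products.html": "https://www.example.com/products/",
}
print(redirect_rules(mapping))
```

The same mapping could just as easily be emitted as nginx `rewrite` rules; the point is that a one-time script beats writing thousands of redirects by hand.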
-
No problem! Here is a pretty comprehensive list of resources. I personally use Screaming Frog.
Good luck!
-
Perfect sense. Thank you. Do you know of any good tools that will create an XML sitemap of at least 19,000 pages?
-
Hi again!
Every page should be in the sitemap, so long as it isn't behind a login or otherwise not meant to be seen by search engines or users. I would update it and make sure pages aren't noindexed or blocked in your robots.txt. It shouldn't be limited to just your top navigation. Search engines will still crawl and discover those deeper (non-top-nav) pages, but including them in the sitemap will help expedite the indexing process.
Does that make sense?
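For a sitemap of that size, a script is the practical route: the sitemaps.org protocol allows up to 50,000 URLs per file, so 19,000 pages fit in a single sitemap. A minimal Python sketch; the example URLs are placeholders, and the real list would come from a crawler export such as Screaming Frog:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # per-file limit from the sitemaps.org protocol

def build_sitemap(urls):
    """Build a sitemap XML document from a list of page URLs."""
    if len(urls) > MAX_URLS:
        raise ValueError("split into multiple sitemaps plus a sitemap index")
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))

# Hypothetical URL list standing in for the real 19,000-page export.
pages = [f"https://www.example.com/page-{i}" for i in range(1, 4)]
print(build_sitemap(pages))
```

Past the 50,000-URL limit you would write several sitemap files and reference them from a sitemap index file, but at 19,000 pages one file is fine.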
-
Thanks for getting back to me. It's the same domain, so no change of address needed. We did upload a new sitemap, but the new sitemap only has 100 pages on it, where the old sitemap had 19,000. Does the sitemap need every page on it, or just the top navigation pages?
-
Hi Stamats
Did you update your sitemap XML and also submit it to Webmaster Tools? If you changed your domain name, you should look into a Change of Address as well.
Keep in mind that it could take Google a while to notice these changes, so do your best to speed things up with the steps above.
Hope this helps! Let me know if you need anything else!