Hi, I know this is cheeky, but you're all so helpful on here!
-
Hi, quick question. I've made a new installation of WordPress at sussexchef.com/dev and I'm about to start building pages; obviously I'm going to move it to sussexchef.com when it's all looking right. When I choose my page address links (the permalinks thingy), should I use new URL names that don't already exist on the old site? Or should I keep the old URL names so I don't get loads of 404s, but include the "dev/" in the URL name?
E.g. the old address is sussexchef.com/home.
Should I use sussexchef.com/dev/home or sussexchef.com/home-sussex-caterers while building the development site? I'm guessing the latter may help out in Google searches too?
But if I use dev in the URL, surely I will have to go through almost 100 pages removing the dev/ and also changing all the links too? That would be days of work!
So confused! I'd really appreciate your help here.
Ben
-
Hi,
That should be enough to stop the search engines from crawling and indexing the test site.
Remember to take it off when you go live, though.
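If you do want a belt-and-braces approach while you learn, robots.txt is only a few lines. A minimal sketch, assuming the dev site stays under /dev/ on the same domain, placed at sussexchef.com/robots.txt:

# Block all well-behaved crawlers from the dev directory only;
# the live site stays crawlable. Remove this rule once /dev/ is retired.
User-agent: *
Disallow: /dev/

Bear in mind robots.txt blocks crawling rather than indexing, so treat it as a complement to the WordPress setting, not a replacement for it.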
-
Hi guys and girls.
Thanks for the input. I'm not a web developer or an SEO expert, I'm a chef, and I found all the content on Moz amazing! Not only has using Moz helped my web rankings, it's made my wife and me take on a whole new approach to marketing.
Not to mention that you're all so happy to help out a novice.
As for making the dev site uncrawlable: if I tick "Discourage search engines from indexing this site" in the Reading settings in WordPress, is that enough, or should I do something else too? I'm focusing all my time on building the new site. Should I run off and learn about the robots.txt file right away, or should it be OK for now?
Thanks for your help!
-
Hi,
If the dev site can't be crawled (which is generally the idea), it doesn't matter what you call the URLs on the test site.
If the URLs on the old site are good, keep the same names for the new one to avoid the need for 301s. If the URLs could be better, though, change them, but 301 the old pages to the relevant new ones.
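For example, on an Apache host a per-page 301 is one line per renamed page in the live site's .htaccess. A sketch, with hypothetical paths:

# .htaccess on sussexchef.com -- one Redirect line per renamed page (example paths only)
Redirect 301 /home /home-sussex-caterers
Redirect 301 /sample-menus /wedding-catering-menus

Each line permanently points the old address at its new home, so existing links and bookmarks keep working.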
Hope that helps
-
Surely if you are building this in dev, the new site isn't crawlable anyway, so I don't know what the issue is.
I'd always recommend building new sites and making sure Google, or any bot or human, cannot see them anyway; there are usually quite a few errors during a build, so you wouldn't want someone to accidentally find the dev site and get a bad brand experience.
But on the URL side of things, build it without the dev in the URL.
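On that note, dropping /dev from the URLs at launch doesn't have to mean days of hand-editing. If your host gives you shell access, WP-CLI's search-replace command rewrites every stored URL in the database in one pass. A sketch, assuming WP-CLI is installed and the dev site lives at sussexchef.com/dev:

# Preview what would change first (writes nothing), then run it for real:
wp search-replace 'https://sussexchef.com/dev' 'https://sussexchef.com' --dry-run
wp search-replace 'https://sussexchef.com/dev' 'https://sussexchef.com'

You'd still update the WordPress address and site address settings, but the links inside your pages are handled for you.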
Related Questions
-
Need help with best practices on eliminating old thin content blogs.
We have about 100 really old blog posts that are nothing more than a short trip review with images. Consequently these pages are poor quality. Would best practice be to combine them into one "review page" per trip, reducing from about 100 to about 10 better pages, and implement redirects? Or is having more pages better, with fewer redirects? We only have about 700 pages total. Thanks for any input!
Intermediate & Advanced SEO | KarenElaine0
-
Do outbound links to manufacturer specs, PDFs help or hurt SEO?
I am creating an e-commerce site. All the products have product certification documents/images, PDF docs for instructions, manufacturer specs, etc. Should I host all this content or simply link to the original documents and content? What is best for SEO? Thank you,
Intermediate & Advanced SEO | Jamesmcd030
-
301 redirection help needed!
Hi all, So if we used to have a domain (let's say olddomain.com) and we had a new site created at newdomain.com, how do we properly set up redirects page to page? Caveat: the URLs have changed, so for instance the old page olddomain.com/service is now newdomain.com/our-services on the new site. Do we need to keep hosting on the old site? Do we need to set up individual 301s for each page corresponding to the new page? Just looking for the easiest way to do this CORRECTLY. Thanks, Ricky
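A sketch of what those page-to-page 301s could look like on an Apache host, assuming the old domain keeps its hosting so it can serve the redirects (the second mapping is hypothetical):

# .htaccess on olddomain.com -- map each old path to its new home, one line per page
Redirect 301 /service https://newdomain.com/our-services
Redirect 301 /about https://newdomain.com/about-us

The old domain does need to stay registered and resolving to a server for as long as you want the redirects to keep working.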
Intermediate & Advanced SEO | RickyShockley3
-
Does anyone know of a Google update in the past few days?
Have seen a fairly substantial drop in Google Search Console. I'm still looking into it and comparing things, but does anyone know if there's been a Google update within the past few days? Or has anyone else noticed anything? Thanks
Intermediate & Advanced SEO | seoman100
-
Can cross domain canonicals help with international SEO when using ccTLDs?
Hello. My question is: can cross-domain canonicals help with international SEO when using ccTLDs and a gTLD, where the gTLD is much more authoritative to begin with? I appreciate this is a very nuanced subject, so below is a detailed explanation of my current approach, the problems, and the proposed solutions I am considering testing. Thanks for taking the time to read this far!

The current setup
Multiple ccTLDs such as mysite.com (US), mysite.fr (FR), mysite.de (DE). Each TLD can have multiple languages; indeed, each site has content in English as well as the native language, so mysite.fr (defaults to French) and mysite.fr/en-fr is the same page but in English. Mysite.com is an older and more established domain with existing organic traffic. Each language variant of each domain has a sitemap that is individually submitted to Google Search Console and is linked from the <head> of each page. So: mysite.fr/a-propos (about us) links to mysite.com/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in French. Each of these URL blocks contains hreflang info for that content on every ccTLD in every language (en-us, en-fr, de-de, en-de, etc.). Likewise, mysite.fr/en-fr/about-us links to mysite.com/en-fr/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in English, each carrying the same hreflang info. There is more English content on the site as a whole, so the English version of the sitemap is always bigger at the moment. Every page on every site has two lists of links in the footer. The first is links to every other ccTLD available, so a user can easily switch between the French site and the German site if they want to; where possible this links directly to the corresponding piece of content on the alternative ccTLD, and where that isn't possible it just links to the homepage. The second list is essentially links to the same piece of content in the other languages available on that domain. Mysite.com has its international targeting in Google Search Console set to the US.

The problems
The biggest problem is that we didn't properly consider how we would need to start from scratch with each new ccTLD, so although each domain has a reasonable amount of content, they only receive a tiny proportion of the traffic that mysite.com achieves. Presumably this is because of a standing start with regard to domain authority. The second problem is that, despite hreflang, mysite.com still outranks the other ccTLDs for brand-name keywords. I guess this is understandable given the mismatch in DA. This is based on looking at search results via the Google AdWords Ad Preview tool and changing language, location, and domain.

Solutions
The first solution is probably the most obvious: move all the ccTLDs into a subfolder structure on mysite.com and 301 all the old ccTLD links. That isn't really ideal for a number of reasons, so I'm trying to explore some alternative routes that might help the situation. The first thing that came to mind was to use cross-domain canonicals: essentially, creating locale-specific subfolders on mysite.com and duplicating the ccTLD sites in there, but using a cross-domain canonical to tell Google to index the ccTLD URL instead of the locale-subfolder URL. For example: mysite.com/fr-fr has a canonical of mysite.fr
mysite.com/fr-fr/a-propos has a canonical of mysite.fr/a-propos. Then I would change the links in the mysite.com footer so that they point not at the ccTLD URL but at the subfolder URL, so that Google would crawl the content on the stronger domain before indexing the ccTLD version of the URL. Is this worth exploring with a test, or am I mad for even considering it? The alternative that came to mind was to do essentially the same thing but use a 301 to redirect from mysite.com/fr-fr to mysite.fr. My question is whether either of these suggestions might be worth testing, or am I completely barking up the wrong tree and liable to do more harm than good?
Intermediate & Advanced SEO | danatello
-
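To make the proposed mechanics concrete, the cross-domain canonical being described would be a single tag in the head of each duplicated page. A minimal sketch, using the question's hypothetical URLs:

<!-- In the <head> of https://mysite.com/fr-fr/a-propos -->
<link rel="canonical" href="https://mysite.fr/a-propos" />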
Site Migration and Traffic Help!
Hi Moz, I recently migrated my website with the help of an SEO company, using 301 redirects. The reason for the move was to change our CMS from .aspx to Drupal/WordPress. The homepage (www.shiftins.com) and the blog (www.shiftins.com/blog) were the only two pages that kept the same URL. Everything else was redirected. It's been about two months since the redirects were completed and traffic has dropped off about 90%. I'm starting to worry that something was not done properly and my traffic may never return. The process for the redirects seemed correct when I checked the SEO company's work: all pages were duplicated, redirected to individual pages, then the old pages were de-indexed. Are there any insights the community can provide? Please help!
Intermediate & Advanced SEO | shictins1
-
How does Google know if a backlink is good or not?
Hi, What does Google look at when assessing a backlink? How important is it to get a backlink from a website with relevant content? For example: 1. Domain/Page Authority 80, website is not relevant; it does not use any of the words in your target term anywhere on the website. 2. Domain/Page Authority 40, website is relevant; it uses the words in your target term multiple times across the website. Which example website would benefit your SERPs more if you gained a backlink from it? (And if you can say, how much more would it benefit: low, medium, or high?)
Intermediate & Advanced SEO | activitysuper0
-
Help, really struggling with fixing mistakes post-Penguin
We had previously implemented a strategy of paying for lots of links and focusing on 3 or 4 keywords as our anchors, which used to REALLY work (I know, I know, bad black-hat strategy; I have since learned my lesson). These keywords and others have since plummeted up to 100 spots since Panda 3.3 and Penguin. So I'm trying to go in and fix all our mistakes, because our domain is too valuable to us to just start over from scratch. Yesterday I literally printed a 75-page document of all of our links according to Open Site Explorer. I have been going in and manually changing anchor text wherever I can, and taking down the most egregious links where possible. This has involved calling and emailing webmasters, digging up old accounts and passwords, and otherwise just trying to diversify our anchor text and remove bad links. I've also gone into our site, edited some internal links (also too heavily weighted on certain keywords), and removed other links entirely. My rankings have gone DOWN more today. A lot. WTF does Google want? Is there something I'm doing wrong? Should we be deleting links from all private networks entirely, or just trying to vary the anchor text? Any advice greatly appreciated. Thanks!
Intermediate & Advanced SEO | LilyRay0