Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Can I 301 redirect old URLs to staging URLs (ex. staging.newdomain.com) for testing?
-
I will temporarily remove a few pages from my old website and 301 redirect them to the staging version of the new domain (e.g., staging.newdomain.com). Once I've confirmed the redirects work, I will remove the redirect rules from my .htaccess file and restore the removed pages on the live site.
Thanks in advance!
-
But what if I remove the 301 redirect rules from my .htaccess file right after (within a few minutes of) successfully checking the redirected URLs?
-
If the redirect to the testing site actually is temporary, this is one of the very few instances I would recommend using a 302 redirect. This way, the engines will understand that you aren't making this a permanent move and will, hopefully soon, remove the redirect when the staging site goes live.
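For anyone implementing this, here is a minimal sketch of what such temporary rules could look like in .htaccess, assuming Apache's mod_alias is enabled; the paths and the staging hostname are placeholders:

```apache
# Temporary 302 (not 301) redirects to the staging site for testing.
# A 302 tells search engines the move is temporary, so they should
# keep the original URLs indexed rather than transferring them.
# Delete or comment out these lines once testing is finished and the
# pages are restored on the live site.
Redirect 302 /old-page-one https://staging.newdomain.com/new-page-one
Redirect 302 /old-page-two https://staging.newdomain.com/new-page-two
```

If the site already uses mod_rewrite, the equivalent would be a RewriteRule with the [R=302,L] flags; either way, keeping the rules grouped under a single comment makes them easy to strip out once testing is done.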
Got a burning SEO question?
Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.
Browse Questions
Explore more categories
-
Moz Tools
Chat with the community about the Moz tools.
-
SEO Tactics
Discuss the SEO process with fellow marketers
-
Community
Discuss industry events, jobs, and news!
-
Digital Marketing
Chat about tactics outside of SEO
-
Research & Trends
Dive into research and trends in the search industry.
-
Support
Connect on product support and feature requests.
Related Questions
-
What is the proper URL length?
I learned that having 50 to 60 characters in a URL is OK, and that shorter URLs are preferred by Google. But since I am going to include keywords in my URLs, I am afraid that will increase the length. Is it going to hurt me slightly? My competitors have an 8-character domain and a keyword length of 13, while my site has a 15-character domain and a keyword length of 13. Which one will be preferred by Google?
White Hat / Black Hat SEO | | calvinkj0 -
Spam Score & Redirecting Inbound Links
Hi, I recently downloaded a spreadsheet of inbound links to my client sites and am trying to 301 redirect the ones that are formatted incorrectly or are just bad links in general (they all link to the site domain, but they use URL formats from the old site, or the link URL has strange stuff appended to it). My question is: should I even bother redirecting these links if their spam score is a little high (i.e. 20-40%)? Each one already links to the existing domain, just with a differently formatted URL. I just want to make sure each one resolves to a valid URL on the site, but I don't want to redirect to a valid URL if doing so will harm the client's SEO. I'm also not sure what to do about the links with the --% spam score. I really appreciate any input, as I don't have a lot of experience with how to deal with spammy links.
White Hat / Black Hat SEO | | AliMac260 -
Old subdomains - what to do SEO-wise?
Hello, I wanted the community's advice on how to handle old subdomains. We have https://www.yoursite.org. We also have two subdomains directly related to the main website: https://www.archive.yoursite.org and https://www.blog.yoursite.org. As these pages are not actively updated, they are triggering lots and lots of errors in the site crawl (missing meta descriptions, and much more). We have no particular intention of keeping them up to date in terms of SEO. What do you think is the best way to handle these? I considered de-indexing, but the content of these pages is still relevant and may be useful - yet it is not up to date, and it never will be again. Many thanks in advance.
White Hat / Black Hat SEO | | e.wel0 -
Can a Self-Hosted Ping Tool Hurt Your IP?
Confusing title, I know, but let me explain. We are in the middle of programming a lot of SEO "action" tools for our site. These will be available for users to help better optimize their sites in SERPs. We were thinking about adding a "Ping" tool based in PHP so users can ping their domain and hopefully get some extra attention/speed up indexing of updates. This would be hosted on a subdomain of our site. My question is: if we get enough users using the product, could that potentially get us blacklisted with Google, Bing, etc.? Technically it needs to send out the ping request, and that would be coming from the same IP address that our main site is hosted on. If we end up with over 1,000 users all trying to send ping requests, I don't want to jeopardize our IP. Thoughts?
White Hat / Black Hat SEO | | David-Kley0 -
Vanity URLs Canonicalization
Hi, So right now my vanity URLs have a lot more links than my regular homepage. They 301 redirect to the homepage but I'm thinking of canonicalizing the homepage, as well as the mobile page, to the vanity URL. Currently some of my sites have a vanity URL in a SERP and some do not. This is my way of nudging google to list them all as vanity but thought I would get everyone's opinion first. Thanks!
White Hat / Black Hat SEO | | mattdinbrooklyn1 -
One page with multiple sections - unique URL for each section
Hi All, This is my first time posting to the Moz community, so forgive me if I make any silly mistakes. A little background: I run a website for a company that makes custom parts out of specialty materials. One of my strategies is to make high quality content about all areas of these specialty materials to attract potential customers - pretty straightforward stuff. I have always struggled with how to structure my content; from a usability point of view, I like just having one page for each material, with different subsections covering different topical areas. Example: for a special metal material I would have one page with subsections about the mechanical properties, thermal properties, available types, common applications, etc. Basically how Wikipedia organizes its content. I do not have a large amount of content for each section, but as a whole it makes one nice cohesive page for each material. I do use H tags to mark the specific sections on the page, but I am wondering if it may be better to have one page dedicated to specific material properties, one page dedicated to specific applications, and one page dedicated to available types. What are the community's thoughts on this? As a user of the website, I would rather have all of the information on a single, well organized page for each material. But what do SEO best practices have to say about this? My last thought would be to create a hybrid website (I don't know the proper term). Have a look at these examples from Time and Quartz. When you are viewing an article, the URL is unique to that page. However, when you scroll to the bottom of the article, you can keep on scrolling into the next article, with a new unique URL - all without clicking through to another page. I could see this technique being ideal for a good web experience while still allowing me to optimize my content for more specific topics/keywords.
If I used this technique with the canonical tag, would I then get the best of both worlds? Let me know your thoughts! Thank you for the help!
White Hat / Black Hat SEO | | jaspercurry0 -
Is using twiends.com to get twitter followers considered black hatting?
Hi, I've been struggling to get followers on Google Plus and Twitter, and recently stumbled upon twiends.com. It offers an easy service that allows you to get Twitter followers very quickly. Is this considered black hatting? Even if Google doesn't consider the followers valid, am I likely to be punished for using their service? Even if it doesn't help rankings, it is nice to have lots of followers so that they will see my tweets, which has the potential to drive more traffic to my site and give awareness to my business. What are your thoughts?
White Hat / Black Hat SEO | | eugenecomputergeeks0 -
DropBox.com High PA & DA?
"What’s up with these dl.dropbox.com High PA & DA links?" You know, it's frustrating to spend almost an entire day getting a few great link backs... then to find out your competitor has hundreds of cheap & easy link backs for the keyword you are going for, with greater Authority [according to SEOmoz's OSE]. So I ran a search on one of our top competitors in Open Site Explorer to gather an idea of where the heck they are getting all of their links. Please feel free to copy my actions so you can see what I see. Run a search in OSE for www[dot]webstaurantstore[dot]com. Click on the ‘Anchor Text’ tab. Click on the first Anchor Text term, which should be ‘restaurant supplies’ :: Then it will expand; click on the ‘View more links and details in the inbound links section.’ As you scroll down the list you will notice that they have a bunch of linking pages from dl.dropbox.com, all of them .pdb files, for their targeted Anchor Text, restaurant supplies. Q: So my question is, can someone please elaborate on what .pdb files are and how they are getting this to work for them so well? Also you will notice, on the expanded Anchor Text page, that their 6th most powerful link for this phrase (restaurant supplies) seems to be linked straight from a porn site; I thought Google does not rank adult sites like this? Q: For future reference, does anyone know legitimate websites where one might file an SEO manipulation complaint? Thanks!
White Hat / Black Hat SEO | | Burkett.com0