How to avoid Sandbox?
-
What is the Sandbox? To avoid something like the Sandbox, you should first know exactly what it is. But nobody knows whether the Sandbox even exists, so let's focus on the real problem: how do I get my pages indexed? Over the years I have tried a lot of techniques, but only one seems to work:
1. If your site is not dynamic, make it so.
2. Create a sitemap and a feed (I recommend RSS 2.0).
3. Reference the sitemap in your robots.txt, on the last line, like this: Sitemap: http://www.yourdomainname.com/sitemap.xml (see the sketch below).
4. Submit the sitemap in the Sitemaps section of your Webmaster Tools account.
5. Submit your RSS feed to the main RSS directories (just Google the term and you'll find plenty of them). Start with FeedBurner, to please Google.
Wait a week or so and your pages will start appearing in the index. Good luck!
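A minimal sketch of both files (assuming the sitemap sits at the domain root; yourdomainname.com is the placeholder above):

```
# robots.txt — the Sitemap directive goes on its own line, typically last
User-agent: *
Allow: /
Sitemap: http://www.yourdomainname.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml: one <url> entry per page you want indexed -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomainname.com/</loc>
  </url>
</urlset>
```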
-
The Google Sandbox is a debated concept from 2004 and 2005 that has never been confirmed, so you shouldn't concern yourself with it too much. Even if it exists, it would only temporarily penalize new domains for their first few months. If you are worried about being penalized, either temporarily or permanently, there are a couple of things you can always do:
1. Create great content
2. Use aged domains
If you concern yourself with making the best site possible and don't worry about making a quick buck, you shouldn't have a problem.
-
We need a bit more info.
I don't believe there is a sandbox as such.
Related Questions
-
Avoid landing page redirects
Avoid landing page redirects for the following chain of redirected URLs: http://domainname.com/ → https://domainname.com/ → https://www.domainname.com/. Does anyone know how to solve this issue the correct way?
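A minimal sketch of one way to collapse that chain into a single 301 hop (assuming Apache with mod_rewrite; domainname.com is the placeholder above):

```
# .htaccess: send http:// and non-www variants straight to https://www in one hop
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.domainname.com/$1 [L,R=301]
```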
Technical SEO | | Sammyh0 -
How do I avoid this issue of duplicate content with Google?
I have an ecommerce website which sells a product that comes in many different variations based on a vehicle's make, model, and year. Currently we sell this product on one page, www.cargoliner.com/products.php?did=10001, and show a modal to sort through each make, model, and year. This is important because, based on the make, model, and year, we have different prices/configurations for each. For example, for the Jeep Wrangler and Jeep Cherokee we might have different products:
Ultimate Pet Liner - Jeep Wrangler 2011-2013 - $350
Ultimate Pet Liner - Jeep Wrangler 2014-2015 - $350
Ultimate Pet Liner - Jeep Cherokee 2011-2015 - $400
Although the typical consumer might think we have one product (the Ultimate Pet Liner), we look at these as many different types of products, each with a different configuration and different variants. We do NOT have unique content for each make, model, and year; we have the same content and images for all of them. When the customer selects their make, model, and year, we just search and replace the text. For example, when a customer selects 2015 Jeep Wrangler from the modal, the page keeps the same URL (www.cargoliner.com/products.php?did=10001) but the product title changes to "2015 Jeep Wrangler". Here's my problem: we want all of these individual products to have their own unique URLs (e.g. cargoliner.com/products/2015-jeep-wrangler) so we can reference them in emails to customers, and ideally start creating unique content for them. The trouble is that there will be hundreds of them, and they have no unique content other than the swapped-in product title and the change of variants. Also, we don't want our URL www.cargoliner.com/products.php?did=10001 to lose its link juice. Here are my questions: My assumption is that I should keep my URL www.cargoliner.com/products.php?did=10001 and let users sort through the products on that page, then create individual URLs for each product (e.g. cargoliner.com/products/2015-jeep-wrangler) but add "noindex, nofollow" to those pages. Is this what I should do? How secure is a "noindex, nofollow" on a webpage? Does Google still index it? Am I at risk for duplicate content penalties? Thanks!
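A minimal sketch of the two tags being weighed here (the URLs are from the question; which one fits is a judgment call, not a recommendation):

```html
<!-- The option described above: keep a variant page out of the index.
     "noindex, follow" is the more common form when internal links should
     still be crawled. Google honors noindex reliably, but only if the page
     itself is crawlable (not blocked in robots.txt), since the tag must be seen. -->
<meta name="robots" content="noindex, follow">

<!-- A common alternative for near-duplicate variants: canonicalize them to
     the main product URL so it consolidates the link equity. -->
<link rel="canonical" href="http://www.cargoliner.com/products.php?did=10001">
```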
Technical SEO | | kirbyfike0 -
How to avoid the "Search instead for" suggestion in Google search results?
Hi, when I search for "Zotey" in Google, the following message is displayed: "Showing results for zotye. Search instead for zotey." Can anyone let me know how to get rid of this conflict asap? Regards, Sivakumar.
Technical SEO | | segistics0 -
Questions about the Sandbox and 301 Redirects
Does the sandbox still exist? What if you have a brand new URL and do a 301 redirect from another website because the name of the service business changed? Thanks for any insight and help.
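On the mechanics of the rename (a sketch, assuming Apache on the old domain; newbusinessname.com is a hypothetical placeholder): a site-wide 301 passes most link signals to the new domain, though a brand-new domain may still take time to rank whether or not a sandbox formally exists.

```
# .htaccess on the old domain: site-wide 301 to the renamed business
Redirect 301 / http://www.newbusinessname.com/
```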
Technical SEO | | SDSLaw0 -
Static site to wordpress - avoiding 301 redirects
We are moving our static website to WordPress. Pages currently end in the .htm extension, and since I have to do all the moving myself and want to preserve link equity, is there any way I can keep the .htm extension on pages in WordPress? I tried a plug-in by Daddy Design, but it seems a bit hit and miss at times. I basically need to keep the URLs the same, as I will not be able to get the vast majority of my links altered to point to new pages, plus I am doing this by myself!
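One approach sometimes suggested (a sketch, untested against this setup): WordPress accepts a custom permalink structure that ends in .htm, which preserves the extension for posts; static pages may still need individual rewrite rules or the plug-in route.

```
# Settings → Permalinks → Custom Structure (applies to posts):
/%postname%.htm
```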
Technical SEO | | Jon-C0 -
What is the value in Archiving and how can I avoid negative SEO impact?
I have been very busy reducing GWT duplicate content errors on my website, www.heartspm.com, built on a WordPress platform. Each month, blog entries are archived, and each month Google generates a duplicate description for the archive. We post 2-3 blog entries per month and they don't really go out of date; most are not news-related but rather nuggets of information on entomology. Do I need to use the archiving feature? Can I turn it off? Should I switch to archiving once per year instead of every month, and how is that done? How do I stop Google from creating its own duplicate meta descriptions for these archives each month? Should I set the archives to NOINDEX, FOLLOW? I'm not the programmer, but I have some technical know-how, so I have a lot of half-baked ideas and answers that could use some polishing. Thanks for your help and suggestions. Gerry
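One common fix for the duplicate archive descriptions (a sketch, assuming a standard WordPress theme; an SEO plugin setting achieves the same without code): mark the date archives noindex,follow so Google drops them from the index but still crawls through to the posts.

```php
<?php
// functions.php — a sketch; hpm_noindex_date_archives is a hypothetical name.
// Marks date-based archives noindex,follow: the duplicate archive pages drop
// out of the index, but links through to the individual posts are still followed.
function hpm_noindex_date_archives() {
    if ( is_date() ) {
        echo '<meta name="robots" content="noindex, follow">' . "\n";
    }
}
add_action( 'wp_head', 'hpm_noindex_date_archives' );
```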
Technical SEO | | GerryWeitz0 -
How can I have pages with media that changes and avoid duplicate content when the text stays the same?
I want to have a page that describes a specific property and/or product. The top part of the page has media options such as video and photos, while the bottom includes the description. I know I can set up the media in tabs separated by JavaScript, so everything resides on one page and there are no duplicate content issues. Example: http://www.worldclassproperties.com/properties/Woodside But what if I need the photos and the videos to have separate URLs so I can link to them individually? For example, for a real estate site blog, I may want to send visitors to the page of the home tour. I don't want to link them to the version of the page with the photos, because I want them to arrive on the video portion. Example: http://www.worldclassproperties.com/properties/Woodside?video=1 Is there any way to get around the duplicate content problem that would result from the duplicated product/property description? I do not have the resources in the budget to make two unique descriptions for every page.
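Two patterns often used here (a sketch; the fragment option assumes the tab script reads location.hash on page load):

```html
<!-- Fragment link: one indexed URL, JavaScript opens the video tab on arrival -->
<a href="http://www.worldclassproperties.com/properties/Woodside#video">Watch the home tour</a>

<!-- If a separate ?video=1 URL must exist, canonicalize it back to the base
     page so the duplicated description is consolidated rather than penalized -->
<link rel="canonical" href="http://www.worldclassproperties.com/properties/Woodside">
```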
Technical SEO | | WebsightDesign0 -
Double byte characters in the URL - best avoided?
We are doing some optimisation on sites in the APAC region, namely China, Hong Kong, Taiwan and Japan. We have set the url generator to automatically use the heading of the page in the URL which works fine for countries using Latin characters, but is causing problems, particularly in IE, when it comes to the double byte countries. For some reason, IE struggles with double byte and displays URLs in their rather ugly, coded form. Anybody got any suggestions on whether we should persist with the keyword URLs or revert to the non-descriptive URLs for the double byte countries? The reason I ask is it's a balance of SEO benefit vs not scaring IE users off with ugly URLs that look dreadful and spammy.
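For context on why IE shows the "ugly" form (example.jp and 記事 are hypothetical): non-Latin characters in a URL path are percent-encoded per RFC 3986, so even a two-character double-byte heading expands into a long encoded string, and older IE versions display the encoded form in the address bar rather than the readable characters.

```
https://example.jp/記事/   →   https://example.jp/%E8%A8%98%E4%BA%8B/
```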
Technical SEO | | Red_Mud_Rookie0