How to avoid Sandbox?
-
What is Sandbox? In order to avoid something like Sandbox, one should know very well what Sandbox is. But nobody knows whether Sandbox even exists, so let's just focus on the main problem here: how do I get my pages indexed? Over the years I have tried a lot of techniques, but I found only one that seems to work. If your site is not dynamic, make it so. Create a sitemap and a feed (I recommend RSS 2.0). Reference your sitemap in your robots.txt (on the last line, like this: Sitemap: http://www.yourdomainname.com/sitemap.xml). Submit the sitemap in the Sitemaps section of your Webmaster Tools account. Submit your RSS feed to the main RSS directories (just google the words and you'll find plenty of them). Start with FeedBurner, to please Google. Wait a week or so and you'll see your pages start appearing in the index. Good luck!
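For reference, a minimal robots.txt along those lines might look like the following sketch (the domain and sitemap path are placeholders; the Sitemap directive must use an absolute URL):

```
# http://www.yourdomainname.com/robots.txt
User-agent: *
Disallow:

# Sitemap reference on its own line, as an absolute URL
Sitemap: http://www.yourdomainname.com/sitemap.xml
```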
-
Google Sandbox is a debated topic from 2004 and 2005 that has never been confirmed. You shouldn't concern yourself with it too much. Also, the Sandbox concept would only temporarily penalize new domains for the first few months. If you are worried about being penalized, either temporarily or permanently, there are a couple of things you can always do:
1. Create great content
2. Use aged domains
If you concern yourself with making the best site possible and don't worry about making a quick buck, you shouldn't have a problem.
-
We need a bit more info.
I don't believe there is a Sandbox as such.
Related Questions
-
Are there ways to avoid false-positive "soft 404s" from Google?
Sometimes I get alerts from Google Search Console that it has detected soft 404s on different websites, and since I take great care to never have true soft 404s, they are always false positives. Today I got one on a website that has pages promoting events. The language on the page for one event that has sold out says that "tickets are no longer available," which seems to have tripped up Google into thinking the page is a soft 404. It's kind of incredible to me that in the current era, with things like ChatGPT, Google doesn't seem to understand natural language. But that has me thinking: are there some strategies or best practices we can use in how we write copy on the page so Google doesn't flag it as a soft 404? It seems like anything that tells a user an item isn't available could trip it up into thinking the page is a 404. In the case of my page, it's actually important to tell the public that an event has sold out, but also to use their interest in that event to promote other events, so I don't want the page deindexed or ranking poorly!
Technical SEO | IrvCo_Interactive
-
.com & .ie websites: how to avoid duplicate blog content?
We have 2 websites, .com & .ie (both are more or less identical, except that they serve 2 different markets). How can I avoid duplicate blog content, as lots of our .com/blog and .ie/blog content is the same? Maybe our main .com blog articles stay indexable while our .ie blog content is made non-indexable? (This way both markets get to view the content, but Google only indexes our .com blog.) Alternatively, I would need to rewrite each article so that it is unique. Advice would be appreciated, thank you.
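One commonly suggested alternative to making the .ie blog non-indexable is hreflang annotations, which tell Google the two pages are regional alternates rather than duplicates. A sketch of what that could look like in the `<head>` of the same article on both domains (URLs are placeholders, not the asker's real paths):

```
<!-- Hypothetical sketch: identical hreflang block served on both the
     .com and .ie version of the article. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/blog/article/" />
<link rel="alternate" hreflang="en-ie" href="https://www.example.ie/blog/article/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/blog/article/" />
```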
Technical SEO | AdvanceSystems
-
How unique does a page need to be to avoid "duplicate content" issues?
We sell products that can be very similar to one another. Product example: Power Drill A and Power Drill A1. With these two hypothetical products, the only real differences between the two pages would be a slight change in the URL and a slight modification in the H1/title tag. Are these 2 slight modifications significant enough to avoid a "duplicate content" flagging? Please advise, and thanks in advance!
Technical SEO | WhiteCap
-
Avoiding Duplicate Content in E-Commerce Product Search/Sorting Results
How do you handle sorting on ecommerce sites? Does it look something like this? For example:
example.com/inventory.php
example.com/inventory.php?category=used
example.com/inventory.php?category=used&price=high
example.com/inventory.php?category=used&location=seattle
If not, how would you handle this? If so, would you just include a noindex tag on all sorted pages to avoid duplicate content issues? Also, how does pagination play into this? Would it be something like this? For example:
example.com/inventory.php?category=used&price=high
example.com/inventory.php?category=used&price=high&page=2
example.com/inventory.php?category=used&price=high&page=3
If not, how would you handle this? If so, would you still include a noindex tag? Would you include a rel=next/prev tag on these pages in addition to, or instead of, the noindex tag? I hope this makes sense. Let me know if you need me to clarify any of this. Thanks in advance for your help!
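For illustration only, the two options the question mentions for a sorted URL such as inventory.php?category=used&price=high would look something like this in the page's `<head>` (a sketch with placeholder URLs; typically you would pick one approach, not both):

```
<!-- Option A: keep the sorted page crawlable but out of the index -->
<meta name="robots" content="noindex, follow" />

<!-- Option B: instead of noindex, point the sorted variant at the
     unsorted category page as its canonical -->
<link rel="canonical" href="https://example.com/inventory.php?category=used" />
```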
Technical SEO | AlexanderAvery
-
Using video transcripts vs. captions and avoiding duplicate content?
Part 1: After editing a YouTube transcript, I typically re-upload it as a caption file (with time codes). For SEO, does it matter whether you upload it as a transcript vs. captions? Is one better than the other? Part 2: If you upload a transcript (or captions) to YouTube, then post that video/transcript on your blog, wouldn't you get pinged for duplicate content?
Technical SEO | vernonmack
-
Avoiding duplicate content with national e-commerce products and localized vendors
Hello 'mozzers! For our example purposes, let's say we have a national cog reseller, www.cogexample.com, focusing on B2C cog sales. The website's SEO efforts revolve around keywords with high search volumes -- no long-tail keywords here! CogExample.com sells over 35,000 different varieties of cogs online, broken into search-engine-friendly categories and using both HTML and meta pagination techniques to ensure adequate deep linking and indexing of their individual product pages. With their recent fiscal success, CogExample.com has signed 2,500 retailers across the United States to resell their cogs. CogExample.com's primary objective is B2C online sales for their highly sought search terms, i.e. "green cogs". However, CogExample.com also wants their retailers to show up for local/geo search, i.e. "seattle green cogs". The geo/location-based retailer web content will be delivered from the same database as the primary online store, and thus is very likely to cause duplicate content issues.
Questions:
1. If the canonical tag is used to point the geo-based product page to the primary online product page, the geo-based product will likely be placed in the supplemental index. Is this correct?
2. Given the massive product database (35,000 products) and retailer count (2,500), it is not feasible to rewrite 87,500,000 pages of content to satisfy unique-content needs. Is there any way to prevent a duplicate content penalty?
3. Google product feeds will be used to localize content and feed Google's product search. Is this "enough" to garner sizable amounts of traffic and/or retain SERP ranks?
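The canonical setup described in question 1 would reduce to a single tag on each retailer page; a sketch, assuming hypothetical placeholder URLs for the Seattle retailer's copy of a product and the primary store's version:

```
<!-- Served in the <head> of the Seattle retailer's green-cogs page,
     e.g. cogexample.com/retailers/seattle/green-cogs/, pointing at the
     primary online-store product page so only one version is indexed. -->
<link rel="canonical" href="https://www.cogexample.com/cogs/green-cogs/" />
```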
Technical SEO | CatalystSEM
-
Penalties on a brand new site: Sandbox time, or rather a problem with the site?
Hi guys, 4 weeks ago we launched a site, www.adsl-test.it. We just did some article marketing and developed a lot of functionality to test and share the results of the speed tests run through the site. We had been for weeks on the 9th Google SERP page, then suddenly for one day (the 29th of February) on the second page; the next day the website's home page disappeared, even for brand searches like "adsl-test". The actual situation is: it looks like we are not banned (site:www.adsl-test.it is still listed); GWT doesn't show any suggestions and everything looks good there; and we are quite high on bing.it and yahoo.it (4th place on the first page) for the "adsl test" search. Can anybody help us understand? Another thing I thought of is that we create a single ID for each test that we run, and these tests are indexed by Google. Ex: www.adsl-test.it/speedtest/w08ZMPKl3R or www.adsl-test.it/speedtest/P87t7Z7cd9. Actually the content of these URLs is quite different (because the speed measured is different) but, it being a badge, the other content on the page is pretty much the same. Could this be a possible reason? I mean, does Google just think we are creating duplicate content, even if it is not effectively duplicate content but just the result of a speed test?
Technical SEO | codicemigrazione
-
Is the Sandbox Real? Need Help!
To start, I'm very new at this, so I've likely made a ton of mistakes, but here is the breakdown of what's happened and what's been done to my site. I own a wedding photography company which was based in Portland; we decided about six months prior that we wanted to relocate to San Diego. It was too soon to optimize our website for our new town of San Diego, so I created a brand new site. It was born around June 2011. It looks just like the old site but all the content is different (different titles, re-uploaded images, text, etc. were optimized for San Diego). What may be my pitfall is that I imported our blog posts from the old site to the new site, and we continued to keep both blogs live (writing the post in one, importing it to the other). San Diego site: http://continuumweddings.com Old site (now optimized for LA): http://continuumphotography.com From there I began link building. I signed up for the SEO Scheduler and began making the changes suggested there. It told me to sign up for Linxboss, and I did it. Other than that, my links have been built naturally and I have quite a few of them, definitely enough to compete with my top competitors. At one point I was #3 for "San Diego Wedding Photographer" and I stayed there for a couple of weeks. Then I began to drop. Now I'm somewhere on page 10. I've read a lot of articles on here, and I know I have a lot of things potentially hurting me: site age, duplicate content, etc. I'm just not sure why I dropped (I still rank on the 1st page in Yahoo & Bing) and what I should do about it. I tend to get overwhelmed, and every post I read seems to talk about something new I may have done wrong. I'm willing to put in the time to fix this; I just need to know where my time is best spent.
Technical SEO | mrsmelmitch