http://www.entireweb.com/express_inclusion/
Anybody used this? There is an express inclusion that you pay for vs. the free option. Wanted to see what the group thinks. Worth it or not? It is not included in the SEOmoz list of directories.
Ditto on what Donnie said. Purple Cow, if you want that site to be an authority, it needs to be authoritative. Why would anyone buy the Washington Post if it just copied all its articles from the New York Times? Get a few staff writers to combine and tweak articles as Donnie mentioned, or to write original content.
Good luck!
Generally speaking, if you transition it correctly and have the exact same site up and running on the new IP before you change the DNS, you should be fine. I did some Googling on the subject, and Mark D. has a much more specific and detailed description of what you should do to make sure you have the exact same site running:
http://malteseo.com/seo/changing-ip-address-without-losing-google-ranking/
What you do not want to do at this point is change your URL structure, title tags, etc. Those changes alone can impact your rankings, and you do not want to compound the issues. Less change, and more gradual change, is always better.
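One way to sanity check the new server before touching DNS, if it helps: request the site directly from the new IP with the real hostname in the Host header. A minimal sketch in Python using the requests library (the IP and hostname here are hypothetical):

import requests

new_ip = "203.0.113.10"        # hypothetical new server IP
hostname = "www.example.com"   # hypothetical site hostname

# Fetch the homepage from the new server before DNS changes by
# connecting to its IP and sending the real Host header
resp = requests.get("http://" + new_ip + "/", headers={"Host": hostname})
print(resp.status_code, len(resp.content))

Note this only works cleanly over HTTP; HTTPS will complain about the certificate, so another common trick is a temporary hosts-file entry pointing the hostname at the new IP.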
Agreed! Like with anything, the answer is, "Well, it depends."
I have to digress on 301s and then bring up soft 404s.
You have to watch out for sending too many 301s to a single page. Sometimes you have to do this, but on sites we manage I have also seen Google show soft 404 errors when a bunch of pages 301 to a single page.
There is a subtle thing we have found in how Google thinks about 301s as they relate to 404s:
http://googlewebmastercentral.blogspot.com/2010/06/crawl-errors-now-reports-soft-404s.html
They state that one thing you should look at to correct soft 404s is 2.b: "Should redirect to a more accurate URL."
Google prefers that a 301 be more of a one-to-one or several-to-one mapping.
There was also a post on SEOmoz about this:
http://www.seomoz.org/blog/301-redirect-or-relcanonical-which-one-should-you-use
If you send too many 301s to a single page - like the home page - it may look like you are trying to manipulate link juice.
I would be so bold as to say Google prefers a one-to-one relationship for 301s vs. many-to-one.
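To make that concrete, here is a minimal sketch (Python/Flask, with made-up URLs - not anyone's actual setup) of one-to-one 301s via an explicit map, instead of a blanket redirect to the home page:

from flask import Flask, abort, redirect

app = Flask(__name__)

# Hypothetical one-to-one map of retired URLs to their closest live equivalents
REDIRECTS = {
    "/old-widgets": "/products/widgets",
    "/old-gadgets": "/products/gadgets",
}

@app.route("/<path:old_path>")
def legacy(old_path):
    target = REDIRECTS.get("/" + old_path)
    if target:
        # Each old URL goes to its specific replacement with a 301
        return redirect(target, code=301)
    # Anything unmapped gets a real 404 rather than a catch-all to "/"
    abort(404)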
Options:
1. After 20 days, show a 200 on the page, update the message to say that the product is no longer available with a link to your search page, and then add the noindex meta tag to that page. This will allow Google to spider the page but remove it from the index. You would also need to remove all links on your site to this page after 20 days. (There is a sketch of this after the list.)
2. Leave the page up for 6 months to 1 year and then set up a 404, since the page is dead, out of the Google index, and the 20-day deal is long gone.
This will get those pages out of the index, tell users where they need to go in case they land on the page, and minimize any 404 or soft 404 errors in Google Webmaster Tools.
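For option 1, a minimal sketch of what that could look like (Python/Flask, hypothetical names; the expiry check is a stand-in for your real 20-day logic):

from flask import Flask

app = Flask(__name__)

EXPIRED_HTML = """<!doctype html>
<title>Deal no longer available</title>
<meta name="robots" content="noindex">
<p>Sorry, this deal has ended. <a href="/search">Search current deals</a>.</p>
"""

def deal_is_expired(deal_id):
    return True  # hypothetical stand-in for the real 20-day check

@app.route("/deals/<deal_id>")
def deal(deal_id):
    if deal_is_expired(deal_id):
        # 200 (so Google can spider it), but noindex drops it from the index
        return EXPIRED_HTML, 200
    return "Live deal page for %s" % deal_id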
I am assuming that since these pages are only up for 20 days, they do not have time to really gain any search traction to start with and so would not show up in the SERPs (or rank that high if they did).
If that is the case, why not put all these pages in a separate folder that you block with robots.txt (example below), and then nofollow all links to them? Keep them out of the index to start with.
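Assuming the deal pages lived in a hypothetical /deals/ folder, the robots.txt entry is just:

User-agent: *
Disallow: /deals/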
Sounds like you are optimizing a category-type page above the product pages anyway, so you would just focus on optimizing the category page vs. the product pages themselves.
Beyond that, I would need more specifics on the how and the why of what you are doing to try and figure out the if and the when of next steps.
On one of our sites, it took about 6 months before things got back to "normal" in the SERPs. As far as how long to leave the 301s: I would say indefinitely. There are links from other sites with old URLs that we still get traffic from, and we want to make sure we keep that referral traffic.
I do not think having the old 301s in place hurts anything, as over time they will be used less and less. You could also look at your server logs, determine that some 301s are not used anymore, and then turn them off.
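A quick sketch of that log check in Python (assuming a standard combined-format access log at a hypothetical path):

import re
from collections import Counter

hits = Counter()
with open("access.log") as f:  # hypothetical log location
    for line in f:
        # Count requests that were answered with a 301
        m = re.search(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" 301 ', line)
        if m:
            hits[m.group(1)] += 1

# Old URLs that never show up here are candidates for turning off
for path, count in hits.most_common():
    print(count, path)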
It just always surprises me when I see Google looking for old pages that we have 301ed for over a year, so I leave them in place.
Good luck.
I would agree with all the comments on how to technically deal with the random pages, but it is a losing battle until you get your website database/templates under control. I once had a similar issue and had to work for months to get a solution in place, as the website would create all kinds of issues like this.
We had to implement a system to minimize the creation of these pages. I think the key is to make sure that any random page request gets a 404 from the start, so the URL never gets indexed in the first place.
That said, for all the random URLs that are already indexed, I like the 200 option with the noindex meta tag. My reasoning: with 404s you get all these error messages in GWT that are meaningless, while the noindex also gets the page out of the index. I have even seen Google retry 404s on one of our sites - crazy. Ever since Google started showing soft 404s for 301s that redirect many pages to a single URL, I only use 301s on more of a one-to-one basis.
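On the 404-from-the-start point, a minimal sketch (Python/Flask, hypothetical IDs) of refusing to serve a 200 for records that do not actually exist:

from flask import Flask, abort

app = Flask(__name__)

KNOWN_PRODUCTS = {101, 102, 103}  # hypothetical set of real product IDs

@app.route("/products/<int:product_id>")
def product(product_id):
    if product_id not in KNOWN_PRODUCTS:
        # A random/garbage ID gets a real 404 instead of an empty 200 page,
        # so the URL never makes it into the index in the first place
        abort(404)
    return "Product %d" % product_id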
Good luck.
Yes, this is a big problem. You are showing Google two pages with the same content. This would be the case for any two pages with the same content on your site.
You definitely need to set up a 301 redirect (and not a 302) from the slug to the main page.
You have to be careful with CMS systems. They may be generating a ton of duplicate pages without you even being aware of it. Check with your IT person, but then look in Google Webmaster Tools and also use other tools to check (e.g. spidering tools).
You want each URL on your site to have a unique title tag, description, and page content.
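For a quick spot check beyond GWT, here is a rough sketch (Python with the requests and BeautifulSoup libraries; the URL list is hypothetical - in practice you would feed it your sitemap) that flags URLs sharing a title tag:

import requests
from bs4 import BeautifulSoup
from collections import defaultdict

# Hypothetical URLs to audit
urls = [
    "http://www.example.com/",
    "http://www.example.com/widgets",
    "http://www.example.com/widgets/",
]

by_title = defaultdict(list)
for url in urls:
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else "(no title)"
    by_title[title].append(url)

for title, pages in by_title.items():
    if len(pages) > 1:
        print("Duplicate title:", title, pages)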
Good luck!
Me too. It was that video that helped to clear things up for me. Then I could see when to use robots.txt vs the noindex meta tag. It has made a big difference in how I manage sites that have large amounts of content that can be sorted in a huge number of ways.
Take a look at
http://www.youtube.com/watch?v=KBdEwpRQRD0
to see what I am talking about.
Robots.txt does prevent crawling according to Matt Cutts.
Just a couple of under-the-hood things to check.
Are you sure your robots.txt is set up correctly? Check in GWT to see that Google is reading it.
This may be a timing issue. Errors take 30-60 days to drop out (from what I have seen), so did they show as soft 404s before you added them to robots.txt?
If so, this may be a sequencing issue. If Google finds a soft 404 (or some other error), then comes back to spider and is not able to crawl the page due to robots.txt, it does not know the current status of the page, so it may just keep the last status it found.
I tend to see soft 404s for pages that have a 301 redirect with a many-to-one association - in other words, a bunch of pages that are all 301ing to a single page. You may want to consider changing where some of the 301s point so that they go to a specific page vs. an index page.
For pages you have in robots.txt because you do not want them in Google, here is what I would do: show a 200 on the page, but put noindex, nofollow in the meta tags.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
"When we see the noindex meta tag on a page, Google will completely drop the page from our search results, even if other pages link to it"
Let Google spider the page so that it can see the 200 code - that gets rid of the soft 404 errors. Then toss in the noindex, nofollow meta tags to have the page removed from the Google index. It sounds backwards that you have to let Google spider a page to get it removed, but it works if you walk through the logic.
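A minimal sketch of that setup (Python/Flask, hypothetical route; Google also honors the same directive sent as an X-Robots-Tag HTTP header, which saves editing templates):

from flask import Flask, make_response

app = Flask(__name__)

@app.route("/sortable-results")
def sortable_results():
    resp = make_response("Results page you want out of the index", 200)
    # Equivalent to <meta name="robots" content="noindex, nofollow">,
    # but sent as an HTTP header
    resp.headers["X-Robots-Tag"] = "noindex, nofollow"
    return resp

Just remember the page has to come out of robots.txt first - Google can only see the 200 and the noindex if it is allowed to crawl the page.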
Good luck!