Assuming you have your canonicals done correctly, the pages will disappear in time.
The pages you want to de-index should have a canonical tag that points to the original.
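For example, the tag on a duplicate page might look like this (the URL here is a made-up placeholder, not your real page):

```html
<!-- In the <head> of the page you want de-indexed -->
<link rel="canonical" href="http://www.example.com/original-page" />
```

The duplicate then passes its credit to the original and drops out of the index over time.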
Yes, you should 301 redirect instead.
But you must have links that point to the index.php URL, or the crawler would not have found it. Fix those links so that they point to "/".
Links that go through a 301 lose a bit of link juice; if you can make your links go directly to the correct URL, you save that juice.
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*index\.php\ HTTP/
RewriteRule ^(.*)index\.php$ /$1 [R=301,L]
I would not de-index the pages either, with robots or WMT.
Links in your site that point to any of these pages would then pour their link juice into un-indexed pages.
Use a canonical tag to fix the problem.
Go to admin, then look at filters; it is pretty easy from there.
Yes, that's correct, or even on the same hosting account.
Remember that if you no-index pages, any link you have on your site pointing to those pages is wasting its link juice.
This looks like a job for the canonical tag.
You won't find pages on the back end; pages are on the front end.
Can you give us some sample links? Thanks.
Why build it with Ajax? Ajax is good for functionality that needs to load seamlessly, but not good for content.
Using escaped fragments seems not to work well, as many are having problems getting indexed.
Does your content have to be loaded via Ajax? Why not load it on the page? It would be much simpler.
Is this the same website with just a change of domain name?
What sort of web server do you have, IIS (Microsoft) or Linux?
Yes.
You don't need both sets of files.
Before the request reaches your pages, it is intercepted by the web server, which checks for any 301 rules. If it finds one for that URL, it will redirect, even if neither of the files exists. This all happens early in the request life cycle.
But as I said before, make things easy for yourself: only 301 the pages that had external links.
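In an Apache .htaccess file, that might look something like this (the paths and domain are hypothetical examples, not your real URLs):

```apache
# 301 only the old URLs that earned external links
Redirect 301 /old-page-with-links.html http://www.example.com/new-page/
Redirect 301 /old-guide/ http://www.example.com/guide/
```

Any old URL without external links can simply be left to 404.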
Remember that a lot of the links you had are now gone; removed or disavowed, you can't expect to rank the same.
Also, how long has it been since you had the penalty lifted? It may take a while for things to even up again.
All redirects lose link juice. Gregory is correct in what he says, but you would lose a bit of link juice in the redirect; better to use the anchor, as it is ignored.
Ajax is great for loading content on demand without reloading the page, but for content you want to rank I would not be using it.
For dynamic content you want to rank, you should load it on the initial request to the server.
Solutions like the one you are using just get very messy, and are more work than doing as I suggest.
I would find any pages that have external links and only 301 those, as there is no use 301'ing pages that have no external links. You can 301 any URL you want; the language is not relevant.
What server are you on, IIS (Microsoft) or Apache? For Apache use .htaccess; for IIS use web.config.
I myself would not be moving to WordPress; you will end up with many more crawling problems.
The best idea is to fix the links so that they point to the correct URL, then 301 the unwanted URL to the good URL.
If you simply use a 301 or a canonical, you will be losing link juice on the redirect.
How many links do you have? Bing may ignore your page if you have more than 250.
If the links are good ones, 301 redirect them to a good page; you don't have to have a blank page at that URL.
If they are bad links, just leave them. If they are 404'ing, they can do you no harm.
The only 404s that can do you harm are ones from your own internal links, because they mean you have link juice leaks. Fix any if you have them.
Any links you can get daily are probably not worth anything; if you can just get a link on demand, why would Google value it?
This would only work if the old site is still live; you need to place the canonical tag in the old page, pointing to the new page, to transfer link juice.
You can only transfer link juice through a request (e.g. a link). If no links point to the old pages, there is no need for the link juice.
Having a canonical link pointing to the same URL as in the address bar has no effect as far as search engines are concerned. The reason moz.com gives for doing this is that if someone scrapes your site, the canonical will point back to the original.
The whole idea of canonical tags and 301s is to do with requests: you want all requests showing the same content to appear as the same page to the search engine.
With normal pages, a trailing slash means a different request than without one, and to fix it you need to create a 301 that requests the correct URL again. In the process you lose a bit of link juice.
But when requesting the home page with or without the "/", the request is the same; there is no need to fix it.
Press F12 in your browser and test it yourself using the Network tab: you can see that entering the URL with or without the "/" on the homepage results in the same request.
301s do take resources, though very little; still, there has to be a limit, and 1,000 is a lot.
Why do you need so many?
With your old site, see which pages had external links and 301 them; there is no reason to 301 those that do not.
Philip gives good advice. I would add that if there is no manual penalty, the algorithm will at most be dismissing links; it would not be penalizing you for them. Not for links, anyhow.
But as Philip said, the domain seems to be parked.
If you have a trailing slash on a URL like domain.com/mypage/, that is a different URL from domain.com/mypage.
If you fix this with a 301, you lose a bit of link juice in the redirect.
But if you are talking about a homepage URL, domain.com and domain.com/ are not treated as different URLs, and there is no redirect between them. There is no problem here; don't worry about it.
Matt Cutts recently said that guest posting is dead. I don't have the link, but I am sure you will find the story if you look.
You will be OK; sites on different TLDs are not treated as duplicate content.
Does the root domain link to the subdomain?
If so, then noindex,follow the pages in the subdomain; this will allow link juice to flow back out of the subdomain.
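A sketch of what that looks like in each subdomain page's head:

```html
<!-- Keep the page out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex,follow">
```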
Any links pointing to a no-indexed page will waste their link juice.
They may already know this.
It is my understanding that pinching someone's content and putting a canonical tag on it does little. If you were to put a canonical tag on your site pointing to theirs, that would be a signal that they are the original content owners.
Having a page on domain.com with a canonical tag pointing to domain.com does nothing.
Having two pages, domainB.com and domainA.com, one with a canonical tag pointing to the other, tells the search engine the other is the one to give credit to.
If you don't have access to the other sites, a canonical on your own site is not going to do you much good.
Those subdomains, being single-page sites, may look spammy to Google. You can put those pages in your own site; there is nothing to gain by using subdomains.
I agree with Robert: one website, I would not worry.
Google is looking for a pattern of manipulation.
Too many links with the same link text is a bad thing, but Google is smart enough to realize that one website is not manipulation.
Your content needs to be in the HTML.
You can hide it with display:none and then show it via JavaScript, but it needs to be on the page.
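A minimal sketch of that pattern (the id and button text are arbitrary examples):

```html
<!-- The content sits in the HTML so crawlers can read it,
     hidden with display:none until the user asks for it -->
<div id="more-content" style="display:none">
  <p>Extra content that should still be crawlable.</p>
</div>
<button onclick="document.getElementById('more-content').style.display = 'block'">
  Show more
</button>
```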
I would not use such plug-ins; I can't see them fixing the problem, and they will further complicate your site and crawling.
WMT can take a long time to update anything.
But the search engines would have that change in their index by now.
Why do you have 3 websites with the same content?
What are you trying to achieve?
Are they on different TLDs, for example .co.uk and .com.au?
Yes, sounds strange.
Here is a tool that may interest you.
http://www.webworkshop.net/pagerank_calculator.php
Have you done any link building recently or in the past?
What do you mean by navigated by cookies?
I would back up your .htaccess file.
Then remove half the rules and test; then you will know which half it is in. Keep doing this until you find it.
I don't know anything about X-Cart, but I know that some CMSs turn off .htaccess, such as Joomla; there may be an interface to turn it back on.
"everything your last post posited was incorrect."
So you say.
"a lot has changed since this article was published in ---> 2007 <--- no-follows weren't even a figment of anyone's imagination back then, let alone a reality"
I wrote that article more recently than 2007, and yes, no-follows were around, but they are irrelevant, as I don't suggest using them and never have.
""Rand and I both tend to believe that it is likely Google has changed and refined the PageRank algorithm many times."
Yes, I agree; that's why I said:
"tests have shown that while Google has changed many things, PageRank still works much the same as it did when Google published its algorithm long ago."
The whole idea of PageRank is the amassing of PageRank on pages due to linking, as stated here on Wikipedia.
http://en.wikipedia.org/wiki/PageRank
I can't find the bit about spelling anymore, but I did find this:
"Meta Keywords (fill them in if you like, keep it short and relevant, but not a big ranking factor)"
Yes, I think this is better than having thin content; your links, if you are lucky enough to get them, would amass on the same page.
You would probably only rank for the static content, at least over time, but that is probably what you want.
I would access this page via POST, not GET, to avoid a query string in the URL.
Because I have got nothing wrong.
I have read the algorithm and I know how PageRank works; my explanation is the same as everyone else's who has read it, including SEOmoz.
http://moz.com/blog/how-pagerank-works-why-the-original-pr-formula-may-be-flawed
"If you're suggesting that no-following a link "keeps" more link equity on a page, you are incorrect. As I mentioned earlier to Kimberly:"
No, I am not; see my first reply.
"All followed links on a page pass link equity, and will reduce the link equity on the page the link is on."
All pages pass 85% of their link juice, divided between the links on the page, and keep 15%, no matter if you have one link or many.
Yes, link position on the page does alter the link juice each individual link passes, but 85% of the PageRank still flows out through the links.
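A quick sketch of that model in Python (the 0.85 damping factor comes from the published PageRank algorithm; the figures below are only illustrative):

```python
# Sketch of the classic PageRank split: a page keeps (1 - d) of its
# rank and passes d = 0.85 out, divided between its outgoing links.
DAMPING = 0.85

def juice_passed_per_link(page_rank, outgoing_links):
    """Rank each outgoing link receives under the original model."""
    if outgoing_links == 0:
        return 0.0
    return page_rank * DAMPING / outgoing_links

# A page with rank 1.0 and 4 links passes 0.2125 through each link;
# with a single link, the full 0.85 flows through that one link.
print(juice_passed_per_link(1.0, 4))  # 0.2125
print(juice_passed_per_link(1.0, 1))  # 0.85
```

Adding more links does not change how much flows out in total; it only splits the same 85% into smaller shares.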
"how to PageSculpt navigational" links..." nobody does this any more, for so many good reasons. "
I would have to ask what those good reasons are?
Internal linking is very important; tests have shown that while Google has changed many things, PageRank still works much the same as it did when Google published its algorithm long ago.
PageRank does amass on pages; there is no doubt about that.
If you no-index a page, any link pointing to that page will waste its link juice.
If you must do that, use noindex,follow so the link juice can flow back out.
If your site is mainly duplicates, then you have a problem, but if it is just a few pages, don't worry.
Google will give credit to one page and disregard the others.
As Andy said, you don't need to buy subdomains; you need only set them up. But simply pointing a domain or subdomain to your site will not help.
Setting up a separate website on the subdomain can help, but only if that subdomain has links pointing to it; even then you would be better off getting the links to your primary domain, so I would not bother.
Bing uses meta keywords; I would keep them.
But don't try to spam them. I think Bing said something like they only use them if they are relevant to your content; adding different spellings and misspelt keywords was a recommendation.