They will be linked to by internal links.
There is no penalty for having duplicates of your own content, but having links pouring away link juice is a self-imposed penalty.
The problem with robots.txt is that any link pointing to a blocked page is passing link juice that will never be returned; it is wasted. robots.txt is the last resort, and IMO it should never be used.
The thing is, there is no penalty on his site itself, only on the bad links. Removing bad links that are already being ignored will do nothing.
"we are taking targeted action on the unnatural links instead of on the site’s ranking as a whole"
I'm not sure you have a problem, why not let them all get indexed?
you need to explain that to them, and outline your efforts
none, but don't expect much.
some good news and some bad news.
Links got long ago can come back and bite you any day.
The fact that google said they have not penalized your site, but rather just the links, is not all good news. It means they have ignored the links; that's why your site has dropped. Removing them will not lead to a rise in rank. Where you rank now is where you will rank without those links, removed or ignored.
At least that is how I understand it. I would still remove them in case I am wrong
do you have any actions against you in WMT?
How did you get 100k likes on Facebook? That's a lot; did you get them legitimately?
I don't know what SlideShare is, but if it is content on another site, then I would get it onto your own site.
I would look at canonical and rel=prev/next.
Also, I would first establish: do you have a problem?
Duplicate content is not always a problem. If it is duplicate content on your own site then there is not a lot to worry about; google will rank just one page. There is no penalty for DC itself. If you are screen scraping, then you may have a problem.
People do it to stop scrapers, but if you're going to write a screen scraper it would not be hard to remove canonical tags as well, so I don't think much of the idea.
Bing recommends that you do not use self-referencing canonical tags. It could be that a self-referencing canonical tag is followed, as is alluded to by Bing, meaning you lose a bit of link juice through the redirect.
There is no ideal length. It's more like writing a play: it's not the length of the play that makes it a success, it's how well you portray the story, how you set the stage, how you draw the characters.
robots.txt is a bad way to do things, because any link pointing to a blocked page wastes its link juice. Using noindex,follow is a better way, as it allows the links to be followed and link juice to return to your indexed pages.
But it is best not to noindex at all, and find another solution if possible.
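For reference, noindex,follow is a meta tag in the page's <head>, as opposed to a robots.txt rule (a minimal sketch):

```html
<!-- Keeps the page out of the index but lets its links be followed,
     so link juice can flow back to your indexed pages -->
<meta name="robots" content="noindex,follow">
```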
Because google will drop that url and crawl the new one by itself.
If all you are doing is changing domain, then one 301 rule will do it all anyhow.
But if you are 301'ing page by page, then just do the ones that have external links.
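In Apache, for example, a page-by-page 301 is one .htaccess line per old URL (the paths and domain here are placeholders for illustration):

```apache
# Redirect only the old URLs that have external links pointing at them
Redirect 301 /old-page.html https://www.newdomain.com/new-page.html
Redirect 301 /other-linked-page/ https://www.newdomain.com/other-page/
```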
It's just a count of internal links. I don't think it is the most helpful stat.
Why it is not reporting all your links I don't know, but I can crawl your site for you using the Bing API and tell you if you have a problem that is blocking the links.
I don't understand: if you look at the source code and see the text url, then that is how you are linking.
Using text rather than an ID gives you a chance to insert a keyword, so text is better than an ID.
Linking to an ID and then redirecting to a text url, as many CMSs do, loses link juice on the redirect.
If mywebsite.com is 301'ed to some other page, then mywebsite.com will no longer be reached by requests, so its content will never be read.
The content is irrelevant when talking about a 301; all a 301 does is redirect the request. When google follows a link to your old page it is redirected to the new page, and the link juice now falls on the new page. The content on the new page is read instead.
Many times when people say "my competitor has a spammy link profile", it turns out that among all the worthless links are a few really good ones.
If you see they are doing anything wrong, report them to google: https://support.google.com/webmasters/answer/35265?hl=en
Having 2 or more domains will not help at all; it does not work like that. You would have to 301 redirect them to a single domain, and that single domain is the only one that will help.
You could raise suspicion, but some sites go viral and build links naturally very quickly.
What I would suggest is that if you are building links quickly, they are probably crappy links. If you can get a link easily, it's probably worthless.
I don't believe there is such a thing as a google sandbox.
Spot on. Unless your site is trying to spam results with DC, you don't have a problem.
And yes it would help google understand what you are doing.
google will find them unless you have hidden them behind noindex, but the number of internal links is not important; what is important is your linking structure.
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
I would not noindex them, as links pointing to noindexed pages waste link juice. I don't think you have a problem. The duplicate or thin content pages may not rank well, but they will not have a negative effect on your home page.
http://www.searchenginejournal.com/matt-cutts-explains-duplicate-content-affects-rankings/82327/
As long as your link text is relevant to the pages you should be fine.
But footer links are discounted by search engines and mostly missed by users. If you want those pages to rank, I would look at putting the links further up the page.
Also, linking to as many pages from the homepage as possible is a good thing. Here is a good read about internal linking:
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
You can 301 any page to anywhere, that's not a problem. But before you waste any time, only 301 the pages that have external links; there is no value in 301'ing pages unless they do.
You are spot on: all you are doing is hiding and showing. If you can see it with Fetch as Googlebot, you have no problems.
I differ in thinking here on one point.
I believe that links pointing to the same page have the same decay as any other link. PageRank flows out of them just the same; obviously it goes to the same page, but some of it is lost every time.
What I do instead, and it looks much better, is use javascript to scroll to that spot:
$('html, body').animate({ scrollTop: $("#mytarget").offset().top - 70 }, 'slow');
This will scroll to your anchor, and the offset lets you position things nicely. Even if I am wrong, the scrolling animation looks and feels much better.
I can see it fixed your problem, but it's an ugly fix. You may need to use parameters in the future; you may already be using them but be unaware.
If the site is returning 200's then that is where the problem lies, you need to find out why.
I can't see any other fix. Removing the urls is only a temp fix; you must make them return 404s.
Yes.
The idea of having a canonical is to point it to another page; many just don't get this.
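In other words, the duplicate page's <head> names the page you actually want to rank (the URL here is a placeholder):

```html
<!-- On the duplicate page, pointing at the preferred version -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```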
That means nobody clicks on them, but how did google find them? This is not evidence there are no links, just that no one has visited your site through them.
Yes you can, but how well multiple locations works, I don't know.
It's hard to get your homepage to rank in one location, let alone landing pages ranking in every location. Worth a try, but hard to do.
This is normal, try some other sites you will see that same pattern.
Bing has stated they don't want every page; they don't want junk, they want only the important pages.
These pages may have links from other spam sites; you don't want them to return a 200.
You want them to 404. In Joomla you can make the site use .htaccess or not; make sure it does, and 404 the pages there.
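As a sketch, on an Apache setup with mod_rewrite enabled you can force those URLs to return a 404 in .htaccess (the URL pattern is a placeholder):

```apache
RewriteEngine On
# Serve a 404 for the old spam URLs instead of letting them return 200
RewriteRule ^old-spam-page\.html$ - [R=404,L]
```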
Links send link juice to other pages, and this is the same for internal links. It is not so much the number of internal links as the pattern of internal links that is important. Using your internal links you can push PageRank onto certain pages (usually the home page). Let's say that your homepage ends up with twice as much PageRank as your other pages; then as your site gets more external links, the incoming link juice will follow that existing pattern.
Reading this page will explain a bit better http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
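The pattern idea can be sketched with a toy PageRank calculation. This is a hypothetical 3-page site of my own invention, using the classic 0.85 damping factor (the 15% decay mentioned elsewhere on this page); it is an illustration, not how google computes rank today.

```javascript
// Toy PageRank: links[i] lists the pages that page i links to.
function pageRank(links, iterations) {
  const damping = 0.85; // 15% of juice decays on every hop
  const n = links.length;
  let pr = new Array(n).fill(1 / n); // start with equal rank everywhere
  for (let it = 0; it < iterations; it++) {
    const next = new Array(n).fill((1 - damping) / n);
    for (let i = 0; i < n; i++) {
      // Each page splits its rank evenly across its outgoing links
      for (const j of links[i]) {
        next[j] += damping * pr[i] / links[i].length;
      }
    }
    pr = next;
  }
  return pr;
}

// Page 0 = home page, linking to inner pages 1 and 2;
// both inner pages link only back to the home page.
const pr = pageRank([[1, 2], [0], [0]], 50);
```

On this toy structure the home page settles at roughly twice the PageRank of each inner page, and that is the kind of pattern incoming link juice then follows.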
When you finally get your urls sorted out, google will sort it out also. In the meantime you may lose ranking, but in the end all will be sorted.
Why are they redirecting back to the home page? Do you redirect them, or are you still infected?
I would make sure they 404
I agree with Chris: if it's not needed, why have it?
As for doing harm, Duane Forrester from Bing advised not to do it, and said that for sites that misuse the canonical tag, Bing will ignore their canonicals altogether.
There is also the line of thought that we know canonical tags do not pass all link juice; just like 301's or any redirect, there is a certain amount of decay (15% in the original google algorithm).
It may just be that when you have a canonical back to yourself, it is followed, and you get that decay unnecessarily.
I see it like this: it's a bad experience to find that an item is out of stock, and the 404 will remove you from the index.
A site where you have a lot of products is probably not ranking well for every product page anyhow. I would be getting category pages to rank, and not worry about a few lowly product pages.
Probably not. You have only a handful of posts, and this is not a problem as far as duplicate content goes.
If you want them all to rank, then don't canonical them, as only one will rank; try adding a paragraph of unique text to each.
Fix the links so that they point to the correct url.
We all have them, so don't worry; google would have ignored them long ago.
Canonical pages don't have to be the same.
google will merge the content to look like one page.
Good luck
OK, if you use follow, that will be fine, but I would look at canonical or next/previous first.
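For a paginated series, next/previous means rel tags in each page's <head>. The URLs here are placeholders, shown as they would appear on page 2 of a series:

```html
<link rel="prev" href="https://www.example.com/category?page=1">
<link rel="next" href="https://www.example.com/category?page=3">
```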
That's correct.
You won't rank for duplicate pages, but unless most of your site is duplicate you won't be penalized.
There is nothing wrong with having duplicate content. It becomes a problem when you have a site that is all or almost all duplicate or thin content.
Having a page that is on every other competitor's site will not harm you; you just may not rank for it.
But noindexing can cause loss of link juice, as all links pointing to non-indexed pages waste their link juice. Using noindex,follow will return most of it, but still, there is no need to noindex.
If you noindex, I don't think next/previous will have any effect.
If they are different, and the keywords are all important, why noindex?
There is no use having the canonical as well.
I would be looking for links within your site that point to https://findmover.in
Also, do you need to use https? If your site does not need to be encrypted, it would run faster as http.