This question has been asked many times, and I would say they are the same if linked well. I would cross-link them so that they do not look like separate sites.
Posts made by AlanMosley
-
RE: The URL Inside
-
RE: Have I been mauled by the Panda?
My experience is that when you rank low in the hundreds, even one or two links from my own C-block will give a big boost, whereas once you enter the top 50 things get much harder. I have added a lot of links, many of them just profile links from Microsoft newsgroups, and it has not budged.
I have pointed my profile links at many pages before and made them move. I have seen sites drop out before, but only temporarily; it has been the best part of a month now.
The subdomain I mentioned has been removed from Google; I checked, and it is consistently showing no pages in the index.
I did see a small temporary jump in rankings, so I am still hoping, although I have read much the same as you suggested: that moving duplicate content to a subdomain is a fix, which suggests it is not the problem.
As for the links, I have another two sites that have much the same links; one has the same subject matter, and they rank much better. You would think they would suffer the same fate.
-
RE: Does a page on a site with high domain authority build page authority easier? i.e. less inbound links?
If I understand you correctly: if a home page has high authority, will your listing on that site rise quickly if you link to it? I have heard it will.
But usually these sites have hundreds of links on the page pointing back to their own pages, so the link pointing to your site will only get a small slice of the juice.
On the odd occasion where the listing page has high authority and few links on it, it may be worth it, but I doubt it is worth searching for.
Link juice is not all a link can give; the relevancy of the linking page and the link text are also worth chasing. So rather than trying what you describe, I suggest just pointing all links at your site and getting the juice, link text, and page relevancy of all of them.
-
RE: Is there a way to manually update the keyword rankings tool in SEOmoz?
If there is, I have never noticed it.
-
RE: Do facebook shares of deep links help the whole domain rank better?
Yes, any value the link gives is spread around the site, but of course the shared page gets the most.
-
RE: How does someone rank page one on google for one domain for over 150 keywords?
Well, if the keywords were "fsafsdaf" and "iosdfosdjfsadlf" and the like, you might have a chance.
-
RE: Have I been mauled by the Panda?
At the time of first posting I had 500+ links; I now have 700+. For the last few years up until the problem I had 100+ (see image).
But your whole argument, that all of a sudden I would drop from first for my business name to almost last when my links have not changed, is not realistic. What facts? You haven't presented any; you came in with a flawed argument, and now you are trying to alter the facts to fit it.
I have heard your theory, and your thinly disguised abuse, and dismissed them both.
-
RE: How to tell OSE about new backlinks
My observation is that if you add a site with existing links, OSE then tries to add the pages that link to that site, as they seem to appear in the next crawl but are not there already.
-
RE: Homepage outranked by sub pages - reason for concern?
Please take the time to read this page; it has a calculator you can play with too. It will make things clearer, I'm sure.
http://www.webworkshop.net/pagerank.html
-
RE: Have I been mauled by the Panda?
Look at the image on the last thread: I had 500+ links, and I have 700+ now, as I have been redirecting some. When I ranked better, I had fewer links.
But I think you're missing the point.
My site did rank much better. Not having good links can explain bad rankings, but not a drop in rankings.
If you look at this search, you will see that sitsolutions.net.au is on the first page or top of the second. I made that site about a week ago; it has no links and almost no content, yet my real site is almost last in the whole index.
I used to come up first. You can't explain that with bad links; I mean, they are better now than they were then.
Content is all original on the root domain.
-
RE: Link Blocks
I doubt it. There are 256 addresses in a C-block, but there are 65,536 in a B and 16,777,216 in an A.
Now considering that one IP number can host thousands of websites, such as DiscountASP hosting, the chance of getting a link from the same B or A is very high, especially in the same city.
I believe the whole C-block thing is overblown, for these reasons:
DiscountASP is a huge hosting company, yet they run all their websites on one IP number.
You can in theory have millions of IP numbers on your network using NAT translation with only one external IP number, and using host headers the number is effectively unlimited. So while I believe that SEs take C-blocks into account, I don't think it's too much of a problem unless you have a high percentage.
I have this problem because I build and host sites myself, so it is of limited use putting my link on each one; in fact it could be harmful.
I wonder if Google takes into account that many people like me, who develop websites and host them, have this problem.
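The address-space arithmetic above is easy to double-check; a C-block (/24), B-block (/16), and A-block (/8) are just successive powers of two:

```python
# Classful block sizes: a C-block is 2^8 addresses, a B-block 2^16, an A-block 2^24.
c_block = 2 ** 8    # 256 addresses (254 usable hosts)
b_block = 2 ** 16   # 65,536 addresses
a_block = 2 ** 24   # 16,777,216 addresses
print(c_block, b_block, a_block)  # → 256 65536 16777216
```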
-
RE: Generating 404 Errors but the Pages Exist
I also see this now and again, but by the next crawl they fix themselves. I assume robots cannot always reach a page, for a number of reasons.
-
RE: Homepage outranked by sub pages - reason for concern?
When searching for site:mydomain.com I also assume that it is in order of rank, but I can't say I know that for a fact.
I would look at my internal linking:
Do all pages link back to the home page? Do they link back to mydomain.com (recommended) or mydomain.com/default.htm?
Do you have a full sitemap on every page? I recommend not.
If every page links to every other page, this keeps link juice even across every page, when really you want to give your home page or landing pages prominence.
The best structure is flat: the home page links to every page, and every page links back to the home page, and to landing pages if you have them.
-
RE: Have I been mauled by the Panda?
Gianluca
I have done a bit more reading and have found that moving to a subdomain has been promoted as a fix rather than a problem, but as many have claimed, subdomains can be treated as separate domains or as true subdomains depending on linking and such, so I am still wondering.
Anyhow, the old subdomain was outdated and removing it was not a problem.
The only other thing I can think of is that many of my links are from the same IP, as they are from sites I built or did SEO work for and also host. If Panda has tightened up on same-IP links this could be a factor, but then, they have always known this, and I don't think the numbers are that great.
I am watching the page count of the subdomain in the index; as the page count decreases I hope to see a rise in rankings. I am also going to try to lift the ratio of true external links to same-IP links.
-
RE: Www and non www how to check it.......for sure. No, really, for absolutely sure!!
This info is really not browser-dependent, just displayed differently.
But as I stated elsewhere, if you PM me the URL I can give you a site-wide report that will show you any canonical problems, or any problems for that matter.
-
RE: How many local citations directory listings are recommended to obtain a month?
My opinion: if I was working for Google, I would expect directory links to come in a wave, as they are manually placed, not organic, so I would not be too concerned. But then, I would not want to look like I was using submission software.
To be safe, do it slowly; there are only so many local directories, and you don't really get to do it again.
-
RE: Www and non www how to check it.......for sure. No, really, for absolutely sure!!
If you want, Robert: if you PM me the URL, I will give you a site-wide check.
-
RE: Www and non www how to check it.......for sure. No, really, for absolutely sure!!
Private Message, OK. Should have been obvious.
-
RE: Www and non www how to check it.......for sure. No, really, for absolutely sure!!
Sha, what does PM stand for? Am I missing something?
-
RE: Www and non www how to check it.......for sure. No, really, for absolutely sure!!
Just a point: you don't need to do a 301 in the .htaccess file.
I work with Microsoft technologies, and we don't use them; .htaccess is a Linux/Apache thing.
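On IIS, the equivalent of an .htaccess 301 is usually a rewrite rule in web.config. A minimal sketch, assuming the IIS URL Rewrite module is installed and using example.com as a placeholder host:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 the bare domain to the www version (requires the URL Rewrite module) -->
        <rule name="Canonical host" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^example\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.example.com/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```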
-
RE: Www and non www how to check it.......for sure. No, really, for absolutely sure!!
The SEO Toolkit sees the same problems as Bing sees. You need Windows, and you need to install IIS (Add Features) first.
http://www.iis.net/download/SEOToolkit
-
RE: Sitemap generator and /index.htm
Remove it.
A 301 will take care of it, but it should not be there.
It is also worth mentioning that sitemaps that are incorrect will get ignored by SEs. I know this is true with Bing from talking with Duane Forrester: if the update dates don't add up, or the links give 404s and the like, they will start to distrust it.
-
RE: Www and non www how to check it.......for sure. No, really, for absolutely sure!!
Not quite sure I understand what you want to check, but as long as one 301s to the other it does not really matter; it may take some time for SEs to catch up.
Are you saying you want to check whether it is resolving correctly?
In IE press F12, then select Network and start capturing; you will see if it is using a 301 or a useless 302. If you want to prove to your client that the developer is not on the ball, do a scan with the SEO Toolkit and show the results; if you don't have Windows to install it on, I will do one for you.
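If you don't have IE handy, the same check can be done with a few lines of Python. This is a sketch (pass your own hostname); it makes a single request and reports the raw status code without following redirects:

```python
# Check whether a host answers with a permanent (301) or temporary (302) redirect.
import http.client

def redirect_status(host, path="/"):
    """Make one request and return (status, Location header), no redirects followed."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

# Usage (placeholder domain): redirect_status("example.com")
# 301 is the permanent redirect you want to see; 302 is the temporary one to avoid here.
```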
-
RE: Blocking URL's with specific parameters from Googlebot
Good to hear. I am glad you persevered.
-
RE: Have I been mauled by the Panda?
How does not having many links send you from top to bottom for my business name?
Nick, if you find my last post, I show that I created a second site with no content, just a link to the first site, and with no links to it, and it ranks on the first page for the business name.
Sure, the linking sucks; it has over 500 links, but a lot are signatures from Microsoft forums.
I apologize if my answers sound a bit abrupt, but the last post sank into the same debate.
But remember, it ranked number one for my business name for the past few years; now it is almost last.
For long-tail terms where I am the only one to have the exact match, I am the very last result. I don't care if others have a better linking profile than I do; having the same words spread throughout your page, but about a different subject, should not rank you above the exact match.
I never did rank well for the competitive keywords, but I was still in the top 10% of results. Now I come absolutely last: not near the bottom, the very last result.
Even if I was always last, you would wonder why the test site I made ranks on the first page with no links.
I asked Google, and they say there is no MANUAL action taken against my site.
Let's assume you rank #1 for your business name and tomorrow you are last, and I suggested that you need more links; would you accept that as an answer?
-
RE: Have I been mauled by the Panda?
Nick, I have been through all the "check your links" and on/off-page stuff; I don't want to get stuck in that debate again. I am no novice. It's not a case of my wanting to rank higher and being disappointed; it's a case of a dramatic change in rankings. When you come almost last for your business name, from first a few months ago when I last checked, and come last for a term you are the only one in the world to have the exact match for, it's more than a few broken links or so.
Yes, the no-index stuff is a good point, but I checked all that. I have no no-index or no-follow tags at all; I have a basic robots.txt.
I must also add that I have destroyed the evidence and made a new site in my panic; I have also 410'ed the whole subdomain.
Gianluca, I had doubts myself, as I have another subdomain that does not seem to be affected (hard to tell), but I just read on the Google forum about someone who had a similar problem, or should I say thinks he has.
-
RE: Root domain registered in search engines, inbound links to www sub-domain. A problem?
No, it's not. You do lose a little from a 301 redirect, but very little.
-
RE: Serp tracking - legal?
Well, Google collects our data; why should you not collect theirs?
Put a button on your website but don't make it easy to find, add some small print including your terms of use, and a disclaimer that if search engines don't want their data scraped they can opt out by pressing the button. Make them fill out a few forms, verify themselves, make them wait four weeks, then send them a no-reply email that does not address their concerns properly, so they have to go through it all again.
-
RE: Root domain registered in search engines, inbound links to www sub-domain. A problem?
I prefer the non-www, as the www is not necessary, but that aside:
Yes, they do differentiate.
Just 301 redirect one to the other, and you will be OK.
-
Have I been mauled by the Panda?
I mentioned my problem a few nights ago, but since then I think I may have found the cause. I have a site that was never really promoted to any great level, but for the main keywords I could find it in the SERPs without clicking through too many pages. I came up first for the company name in both Bing and Google.
Recently I finally decided to promote it and did a bit of a ranking check. First, I still come up first for my company name in Bing, but in Google I come up 900+ out of 1000, virtually last. For my main keyword, the title of my site and well optimized for, I come up last, absolutely last. For long-tail terms from my home page where I am the only site in the world to have the exact term, I come first in Bing and absolutely last in Google. I don't do black hat, but I thought I must be flagged by Google, and I asked for reconsideration; they replied that no manual actions had been taken against the site and referred me to the usual Google guidelines.
It was very frustrating. I then had a thought: I had a long-forgotten subdomain that had a load of duplicate content, a load of Microsoft documentation and other dev stuff from other sources, RSS feeds and the like. Nothing sinister, but duplicate all the same. I am now thinking that this may be my problem. I have 410'ed the whole subdomain, as the site has not been maintained for some time anyhow.
Does anybody know of anything similar: a subdomain causing a loss of ranking for the root domain?
-
RE: Forum Marketing
Just off the top of my head: get an expert in the field to answer questions on the forum for a night. If you don't have many users, aim low to start with.
Once you get a bit happening, you will gain links from people linking to posts.
-
RE: Blocking URL's with specific parameters from Googlebot
Yes, correct. Did you try the other formats?
-
RE: Would other TLDs (Top Level Domains) be helpful?
Yes, but still not as good as promoting the one site.
You seem to like the local idea, so go with that. If at some stage you think it's the wrong strategy, you can always stop promoting them all and concentrate on the .com only.
-
RE: Blocking URL's with specific parameters from Googlebot
Wrong place: go to Diagnostics, then look for Fetch as Googlebot.
-
RE: Would other TLDs (Top Level Domains) be helpful?
Yes it will, but when you have so many sites, how do you get the links?
Getting quality links is hard. Which is better: getting 100 quality links each for 5 sites, or 500 quality links for 1 site? I think the latter.
So we have a trade-off: more links, or the benefit of local?
Toss a coin.
-
RE: Blocking URL's with specific parameters from Googlebot
Try this in robots.txt. I did not think that Google allowed wildcards, but I just read that they do.
Disallow: /*mode=vote*
or
Disallow: /*mode=vote
or
Disallow: /*mode
Then use Fetch as Googlebot in Google WMT to see if it works.
The first in the list seems right to me, but I have seen others do it the other ways.
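As I understand the documented behaviour (an assumption worth verifying with Fetch as Googlebot), Google treats a Disallow value as a prefix match with * as a wildcard, so the three candidate rules can be simulated in a few lines:

```python
# Rough simulation of Google-style robots.txt matching:
# the pattern is a prefix match, and '*' matches any run of characters.
import re

def blocks(pattern, path):
    regex = re.escape(pattern).replace(r"\*", ".*")
    return re.match(regex, path) is not None

url = "/forum/topic.php?id=7&mode=vote"
print(blocks("/*mode=vote", url))                      # → True
print(blocks("/*mode", url))                           # → True
print(blocks("/*mode=vote", "/forum/topic.php?id=7"))  # → False
```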
-
RE: Keyword placement on home page or throughout the website
Sour comment. Sites do not rank, pages do. I understand what you mean, but it is a bit misleading.
I am sure Rand has preached that site theme helps you rank, but my point is that links do help you rank. Having relevant pages pointing to your ranking page helps, external or internal; make sure your linking is clear, so as to show the search engine which page is the important one.
-
RE: Keyword placement on home page or throughout the website
I would not worry too much about the keyword being on all pages, but rather make sure you do not cannibalize in your linking: decide which page you want to rank, and make sure any pages that are relevant link to that page with the keyword in the link text, but do not link away from your main page with that keyword in the link text. Since you want the home page to rank for that keyword, every page on your site should link back to the home page anyhow for the best linking structure.
Search engines will try to work out which page is best; if you link as I suggested, they will easily be able to make the correct decision.
-
RE: Blocking URL's with specific parameters from Googlebot
I don't think you are going to do it in robots.txt; rather, do a 301 from the mode=vote URL to the non-mode=vote one.
If you don't know how to put this into practice, tell me what your site is built with. If it is ASP.NET I will show you how to implement it; if not, someone else should be able to help.
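Whatever the site is built with, the redirect target is just the same URL with the parameter stripped. A framework-agnostic sketch in Python (strip_vote is a hypothetical helper; example.com is a placeholder):

```python
# Build the 301 target by removing the mode=vote parameter from a URL.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_vote(url):
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if not (k == "mode" and v == "vote")]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

print(strip_vote("http://example.com/topic.php?id=7&mode=vote"))
# → http://example.com/topic.php?id=7
```

Your framework would then issue the 301 from the old URL to this target.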
-
RE: Setting up of 301 redirects
It does not matter; a 301 will transfer most of your link juice, but you will lose a bit, as 301s leak a little each time.
But what I would do, rather than a 301, is simply link from all relevant content to the page you want to rank, with relevant link text; SEs will see that that page is the important one for that term.
If it is duplicate, then I would remove it.
-
RE: Would other TLDs (Top Level Domains) be helpful?
That will get you around the duplicate content problem, but how about SEO for so many sites? Are you in a competitive industry?
-
RE: Would other TLDs (Top Level Domains) be helpful?
This is a hard one. Going local is always better; global is hard. But having duplicate content is not going to help, and if you use a 301 or canonical tags, only one site is going to rank.
I would think that duplicate content in another language is useful and should not be a problem, but in English it would be.
Maybe a site for each language, not each country.
-
RE: Pagerank and 301s
I changed domain names on a site and it took months to come back.
I would check to see if the pages have links. If the links are worth it, 301; if there are no links or poor links, just 404 them and start anew. 301s do not pass all link juice, they lose a bit each time, and it can become a management problem over time.
-
RE: Nofollow all outgoing links?
All links.
The first is best, and it decreases as you go along.
The position of the link also counts. SEOmoz did some tests on this (I think it was SEOmoz): the footer does not give much juice, the sides a bit better; the head and body were the best.
-
RE: Is my SEO guy bad news?
Bad English? Did you hire someone from India? I have been down that road; I finally found one that understands what a quality link is.
Blog posting is next to useless, I believe. I think it was Matt Cutts or Duane Forrester who stated that blog posts were not worth it, but good, relevant local directories were, when mixed with a few quality links. This is the path I am following until I have evidence that differs.
-
RE: Decreasing the size of a site to increase SEO value of remaining pages?
No, I would not remove pages.
How PageRank works, in simple terms: imagine that every page is awarded 1 point. You have 3 pages, so your site has 3 points; you add 3 more pages, and your site has 6 points, but divided by 6 pages, so you are back where you started. But by internal linking you can push those 6 points onto one page; you may end up with 1 page with 3.5 points and 5 pages with 0.5 each.
What I would do is choose a page you want to rank for a subject, term, or keyword, then link to that page from all other pages that have relevance.
I once again suggest reading this page
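The "points" idea above can be sketched with a toy PageRank loop. This is a simplified illustration (a made-up three-page site, standard 0.85 damping), not Google's actual formula:

```python
# Toy PageRank: every page starts with 1 point, and pointing every page
# at the home page concentrates roughly half the site's score there.
links = {
    "home":   ["page-a", "page-b"],
    "page-a": ["home"],
    "page-b": ["home"],
}

ranks = {p: 1.0 for p in links}       # 1 "point" per page to start
for _ in range(50):                   # iterate until the scores settle
    new = {p: 0.15 for p in links}    # the (1 - 0.85) base share per page
    for page, outs in links.items():
        share = 0.85 * ranks[page] / len(outs)
        for target in outs:           # each page splits its score among its links
            new[target] += share
    ranks = new

print({p: round(r, 2) for p, r in ranks.items()})
# The home page ends up with about 1.46 of the site's 3 points.
```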
-
RE: 3 weeks after PR8 is Implemented
3 weeks is not that long, although you would think a PR8 site would get crawled quite often.
I have found that Google has become very slow at awarding juice of late. I know that WMT is not the same thing, but that can sometimes be months behind.
There is nothing wrong with being a contributor link in itself; rather, the question is whether you are contributing to a site that is relevant to your site.
-
RE: Nofollow all outgoing links?
You should always link out, in my opinion and in SEOmoz's also, as it states in the on-page report.
Matt Cutts has said that it can be beneficial to link to relevant sites, but there is also another reason.
The Google PageRank algorithm does not award PageRank to what are called hanging pages: a page on your website that has a link to it but does not link back, when PageRank is being calculated for your site. If your whole site does not link back to the internet, then maybe it is seen as a hanging website in the bigger picture.
I assume that linking out gives you some sort of juice for the relevance and authority of the linked-to page, and this may be better than the juice you lose. Another idea is to make sure you have plenty of links to your own site on the page as well, so that you are only giving away a small slice. You may notice that article sites do this: they have maybe 200 links to their own site and one to yours. Also, the first link on a page carries the most weight.
-
RE: Is there an Rss feed for the Q and A section only ?
I doubt it, as most of Q&A is for members only, but it would be good content for your site if there were.
-
RE: Internal link to the home page
Along with Ryan's comments: http://www.domain.com is much more presentable, but the main reason is canonical issues. http://www.domain.com/ and http://www.domain.com/index.htm are 2 different pages in SE eyes, and any rank awarded to them is split.
One reason home pages rank so well compared to other pages is internal linking: if you link to every page from the home page, and back again, your website's PageRank is pushed onto the home page; if you link back to index.htm you waste the PageRank.
For a better explanation read this link; I think anyone interested in SEO should read this page