I think you are correct.
http://www.mattcutts.com/blog/seo-advice-discussing-302-redirects/
These were reviews from BBC, NYT, CNN and NPR
This was done using:
<div itemscope itemtype="http://data-vocabulary.org/Review">
NYT, BBC, NPR and CNN
Not a count of reviews. I will give it a few days to see if they come back.
I am a Perth SEO, and the ones on my site are fine (at the moment).
Are you also from Perth, Bob?
If you have two TLDs, .uk and .au, you will not have to worry about duplicate content, but you should have the regional settings correct: date format, currency and that sort of thing. You might want to host one in Australia.
I just had a read about noarchive and found this: http://www.seroundtable.com/archives/017128.html
It is claimed there that Matt Cutts has said there is no penalty, BUT if you already have spammy signals, it becomes another signal.
So there is some truth in it, but only if you are already a bit spammy.
I also found this video
http://www.youtube.com/watch?v=XhrZKejdmEE
Matt mentions it here: if you have been hacked, the hackers may show the hacked page only to Googlebot, so it cannot be checked, and they also add noarchive to hide it from the cache.
So if Google suspects a hack and you have noarchive, you may have a problem; but he also stated in another video that they will tell you if you have a problem.
I would not be in too much of a hurry.
If you get 10,000 in the next few months and then none for the next year, it will look strange. Try to keep it consistent over time. If I were a search engine, this is what I would be looking for.
Yes, they are seen as two different URLs.
http://perthseocompany.com.au/seo/reports/violation/the-page-contains-multiple-canonical-formats
If you are using a Windows server (IIS), you can fix this easily with the IIS URL Rewrite add-on; it has a rewrite-as-lowercase preset.
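For reference, this is roughly the rule that preset generates (a sketch only; it goes inside <system.webServer> in web.config, and the rule name is arbitrary):

<rewrite>
  <rules>
    <!-- 301 any URL containing an uppercase letter to its lowercase form -->
    <rule name="LowercaseRedirect" stopProcessing="true">
      <match url="[A-Z]" ignoreCase="false" />
      <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>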
Yes, I just had a look, as I removed some HTML from that page and thought I might have taken some of the wrong code out, but no.
But I did find that while messing around I added an empty Organization snippet (see below) and forgot to remove it. It was just above the review, so maybe it upset things.
<div itemscope itemtype="http://schema.org/Organization"></div>
But the stars still appeared in the rich snippet tool, so I doubt this had an effect; I have removed it anyway and will see.
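For comparison, a minimal, valid schema.org Review block looks roughly like this (names and values are illustrative only):

<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="itemReviewed">Example Widget</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of <span itemprop="bestRating">5</span> stars
  </div>
  Reviewed by <span itemprop="author">Jane Smith</span>
</div>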
I am from Parkerville, up in the hills. I have been surprised to see so many Aussies on the forum.
Microsoft Azure has CDN nodes on every continent and works out very cheap.
Along with Rayn's comments: http://www.domain.com is much more presentable, but the main reason is canonical issues. http://www.domain.com/ and http://www.domain.com/index.htm are two different pages in a search engine's eyes, and any rank awarded to them is split.
One reason home pages rank so well compared to other pages is internal linking: if you link to every page from the home page, and back again, your website's PageRank is pushed onto the home page. If you link back to index.htm instead, you waste that PageRank.
For a better explanation, read this link; I think anyone interested in SEO should read this page.
IMO
Relevancy is much more important. Of course, the best is to have both, but think like a search engine: they want to see who is recommending you for a subject. A relevant link in the content is hard to beat; a non-relevant link in the footer is probably worthless.
Sorry, I may have misunderstood.
Are they duplicate content? If so, I would do as I suggested.
If they are individual press releases, then why are they being reported as duplicates? You need to add enough content to make sure that they are seen as individual.
I cannot load the page; it seems to be offline or having some problem, so I don't understand how you are using pagination.
Many say that subdomains are separate, but I have never seen any evidence for this. I have used subdomains and find that they act just like subfolders. Matt Cutts has said that using subfolders or subdomains is a matter of personal choice.
But many people have been moving their duplicate content to subdomains to avoid the Panda update effect. I have no idea if this works to protect the root domain long term, or at all; it may just take time for Google to find it and call it duplicate again.
I would go with the .net myself. Why risk it? Time is money.
As for the 301s: if you are keeping the same site structure, then you can do this with one 301 redirect rule (see the sketch below). If not, I would go to WMT, find the pages with incoming links, and 301 those pages only; there is nothing to gain from 301ing pages with no links. OK, maybe someone has a URL bookmarked and may not find your new site, but it is not too likely that will be a big problem.
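If the structure really is identical, one rule is enough to map every old URL to its twin. A sketch in Apache .htaccess terms (old-domain.com and new-domain.com are placeholders):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]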
You probably have a US IP address; you generally rank for the location of your website.
You can have a look at this Google post on the subject; it may help:
http://googlewebmastercentral.blogspot.com/2011/12/new-markup-for-multilingual-content.html
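The markup in that post boils down to rel="alternate" hreflang links in the <head> of each regional version, along these lines (the URLs are placeholders):

<link rel="alternate" hreflang="en-US" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-AU" href="http://www.example.com.au/" />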
You should always link out, in my opinion and in SEOmoz's too, as it states in the on-page report.
Matt Cutts has said that it can be beneficial to link to relevant sites, but there is also another reason.
The Google PageRank algorithm does not award PageRank to what are called hanging (dangling) pages: pages that have links into them but no links out when PageRank is calculated. If your whole site does not link back out to the internet, then maybe it is seen as a hanging website in the bigger picture.
I assume that linking out gives you some sort of juice for the relevance and authority of the linked-to page; this may be worth more than the juice you lose.
Another idea is to make sure you have plenty of links to your own pages on the page too, so that you are only giving away a small slice: with 200 internal links and one external link, the external link gets roughly 1/201 of the PageRank the page can pass. You may notice that article sites do this; they have maybe 200 links to their own site and one to yours. Also, the first link on a page carries the most weight.
A waste of time.
Write something really good about your subject matter, then either host it on your site and attract links, or find a good blog to guest-post it on that allows links back to your site (not an article posting site; they are useless).
Do you mean pass a title in the query string so they have individual titles? Yes, that would be a good idea. Duplicate titles are a waste of prime SEO real estate.
If, when you choose a different page, the content changes significantly, then of course DO NOT use canonical tags.
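Where the variants really are the same content, each variant carries a canonical tag in its <head> pointing at the preferred URL, something like this (the URL is a placeholder):

<link rel="canonical" href="http://www.example.com/products" />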
I am one who believes that there is no difference between subfolders and subdomains.
There are quotes from Matt Cutts saying there are no differences; another indication is in the Google results, where sitelinks show links to subdomains.
“What’s the difference between using subdomains and subdirectories? When it comes to Google, there aren’t major differences between the two, so when you’re making that decision, do what works for you and your visitors. Following PubCon, our very own Matt Cutts … on his personal blog. In addition to those considerations, if you use Webmaster Tools (which we hope you do!), keep in mind that you’ll automatically be verified for deeper subdirectories of any sites you’ve verified, but subdomains need to be verified separately.”
http://googlewebmastercentral.blogspot.com.au/2008_01_01_archive.html
"Deb, it really is a pretty personal choice. For something small like a blog, it probably won’t matter terribly much. I used a subdirectory because it’s easier to manage everything in one file storage space for me. However, if you think that someday you might want to use a hosted blog service to power your blog, then you might want to go with blog.example.com just because you could set up a CNAME or DNS alias so that blog.example.com pointed to your hosted blog service."
http://www.mattcutts.com/blog/another-two-videos/
The good thing about using TLDs like .co.uk or .com.au is that you don't need to worry about duplicate content; Matt Cutts has said as much.
But if you can't do that, I would look at this page:
http://googlewebmastercentral.blogspot.com/2011/12/new-markup-for-multilingual-content.html
Remove it.
A 301 will take care of it, but it should not be there.
It is also worth mentioning that incorrect sitemaps will get ignored by search engines. I know this is true of Bing, from talking with Duane Forrester: if the update dates don't add up and the links return 404s and the like, they will start to distrust the sitemap.
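For reference, a single entry in the sitemaps.org format; the point is that lastmod should match when the page actually changed and the loc URL should not 404 (the values here are illustrative):

<url>
  <loc>http://www.example.com/page.html</loc>
  <lastmod>2012-03-01</lastmod>
</url>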
Well, we may all see a rise in rankings as their users take a hit; good news.
Let us know if you see some evidence of increased rankings, as this was not the only network to have been taken down lately.
Click 'start capture'; then, when you load the page, it will list all requests and their status codes.
This really sounds like the hosting company shifting the blame to you.
Why not change hosting? I would do it quickly.
Do you have an image upload facility, seeing as the pages are in your image folder?
If so, you can add some code to make sure it really is images that are being uploaded.
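A minimal sketch of that check in Python, using the Pillow imaging library (the function name and the allowed formats are my own choices, not a standard API):

from PIL import Image

ALLOWED_FORMATS = {"JPEG", "PNG", "GIF"}

def is_real_image(path):
    """Return True only if the file parses as one of the allowed image types."""
    try:
        with Image.open(path) as img:
            img.verify()  # raises an exception if the data is not a valid image
            return img.format in ALLOWED_FORMATS
    except Exception:
        return False

Checking the actual file contents this way stops someone renaming a script to .jpg and sneaking it through the upload.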
James is correct; it is a lot of effort that you could be putting into your primary site.
If some of them have good exact-match keywords in the domain, then it may be worth the time.
I have a few sites selling the same product, but I did this because I already had my primary site performing at its optimum, so putting time into a second site to crowd the results made sense. I would be trying to get the first site to the top first, though.
Your robots.txt is blocking 570 internal links.
You also have 442 unnecessary redirects. An example:
The link to "http://www.psbspeakers.com/articles/HCC-Imagine-home-theater-review" has resulted in HTTP redirection to "http://www.psbspeakers.com/articles/HCC-Imagine-Home-Theatre-Review".
You should 301 redirect to all-lowercase URLs, and make sure that all your internal links point to the lowercase version.
17 likes since 2009?
12 tweets?
It seems like not even the owner's family and staff could be bothered.
Where is the evidence here that news releases work?
No one is claiming that they never, ever work, so even if you could produce a few good examples, it is hardly evidence. It is my opinion that you are better off placing the content on your own site.
I'm 50c each way, to tell the truth, the more I think about it.
Buying links is a no-no.
But Matt Cutts has stated that paying for a review is OK. In fact, a directory that charges for a review and does not accept just anyone is better.
I would back up your .htaccess file.
Then remove half the rules and test; then you will know which half the problem is in. Keep halving like this until you find it.
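A quick way to do the halving without deleting anything is to comment rules out, test, and narrow down; the rules below are hypothetical stand-ins:

RewriteEngine On
RewriteRule ^old-page-a$ /new-page-a [R=301,L]
RewriteRule ^old-page-b$ /new-page-b [R=301,L]
# Second half disabled for this test round:
# RewriteRule ^old-page-c$ /new-page-c [R=301,L]
# RewriteRule ^old-page-d$ /new-page-d [R=301,L]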
I would be more worried about stuffing into one tag, not so much about how many products have 't-shirt' in the name.
Don't have an H1 with "t-shirt t-shirt t-shirt t-shirt t-shirt", but I think it is OK to have several tags, each with 't-shirt' in there somewhere. It is quite expected and natural to have several products with similar names.
As for self-cannibalization, picture this: you have a category page with links to t-shirt products with 't-shirt' in the link text, but on each product page you have a link pointing back to your category page with 't-shirts' in the link text. You will actually raise the PR of the category page, as well as give relevance for the term 't-shirt' to the category page.
Another good answer from Matt.
If you are talking about one or two links, then don't worry; do what is good for your users. If it is something done en masse to try to trick the search engines, then I would think again.
Hi Urban Fox, I think we talked about a similar topic a while ago. I have got a bit more info lately from Matt Cutts.
If you have domains hosted in different countries, let's say .au and .uk, with duplicate content, that is not a problem; but subdirectories with duplicate content are.
He did mention that you should have your dollar and pound symbols, as well as date formats and the like, correct for each TLD.
Good answer.
The only time I would do anything with an extra domain name is for testing; and if you have an exact-match domain, then it may be worth using as a landing domain.
Is there any sign of a penalty, such as a huge drop in rankings? If not, then I would not worry too much about the links; they may be worthless, but it is doubtful they are doing harm.
You can ask Google for reconsideration from GWMT, but I doubt a penalty is the case unless your rankings have shown a huge drop; it does not hurt to ask, though. I would be more worried about on-page stuff: make sure your site is not hiding keywords and the like.
I found the reference on how to use the url property:
http://schema.org/docs/gs.html#schemaorg_expected
see "Using the url property"
Yes, page and domain authority are the way to go, along with link text and page relevancy.
OSE does not crawl the whole web, so you will not get all the links. It does try to get as many quality sites as possible, and the better pages from those sites, so a missing link is probably not a top-notch, excellent link, but it could still be a good one. There is no backlink report that shows all links, so you have to make do with a sample.
I have not tested this, but it could have merit.
Putting the link in the footer may limit self-cannibalization, and you will be getting more back. Except I remember that links back to the home page do not pass much credit for their link text.
http://www.seomoz.org/blog/testing-the-value-of-anchor-text-optimized-internal-links
Simply linking from your home page to more than one child page, and back, will raise the PageRank of your home page but lower the PR of the child pages.
This PR calculator will show that:
http://www.webworkshop.net/pagerank_calculator.php
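If you would rather see it in code, here is a small sketch of the classic (un-normalised) PageRank iteration for a three-page site; the page names are illustrative:

# Home links to two children; each child links back to home.
DAMPING = 0.85

links = {            # page -> pages it links to
    "home":   ["child1", "child2"],
    "child1": ["home"],
    "child2": ["home"],
}

pr = {page: 1.0 for page in links}   # flat starting scores
for _ in range(50):                  # iterate until the scores settle
    pr = {
        page: (1 - DAMPING) + DAMPING * sum(
            pr[src] / len(links[src]) for src in links if page in links[src]
        )
        for page in links
    }

print(pr)  # home ends up around 1.46, each child around 0.77

Starting from a flat 1.0 each, the home page finishes well above the average and the children below it, which is exactly the effect described above.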
I think you were right in the first place.
If the article is good enough that they would want it on their site, then I think you should keep it on yours. The amount of traffic you will get from an article buried somewhere on such a site would be very limited, if any, IMO.
Do you mean the numbers show in the SERPs? Do you have description meta tags?
Search engines used to use the description meta tag, but these days they pick and choose when to do so; if you are getting the numbers in the SERPs, then they are making a bad choice. This will hurt your conversions, as people don't like to click on a messy SERP listing.
I went to your site, netbet, and found many broken links; this is probably a bigger concern.
Well, you would want to make the page link-worthy. I know that is not always easy to do; not everything can be wow-whoopee exciting.
You can also make sure your internal links use the keywords in the link text, and link to the page often from other pages. Don't go unnatural, but make sure the link juice flows to it.
This page may be of help:
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
Looks like you have dug yourselves a hole.
You could, with a bit of work, detect where the visits have come from and then show prices relevant to the visitor; you will not be done for cloaking in this case. There are geolocation solutions out there for the detection, and you could write a function that sorts out the price, then replace all the prices with calls to it, something like @GetCorrectPrice(19.99).
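A rough sketch of that function in Python; lookup_country is a hypothetical stand-in for a real geo-IP lookup (e.g. a MaxMind database), and the currencies and rates are made up for illustration:

def lookup_country(ip_address):
    # Stand-in: a real implementation would query a geo-IP database here.
    return "AU"

# Currency symbol and exchange rate per visitor country (illustrative values).
CURRENCIES = {"AU": ("AU$", 1.00), "GB": ("£", 0.65), "US": ("US$", 1.05)}

def get_correct_price(base_price_aud, ip_address):
    """Format a base AUD price in the visitor's local currency."""
    symbol, rate = CURRENCIES.get(lookup_country(ip_address), CURRENCIES["AU"])
    return "%s%.2f" % (symbol, base_price_aud * rate)

print(get_correct_price(19.99, "203.0.113.7"))  # -> AU$19.99 for an AU visitor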
As far as I can tell about cloaking, by the way: you don't get done for cloaking unless you are trying to deceive. If they detect cloaking, they will have a human look at why and what you are doing.
Maybe, but I would put that one on your site also. The page will probably end up so deep in their site that the links would carry very little link juice, and there will probably be so many links on the page pointing back to their own site that very little would be shared with your links.
On the other hand, adding a page to your own site boosts PageRank just by existing, and with smart linking it will lift your home page's rank for as long as it is there. See http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
I don't think any of those points is giving him the high rank.
Reciprocal links are OK if the sites are relevant and have authority.
Nofollow links are not going to help, and article posting is quite worthless according to Google.
I would say that he has a few good, relevant links that may not be obvious.
Will this lead to hordes of untidy, spammy Facebook comments?
It is almost certain that the penalty is carried over. If I were a search engine, I would certainly make sure that loophole was not available.
If you know the domains that are affected, you can redirect conditionally to fix the problem.
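A sketch of that conditional redirect in Apache .htaccess terms: send every host except the affected domain on to the main site (the domain names are placeholders):

RewriteEngine On
RewriteCond %{HTTP_HOST} !^(www\.)?affected-domain\.com$ [NC]
RewriteCond %{HTTP_HOST} !^www\.main-site\.com$ [NC]
RewriteRule ^(.*)$ http://www.main-site.com/$1 [R=301,L]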
You have a few link juice leaks through unnecessary 301s; every 301 leaks a little link juice.
This one occurs about 1,000 times that I found:
The link to "http://www.carbodypanels4u.co.uk/catalogsearch/result/" has resulted in HTTP redirection to "http://www.carbodypanels4u.co.uk/".
That's a fair bit of leaked juice that should be going back to your home page.
You also have over 1,000 images without alt text; get some keyword-rich text in them.
Sounds like they are worth nothing. He is giving them footer links that are not relevant; I doubt they are giving him anything better back.
I don't think it was the Freshness update.
The PageRank you see in your toolbar is updated only a few times a year.
Over the last year you may have lost links, maybe in the Panda update; Panda got rid of a lot of low-quality links.
A website would have to be configured to accept that domain using host headers, or to have that domain pointed at an internal IP address, so except for the unusual setup, that trick would not work in the vast majority of cases.
If 301ing good links works, then why not bad ones?
Yes, add a filter so that your hits don't count.
Click 'edit' next to your domain name.