Posts made by GPainter
-
RE: Has anyone ever used a specific page (instead of home page) when building links in directories?
What you are referring to is deep linking. It's normal, as it helps build authority across the whole domain rather than just the homepage, but it's a tricky line to walk. The same goes for directories: it can be a fine line. It's not whether you think they are spam directories, it's whether Google thinks so, and whether you are willing to bet a penalty on it (or risk a competitor reporting your directory links and you getting a penalty that way).
-
RE: Duplicate content errors
You need the full URL, so the tag should look like this -
<link rel="canonical" href="http://www.letspump.dk/produkter/5-aminosyre/" />
-
RE: Rel=prev/next and canonical tags on paginated pages?
Yes, which is why I thought the first page might be a bit more helpful as a reference point.
-
RE: Rel=prev/next and canonical tags on paginated pages?
Okay, technically you should have a "view all" page and canonical to that, which is what that is referring to. As you've got so many pages it is still possible to do that, but the view-all page may suffer from load times etc. So if you were to do it by the book, you would have the rel=prev/next tags plus a "view all" page which lists all the content, and you would then canonical to that.
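As a rough sketch (the URLs here are made up for illustration), every page in the series would point its canonical at the view-all page:

```html
<!-- In the <head> of /products/page-2/ (and every other paginated page) -->
<link rel="canonical" href="http://www.example.com/products/view-all/" />
```

The rel=prev/next tags then go on the paginated pages as normal.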
-
RE: Rel=prev/next and canonical tags on paginated pages?
It shouldn't matter how many pages there are, though it might be beneficial to categorize them or similar to help users. You can canonical to the first page, or you can canonical to a page that's the same or very similar.
There are many helpful facts at the link above.
-
RE: My www. domain has less page authroity than my non www.
There is a difference between http://www.example.com | http://example.com and http://www.subdomain.example.com | http://subdomain.example.com
You can still redirect and it will work the same; it's just a bit out of the normal. You will still see some juice pointing towards the old site, but realistically it's getting redirected to the new site. A little confusing, I know.
No, if you've got subdomains for translations you don't want to 301 them; you want the rel="alternate" hreflang tag. This will stop duplicate content but will let each site be a bit more independent, as well as keep the content where it is, whereas a 301 would show the same content and not be as helpful for the users.
In short -
- 301 the www. or non-www. version into its counterpart
- use rel="alternate" hreflang for subdomains that are for alternative languages etc.
- Don't panic about one having more or less authority; it can change as you work on an SEO strategy.
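As a rough illustration (the subdomains and language codes here are hypothetical), each language version would carry a set of tags along these lines in its <head>:

```html
<!-- Repeated on http://example.com/ and on each language subdomain -->
<link rel="alternate" hreflang="en" href="http://example.com/" />
<link rel="alternate" hreflang="de" href="http://de.example.com/" />
<link rel="alternate" hreflang="da" href="http://da.example.com/" />
```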
-
RE: My www. domain has less page authroity than my non www.
I think you are confusing yourself a bit there. Once you 301 redirect it, all the authority is redirected too; you don't need to worry about putting a page or anything on it.
Google will re-crawl the links, find the redirects and then move the authority over to the new location.
-
RE: My www. domain has less page authroity than my non www.
You can have either www. or non-www., it really doesn't make a huge difference. When you find one you like, you can redirect the other into it and tell Webmaster Tools which is your preferred domain. So find your preference of domain (it's not a big deal which has more or less PA, though if one is getting links naturally it's probably a good idea to stick with it).
The redirect will fix any issues, so don't worry.
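If the site happens to run on Apache (an assumption, with example.com standing in for your domain), a minimal non-www to www 301 in .htaccess would look something like:

```apache
RewriteEngine On
# Send the non-www host to the www host with a permanent redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```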
-
RE: XML Sitemap Generators
The following post might be helpful - http://moz.com/community/q/online-sitemap-generator
You can also use Screaming Frog to create sitemaps.
-
RE: Rel=prev/next and canonical tags on paginated pages?
First off you might find this page handy - http://googlewebmastercentral.blogspot.co.uk/2011/09/pagination-with-relnext-and-relprev.html
Canonical and pagination solve a similar problem (sort of), so you don't need both.
Canonical is for when you've got a few pages that are the same and you're telling Google "these are all the same, but here is the original".
Pagination is telling Google "these pages belong together in a sequence; here is the previous page and here is the next page".
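For example (the URLs are made up), page 2 of a three-page series would carry this in its <head>:

```html
<link rel="prev" href="http://www.example.com/articles/" />
<link rel="next" href="http://www.example.com/articles/page-3/" />
```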
Now there is no harm in having both on a page, especially if you've got some parameters; you should be safe. Plus, duplicate content is not the worst thing to face, and it's not going to cause that much harm if you've got a couple of pages duplicated.
Hope that helps.
-
RE: Google is showing 404 error. What should I do?
Screaming Frog crawls your site "live"; Moz & Google don't, so it should show you any current 404 errors. The 404 is showing because the page is not there; you would 301 a 404 page to change it from a 404 to a 200 code.
In short, a 301 is there so that a user lands on the page and, rather than finding nothing, they find something.
Are you sure page "A" is working, and are you sure it's page A that's the problem and not something odd like page "A.php" etc.?
Run Screaming Frog and see if that helps with finding 404s.
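As a sketch, assuming an Apache server and made-up paths, a single dead URL can be pointed at its replacement with one line in .htaccess:

```apache
# Send the missing page to its closest replacement with a 301
Redirect 301 /old-page.html http://www.example.com/new-page/
```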
-
RE: Google is showing 404 error. What should I do?
You can use Screaming Frog to find the 404 pages; from there you can either 301 them into the correct pages, or into a very similar page, to help prevent the issue. You should also create a custom 404 page: this is a page that shows a message like "oops, we can't find your page" and then directs users (and bots) to a similar page or the homepage, so it's not such a dead end.
I'm a bit confused as to what you mean by "what to do". You can either fix them by putting content on them or 301 them to the correct page.
Hope that helps.
-
RE: Onpage grader
The ticks mean that is fixed on your site ("already present"), so that's good. More info can be found here -
http://moz.com/help/guides/research-tools/on-page-grader
As for how to fix issues, it should tell you just under each problem how to fix it. If there are any you are struggling with, let us know and I'm sure we can help!
-
RE: Can anyone help me detect some SEO improvements onpage please...
Have you tried the Moz page grader?
Screaming Frog is also a handy tool for your SEO tool belt.
I highly recommend you try the page grader.
-
RE: Is there a way to get your local SERP by zipcode?
As it seems a few people are responding with "I don't know", here are some examples -
Number 1 -
Number 2 -
A proxy (HMA in this example)
Number 3 -
Number 4 -
https://www.georanker.com/#/ <- Found after a quick Google
Hopefully one of those is helpful and no one has an excuse for "I don't know". Feel free to add any more you find helpful.
-
RE: Is there a way to get your local SERP by zipcode?
Hi,
Sorry, the proxy is another thing; try looking at Hide My Ass.
As for the Moz overlay, I can't stand the Chrome version one bit, so I'm not sure where it is there (I believe in the top area); I'm happy with the Firefox version. Here it is in Firefox - http://imgur.com/3XvXZad
-
RE: Is there a way to get your local SERP by zipcode?
You can try a proxy and use it to view the results by region. The Moz toolbar (I'm using Firefox) has a control panel that lets you customize the search results by region (though I'm not sure how accurate this is).
I believe some tracking software can also do it for you too - Serpbook is one such.
That's the first few that come to mind; hope one helps.
-
RE: Does anyone else have issues with Moz's keyword search volume tool for Google's search engine?
Don't forget Google Trends; it can be another handy tool in your belt.
Some other handy tools are -
http://www.keywordspy.com/ - no personal experience with this one, I'm afraid.
So you can put together an educated guess at volumes, but there is no definitive list.
Good luck!
-
RE: Does anyone else have issues with Moz's keyword search volume tool for Google's search engine?
Hi Patrick,
Google's a tricky beast and it doesn't give up details easily nowadays, so Roger uses Bing to help you with search volume. This isn't given to you via Bing to mislead you; it's so you can get a rough idea of what the volume is. You can still use Google's AdWords Keyword Planner to get some volumes, but there is no reason to believe even that is completely correct.
You can find more info here- http://moz.com/blog/keyword-research-and-targeting-without-exact-match-whiteboard-friday
You can even see in the comments I asked-
"Quick question you mention Moz's (great) keyword checker which also has search numbers including exact. How is this extrapolated? when I mean is, is this going to be incorrect with the new update too?"
"Our data comes via Bing, who still provide exact match data (and keyword referral data, thankfully!)." - Rand Fishkin
Hope that helps.
-
RE: How to use robots.txt to block areas on page?
Hiya,
First off, the main answer is here - http://moz.com/learn/seo/robotstxt
An alternative solution might be use of the canonical tag, meaning you keep all the link juice rather than letting it fall off the radar. I wouldn't be overly worried about duplicate content; it's not a big bad wolf that will annihilate your website.
The best idea if you're worried about duplicate content is the canonical tag: it has the benefit of keeping link juice, whereas robots tends to mean you lose some link juice. One thing to remember though is that the canonical tag means those pages will not be indexed (the same end result as the robots tag), so if they are ranking (or getting page views) that's something to bear in mind.
Hope that helps.
Good luck.
-
RE: Sending Domain Authority from Root www domain to *
As well as LindaLV's (excellent) advice, you can also set up your preferred domain in Webmaster Tools; you will need to add both the www. and non-www. versions to do this. WordPress normally auto-redirects between www. and non-www.; a quick test of this would be to simply go to both the www. and non-www. versions and see if it redirects.
If you've set up WordPress on e.g. the non-www version, you can go to Settings > General and select the www. version, but as I mentioned it should auto-301 in WP; if not, it's easy to fix!
-
RE: What is the difference between to all Panda updates or algorithm?
The percentage is how many queries are affected; it means that 11.8% of queries saw a change (this can be positive or negative, or a penalty etc.).
The reason there are Panda 1 and 2 is that it's a bit like a software update: it gets better, so it gets a new number each time. If you want more in-depth material on Panda, there are loads of resources across the internet and Moz, only a quick search away.
Hope it helps.
-
RE: Will having duplicate content on four websites cause a problem?
Use the canonical tag; it will let you keep the pages where they are (as well as visitors) but tell Google which page is the original. As for harm, it depends on how many pages; I wouldn't see it being the greatest harm, but it's so easy to put the tag on that you might as well do it. A quick heads up, though: by putting the tag on the pages, one page will rank but the others will not, so be aware of that.
Hope that helps.
-
RE: Country sites - TLD's or sub directories?
Hi there,
First off this sounds perfect for what you want to know - https://support.google.com/webmasters/answer/189077?hl=en
Now as for TLDs etc., it's preference - the bonus of having it all on one site is that all the link juice accumulated goes to one domain, rather than you trying to optimize 3-4 different sites. You can always secure the TLD and move the site across at a later date if you need to. Whilst a regional TLD does give a minor boost, it wouldn't mean much if you've spread yourself too thin across a few sites to really help any one of them rank.
My recommendation would be to get your main site working with the rel="alternate" hreflang tags in place for the subfolders; then you can work on moving one language to a new site (e.g. .de), get that site working really well, then move on to another site etc. if you want to do that.
Hope that helps.
-
RE: Canconical tag on site with multiple URL links but only one set of pages
Okay, you lost me a little, but let me see if I can help.
First off, the canonical tag - it's fantastic for duplicate content (even across other sites), but not so good if you don't have duplicate content.
301s - very similar to the above; they work well with duplicate content but are not essential. You can 301 a few pages into one page, so if a user types a URL in (or even has it as a bookmark etc.) they will land on the page you want. It's normally a good idea to 301 into similar pages so you don't get users thinking they are going to buy (e.g.) a pair of boots and landing on a page about t-shirts.
Google getting lost - don't worry about Google getting lost; if a user can get around, so can Google. Plan, plan and plan again (you can even draw flow diagrams) so you know where everything goes to and from until you are happy. You can also get someone who doesn't know your site to test it and see if they get lost.
Hope that background helps a bit. You lost me here -
"Is there a way that I can redirect everything except the target URL without setting up a separate master site just for all the other pages to point at."
Why can't you redirect all your pages to the target URL?
One helpful tool I recommend is Screaming Frog; it can help you pick up redirects, 404s etc.
-
RE: Nominet have made the geographic new TLD available for UK. How will this affect SEO?
Let's ignore the technical point of view for just a second (I know how easy it can be to get caught up) and look at it from a different angle -
Which option is easier to brand and for the user? If you've got a .co.uk you can secure the .uk and, as you mentioned, 301 it - job done, no worries. It will also stop some competition with the same name; the same goes for .com, really.
You don't get any real benefit from any of the TLDs; they all work the same, so don't panic!
I wouldn't worry about Nominet; they are not some large evil corporation trying to get money, they are trying to free up domains to create more options, rather than have people think "wow, there are no more URLs left on the internet!"
My advice, if you are worried, is to get the .uk and redirect it into your already established .co.uk; then if you want to swap at a later date you can. If a customer accidentally goes to the wrong domain it helps there too, and as I mentioned it also stops any competition.
You can find more info here -
http://moz.com/ugc/an-seos-guide-to-acquiring-new-gtlds
http://moz.com/learn/seo/domain
Hope it helps & good luck.
-
RE: Moz Local for the UK?
Hi Ben,
First the bad news: Moz Local is not coming to the UK any time soon. It is on a roadmap, meaning it may do at some point, but it takes time for Moz to get all the tech sorted, get partnerships with local directories etc.
Bummer, I know, but you can still do a lot of it manually. If you're feeling lazy there is a paid option similar to Moz here - BrightLocal.com. I don't have any experience with them, but at least you've got the option.
If you're in the "freebies, yay!" frame of mind, why not check out the following blog post -
http://moz.com/blog/free-local-seo-tools
Hope that helps whilst you wait for Moz to get over here to the UK (Roger doesn't swim well, so it might be a bit of a wait).
-
RE: Reason of Keywords ranking up & down?
Panda 4.1 was released in the last couple of weeks and rankings have been going up and down. As well as this, a Penguin update is due this week and some sites are already reporting issues with it, so it may be due to that.
More info here -
https://serps.com/tools/volatility
http://www.rankranger.com/rank-risk-index
Nothing here though -
So expect some fun in the next week!
-
RE: Does the Moz toolset have a monthly reporting function yet? Can't seem to find it in the interface. I'm trialling the software but probably going to move to another service if this isn't available.
Keep your pants on there before you go and jump ship! Did you go to the Reports section and take a look? You will find that you can create monthly reports (there are a few topics on it if you want to do a quick search).
-
RE: Rel next/ prev
First off, I'm assuming you've created the tags correctly with a start and end page. Did you give the pages enough time to get re-crawled?
-
RE: How can i block the below URLs
Take a look over here - http://www.robotstxt.org/robotstxt.html and we can't forget the Moz version - http://moz.com/learn/seo/robotstxt
Alternatively, you can just add a meta noindex tag on the pages, with the added bonus of letting link juice flow better as well as being a bit more stern with robots (I recommend the noindex tag!).
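For the noindex route, it's a one-line meta tag in the <head> of each page you want blocked (a generic sketch; "follow" keeps link juice flowing through the page's links):

```html
<meta name="robots" content="noindex, follow" />
```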
-
RE: Recovering from Black Hat/Negative SEO with a twist
Sounds like fun!
I did write a lovely answer which unfortunately got lost, so I'll summarize a bit below -
1. I wouldn't recommend telling Google, as you might not have a penalty now but you might be tempting Google's wrath.
2. As you've not been marked as malware and you've removed it, you should be fine, but you can always try if you want to sleep better.
3. Disavowing proactively is a great idea, and Google likes this approach too. It also means that rather than hoping Google might ignore the links, it will definitely ignore them with the disavow list. Further to this, I've got two more options for you: you can block wildcard/dynamic pages in your robots.txt, which will help stop Google even getting to them to find the bad links (assuming you don't need the pages for your site); and you can check your referring domains weekly and update the disavow list as well if you're still "under attack".
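As a sketch of the robots.txt wildcard blocking mentioned above (the paths are hypothetical; swap in your own dynamic URLs):

```
User-agent: *
# Block any URL with a query string
Disallow: /*?
# Block a dynamic folder and everything under it
Disallow: /search/
```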
Just a quick heads up: after disavowing the links you may drop down in rankings, as you're removing links; however, there is also a chance you will go up if you're under an algo penalty.
You can find some good tips here too - http://www.searchenginejournal.com/combat-recover-negative-seo-attack-survival-guide/114507/
Hope some of that helps, and I wish I could have posted my original reply, but I don't have the time to rewrite it, I'm afraid. Good luck to you!
-
RE: What is the longest you would go back to ressurrect links that should have been 301's?
...a man can dream...
My favorite is taking the time to explain, followed by silence... not awkward at all!
-
RE: What is the longest you would go back to ressurrect links that should have been 301's?
Really hope that they had a custom 404 page at least!
-
RE: Can you spot the differences?
Ah, sorry, I got the two muddled up! At least hopefully you've got a few more resources to use.
-
RE: Can you spot the differences?
One thing you can check is the following -
It seems that you have not set up rich snippets...
-
RE: Can you spot the differences?
Back it up a bit - have you tried spot the difference on the whole domain?
How about some other factors?
http://www.cambio-gomme.it/marchi/michelin/
Total words: 1527
http://www.gomme-auto.it/pneumatici/michelin
Total words: 1850
There is also a PR difference, robots differences etc. I'm going to be honest here: there is no shortage of things that are different, but I don't recommend looking at your competitors so much, thinking "what are they doing that we're not?". If you do that you're constantly going to be following them, and you want to be leading! (There is no harm in keeping an eye on what they are up to, though.)
You're going to get a headache if you try to work out why Google likes that site more than yours etc.; it could be any number of things. Instead I suggest looking at ways to improve your site so it's better than theirs, be it reviews, comparisons, YouTube reviews etc.
Make your site more unique and better for users, and share it a bit along the way, and you will be surprised how quickly it all falls into place once you start (PPC can be a handy head start too!)
Short version - other tools you can use:
- Majestic SEO
- Ahrefs
- Screaming Frog
- SEO quake
Hope that helps & Good Luck!
-
RE: SOS - I have done a terrible mistake: How can I make it up?
As Ray-PP & Silver Door have mentioned, a 301 is the best option here. You may well have to wait for Google to discover the new links, but fear not: whilst you are waiting you can be proactive, getting in touch with webmasters who have the links on their pages and asking them to update the links. You can also submit some of your more important links to e.g. Twitter to help them get indexed that bit quicker if you're in a real hurry, but I wouldn't spam this; only use it for some of the more important areas.
As for the how-to, it really depends on your CMS. As for which method, currently you're in the same mess whichever tactic you choose, so find the URL structure you like and go for that.
Some more info on redirects can be found here - https://support.google.com/webmasters/answer/93633?hl=en
Good luck.
-
RE: Does adding a backlink in google plus comments work?
One thing it is helpful for is getting things indexed a bit quicker; don't expect it to help you rank, though, unless you are lucky enough to have a good social gathering and are getting good click-through rates etc. Generally, any kind of comment link isn't really good.
-
RE: Getting Traffic to a New Website - Looking for Ideas
A good social media campaign can really help -
-
RE: Hi, I'm looking to find out why a google+ account that was rarely used has 10,000 views. I want to discover what sites it is linked to. I entered the page url but no joy. can anyone help?
Profiles like that tend to be used by spammers etc.; you can tell by the lack of activity and the profile picture being used elsewhere. Heck, it could be getting views from other sources used by spammers, StumbleUpon or Fiverr etc. I wouldn't worry too much about it, it's not a legit profile. On top of this, the view count is only a rough indicator; I also don't know if bots raise the view count every day etc.
Sorry I couldn't get the exact reason, but I'm confident it's not legit.
-
RE: Brand Name Pulling Into Search Results Incorrectly
I saw the BBB and noticed it does mention it, but I couldn't figure out if it was any real cause -
-
RE: Brand Name Pulling Into Search Results Incorrectly
Very strange indeed. Have you tried a META NOODP tag? It can help if Google decides it likes DMOZ more; you don't seem to have a DMOZ link, but it's handy. You can also implement some rich snippets - https://support.google.com/webmasters/answer/99170?hl=en - to help Google know e.g. the name of your site etc.
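For reference, the NOODP tag is a one-liner in the <head>:

```html
<!-- Tell search engines not to use the DMOZ/ODP description for this page -->
<meta name="robots" content="noodp" />
```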
Hope that helps; sorry I couldn't get to the bottom of it, though another clever person here might be able to.
Good luck
-
RE: Hi, I'm looking to find out why a google+ account that was rarely used has 10,000 views. I want to discover what sites it is linked to. I entered the page url but no joy. can anyone help?
I believe you are trying to work out the views ( http://imgur.com/XXiyLxI ). Those views do not mean that it has a lot of links; it could mean it's been shared once or is old and thus viewed a lot. Unfortunately, you won't be able to find out much about the views, but you can find out how that account has been linked to by entering its URL, e.g. https://plus.google.com/+SEOmoz, into OSE, Majestic SEO or Ahrefs, and ta-da! You can find the links, but links have no real correlation with views.
You can however use this site to find out shares etc. http://www.sharedcount.com/ e.g. http://www.sharedcount.com/#url=https%3A%2F%2Fplus.google.com%2F%2BSEOmoz%2Fposts
Hope that helps; if not, let us know what you need and I'm sure we can help.
-
RE: No Google Analytics code on page BUT reporting is active
FYI, an easier way is the following tool - http://spyonweb.com/ - it will list the domains sharing the GA code.
-
RE: Moz Private Message Restriction
If I were to take a stab in the dark, I would say most users do not need to send out more than two private messages a day. Many users will have their contact details (or at least a website) on their profiles; I recommend exchanging email addresses.
-
RE: No Google Analytics code on page BUT reporting is active
Have you gone to your GA Admin > Tracking Info > Tracking Code? From there it will tell you if it's working or not, and it will also show you the code. If it's working, then somewhere on your site is the code it's tracking.
Did you set it up? Can you also ask whoever did and find out how it was set up too?
-
RE: Open Site Explorer only showing 10 internal links, and 270 external links
Did you remember to change your filter from "this page" to "Pages on this root domain" ?
-
RE: Has Google Authorship been completely removed from SERPs?
I'm afraid it stopped way back at the end of August. You can see it here - http://algoroo.com/ & here http://searchengineland.com/goodbye-google-authorship-201975
and finally here - https://support.google.com/webmasters/answer/6083347
John Mueller said -
"I’ve been involved since we first started testing authorship markup and displaying it in search results. We’ve gotten lots of useful feedback from all kinds of webmasters and users, and we’ve tweaked, updated, and honed recognition and displaying of authorship information. Unfortunately, we’ve also observed that this information isn’t as useful to our users as we’d hoped, and can even distract from those results. With this in mind, we’ve made the difficult decision to stop showing authorship in search results."
Author rank is still a thing, though, so it's still worth using your G+.
(Just to edit: the date was 28th August 2014.)