This will be an issue: you are linking to non-existent pages, and Google will continue reporting 404 errors in G-WMT.
It would be worth your time to do it accurately one channel at a time.
Greg
Agreed. I have just run the tool and it's identified a bunch of links we should consider removing. I am impressed, as it has saved a lot of analytics work identifying these dodgy links.
Hi Craig,
I would suggest looking for guest posting opportunities with webmasters in your niche (or similar niche) rather than publishing on Article/Blog directories.
1.) Only publish your articles in one place; don't try to mass-submit the same article to many websites.
2.) You can publish the article on your website and then syndicate it on other websites, but this isn't the best idea either. If you decide to guest post, the webmasters usually check whether the content is original; if not, they won't accept it. Even if they do accept it, the link pointing to the original article on your website is merely a "reference" rather than an endorsement from one site to the next.
Hope this makes sense?
http://www.seomoz.org/seo-toolbar
If you install that toolbar, you can check to see if links are follow or no follow.
Social Media is correct. I never checked the site, but the links are marked as no-follow, so it won't help with your rankings.
If you want to publish an article, I would suggest asking a webmaster in your niche to publish it on their website in the form of a guest post.
Do some research here for info on guest posting. Once you have a good understanding, type this in Google for guest posting opportunities.
"Keyword"+“Guest bloggers wanted” OR “guest blogger wanted”
Hi Craig
Keep it structured in a silo by displaying your parents > children > grandchildren (and cousins), and try not to include exact matches of child or parent keywords in the product pages' meta tags.
An example would be.
website.com/shoes/ (this is the parent and lists all the children)
website.com/shoes/blue-shoes/ (this lists all the grandchildren)
website.com/shoes/blue-shoes/red-laces/ (this is the end of the hierarchy, but you can list related products: tap dance shoes, etc.)
Back to your question...
You want to keep it structured as best you can to keep Google happy, so try your best not to "cannibalise" your pages (more than one page targeting the same keywords).
In your scenario, http://www.towelsrus.co.uk/kitchen-oven-gloves/aztex/barbeque-boss-double-oven-glove-black_ct389bd182pd1968.htm is the end of the line (the grandchild), so you don't want this page ranking for generic terms like "black oven gloves" (the page before this should rank for that).
Search for keywords specific to the product. If you can't find any, just use the product name, and/or guess what people would search for when looking for that specific product, rather than using a generic term, as that will cause more hassle than good.
Make a note of the keywords you find that fit the parent and child pages, and optimize those pages for the more generic terms.
Make Sense?
Greg
Hi Craig
Simply put, yes you will get the "trust" signal by getting a link on that website.
However, because the page you create has very little authority, there is very little "link juice" that passes on to your website (the stronger the page, the more ranking power you receive).
Get the best of both: publish the article, and then build a few backlinks to your article. This will improve the article page's authority (PageRank) and will in turn filter it into your website.
Just make sure the article is not specifically about the products you sell, or it may outrank your website when you promote the article.
Greg
Hi Mathew.
Cemper.com tools are amazing (Link Research Tools)... the most advanced link building tools out there, in my opinion.
Chris Cemper has been working on the Link Detox for a while now, and has explained how to manually find the suspicious links without using the Detox tool, but the tool just makes it easier for us.
I have not tried it yet (it was only released yesterday), but his advice on identifying link networks and other dodgy links is invaluable.
For the price of a tweet, it's well worth giving it a try.
Greg
Yes, these pages are seen as duplicates.
Yes, Redirect B to A
You could also create a rewrite rule to redirect all URLs without a trailing slash to their trailing-slash versions, just for peace of mind that you won't have to do this again.
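If the site runs on Apache with mod_rewrite available (an assumption on my part), a sketch of such a rule in .htaccess might look like this:

```apache
RewriteEngine On
# Only rewrite requests that aren't existing files or directories
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# 301 any URL that doesn't already end in a slash to its trailing-slash version
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```

Test it on a staging copy first; a blanket rule like this will also rewrite paths you may not want touched (URLs with file extensions, for example).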
Greg
Hi Ben.
Now, more than ever, Google is focusing on brands rather than standard "keyword-rich" websites.
Brands offer trust more than anything and are by far the best way to go if you plan on building a long-lasting online business.
"Budget Car Rental" is a real brand, but it's also a money-making term. Google has clearly recognised the term as a brand, which is great, and if your client can do the same, then no harm done.
It gets manipulative when your brand is "ABC car rental" and your website is "Budget car rental.com"
The only reason exact match domains are still ranking, I presume, is because Google is still tied up in identifying which are brands and which exist solely for ranking benefits. Google's not stupid, and things are changing fast.
I guess the best way to explain it to your client is to say:
For short-term ranks with no guarantee of future success, choose a URL with the sole objective of manipulating Google into ranking you well.
Or
Choose a brand name and build trust that will last for years to come.
If they choose to go the manipulative route, then at least you have done all you can to educate them, leaving you with a clear conscience for when they regret the decision.
Just my 2c worth...
Greg
No no...
Don't redirect all your links; just do it once, to one URL.
Redirect jonnyt.me/index.html to jonnyt.me.
Once you have redirected, jonnyt.me/index.html won't load; instead, it will redirect to jonnyt.me.
If you know of any links pointing to jonnyt.me/index.html, change them to point to jonnyt.me.
If you can't find any, don't worry about it: when Google or a user visits jonnyt.me/index.html, they will be taken to the correct page.
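On Apache, for example (assuming .htaccess with mod_rewrite is available on your host), the redirect could be sketched like this:

```apache
RewriteEngine On
# Match only external requests for /index.html; checking THE_REQUEST avoids
# a redirect loop when DirectoryIndex serves index.html internally for /
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ http://jonnyt.me/ [L,R=301]
```

Other servers and CMSs have their own equivalents; the key point is a single 301 from /index.html to the root URL.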
I hope this makes more sense.
Greg
That makes it even more complicated... Well spotted Martijn
Based on where the majority of your links are currently pointing, you need to choose which to go with and redirect the other.
These are duplications which you need to sort out. (I suggest redirecting jonnyt.me/index.html to jonnyt.me/)
Greg
Hi
You do in fact have internal followed links; it's just a matter of changing the "pages to" field in OSE.
Don't ask me why, but when choosing "to this page", nothing gets displayed. When you choose "pages on this root domain", all internal links are displayed.
The selection that works includes all variations of URLs that point to jonnyt.me, and not just URLs pointing to jonnyt.me, if that makes any sense.
Show > ALL > from > ONLY INTERNAL > to > PAGES ON THIS ROOT DOMAIN >
Greg
Thanks, but more specifically, I want to receive the "unnatural links to your site" type messages that I think are not being disclosed by the original administrator.
Otherwise, I have access to crawl errors etc.
Thanks Rob
Greg
After being added as an administrator of a G-WMT profile, I see no messages from Google in the profile; however, the first administrator has a number of messages.
Is it possible to request that Google resends these emails so that I get them as well?
I have looked around and assume it's not possible, but I was wondering if someone had experienced this before and knew of a workaround.
Greg
I second what Maximise has said.
Going forward, start promoting your website's brand rather than its keywords. Concentrate on obtaining quality links with the anchor "GMR Transcription Services" or simply GMR transcription services.com.
Have a look at your backlink profile and remove site-wide links, or at least vary the anchor text where you can.
Your backlink profile should ideally be made up of about 70% branded terms and 30% "money terms", otherwise it looks unnatural to Google. (Generally speaking; it's not an exact science.)
Greg
I haven't noticed any problems on my side....
In terms of SEO, any CMS can be optimised for SEO/CRO benefits, so it essentially boils down to the size, flexibility, and style of the website you want to set up.
In my experience, ExpressionEngine is great if you have developers and graphic designers working on your site (which would be built from the ground up, custom templates, etc.).
If you have standard bloggers with little HTML/programming/graphic design skill, WordPress is the way to go, as it's the most user-friendly platform in my opinion (free and paid templates are available for download).
As mentioned, any CMS can incorporate SEO/CRO best practices; it comes down to the type of website and the skills you have at your disposal.
Diane, try contacting the author and request that they reference the original source, and warn them that if they don't, you will report them to Google.
You can file a DMCA request with Google to remove pages that are stealing your content.
Read here for more info: http://www.ecreativeim.com/blog/2011/12/report-stolen-content-to-google/
Dana, I just want to chip in on Point 1..
Ignoring your 404 pages is fine, but make sure to remove all links on your website pointing to the 404 pages.
When Googlebot crawls your pages and finds a link to a 404 page, it doesn't care if the 404 page is optimised for users; it's still a 404, so Googlebot won't be happy. With these 404s you also get duplicate content/titles and meta descriptions, as there are many of them.
In other words, be sure to remove all links pointing to 404 pages. Link Sleuth makes this easy.
Greg
Unless you're talking about the "show links ungrouped" option...
This will show the top 25 links per domain, and your own domain always comes up first.
Greg
Incorrect!
The "Top Pages" tab at the top just points out the most valuable and strongest pages of your website..
This tab doesnt report on links, it just reports on your best pages.
Greg
I agree with Alice.
I ignore most link exchange requests; however, I do find the odd website with great authority, and rather than simply exchanging links on links pages, I try to discuss and negotiate exchanging links via guest articles or editing an existing (relevant) piece of content to include our link.
This doesn't happen often, but it's sometimes worth analyzing each request to find the gems (real webmasters promoting their REAL websites, and not some agency creating websites just for link exchange).
Greg
Hi,
You do need to be a Pro member to see/download all incoming backlinks.
I checked and found 10,000 incoming links to www.comm100.com/livechat.
I'm guessing you want to change the link to point to http://livechat.comm100.com/index1.aspx?utm_expid=19023959-0?
If this is the case, the link juice will still flow via the redirect (not as much, but the value from these links is still being counted).
Greg
Yes, it does affect your rankings.
I can't say how much, but it is definitely a factor that Google considers when ranking your pages.
Make sure you are using a reliable host, as Google isn't the only one that gets mad when pages become unavailable.
Greg
It depends on the content, in my opinion.
It would be ideal if you could cover a topic and conclude within a maximum of 2,000 words, but if you can't, I would break it up into parts or chapters that you could publish over a few days.
There are pros and cons to both, but in my opinion, keep it to a maximum of 2,000 words per page.
You only need to create one campaign for the website.
Rogerbot will crawl all your product pages and report on any technical issues you have with them.
When checking for ranks, SEOmoz checks the pages that are ranking as well as your domain, so you don't have to assign different keywords to different pages in your campaign settings.
Don't worry about creating multiple campaigns; just load your website URL, insert the keywords for all your product pages, and wait for a report at the end of the week.
Greg
There's nothing wrong with legitimately linking to relevant websites (yours or not).
SEOmoz, for example, links to all of their properties from here.
The only thing to look out for is site-wide links, or more links than would be regarded as "natural".
Also, make sure your websites are all on unique C-class IPs, or Google won't value the links from your other sites as much.
If 10 sites share the same C-class IP, they are most likely owned by the same person, therefore Google places less value on them.
Greg
EGOL, just to clarify...
With lazy loading and displaying only 20 comments, more comments get displayed as you scroll down, rather than having the page load all 3,000 comments at once.
In other words, the comments won't be hidden, just tucked away and loaded as needed when scrolling down the page.
http://whatis.techtarget.com/definition/lazy-loading-dynamic-function-loading
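As a rough sketch (the `.comment` selector and the hidden-class approach are my assumptions; adapt to your own markup), all 3,000 comments can sit in the page's HTML for the bots while JavaScript reveals them 20 at a time as the user scrolls:

```javascript
// All comments are present in the HTML (visible to bots); the first 20 are
// shown and the rest are hidden with CSS, revealed in batches on scroll.
const BATCH_SIZE = 20;

// Pure helper: given how many comments are already shown and the total,
// return the index range [start, end) of the next batch to reveal.
function nextBatch(shown, total, batchSize = BATCH_SIZE) {
  return { start: shown, end: Math.min(shown + batchSize, total) };
}

// Browser wiring (hypothetical markup: each comment is a .comment element):
// const comments = document.querySelectorAll('.comment');
// let shown = BATCH_SIZE; // first batch visible, the rest hidden via CSS
// window.addEventListener('scroll', () => {
//   const nearBottom =
//     window.innerHeight + window.scrollY >= document.body.offsetHeight - 200;
//   if (nearBottom && shown < comments.length) {
//     const { start, end } = nextBatch(shown, comments.length);
//     for (let i = start; i < end; i++) comments[i].classList.remove('hidden');
//     shown = end;
//   }
// });
```

Because the full comment markup is served in the initial HTML, bots see everything while users only render what they scroll to.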
Greg
I would go with your first point.
The more content on the page, the better. Even better is user-generated content!
Perhaps, for user experience, display only 20 comments and wrap the rest under "lazy loading" (a suggestion from the developer sitting next to me).
In other words, let the bots see all 3,000 comments on the same page, but for user experience, so the page doesn't take days to load, incorporate the "lazy loading" feature.
GREG
Creating links on different websites is the key. Submitting multiple articles to the same websites/directories won't have as much of an effect as submitting to other websites.
I have also heard that links in article directories and Web 2.0 sites are devalued by Google, as they are easily "created" and not "earned".
If I were you, I would search Google for websites looking for guest authors and submit your article to those websites.
Search this in Google and you should get a few websites to submit your articles to:
"keyword"+“Guest bloggers wanted” OR “guest blogger wanted”
Greg
Open the links up and have a look at the page.
Is it relevant to your website?
Do the other links on the page point to unrelated websites?
Install the Mozbar on your browser and have a look at the Domain / Page Authority with the number of links pointing to the website.
I wouldn't worry about it if they appeared naturally. A link is a link; although it may not be as valuable as a link from an authoritative website, it's still a link you never had before.
I would only worry about removing them if the website is clearly a "spam" website; otherwise, don't stress, and keep getting links on relevant websites.
Greg
Yip, that's what I thought. Thanks, David.
I don't feel confident about doing this, so I will try to convince the powers that be to agree.
The advice to do this was given by a reputable SEO, so if convincing the PTB that it's not such a great idea doesn't work, we'll just test this process with a few pages of less importance and see what happens.
Thanks again..
Greg
We are currently in the process of creating new pages for keywords very similar to our main landing pages.
Example:
Page 1.) "Family Holiday in {Location}"
Page 2.) "Family Vacation in {Location}"
Page 3.) "Family get-away in {location}"
All 3 keywords are currently optimised for the 1st page, "Family Holidays in {Location}", but I was wondering if there is any benefit in creating sub-pages for these alternative keywords.
I personally don't think we should do this, as Google will know to rank our main page for secondary/synonym keywords via on-page SEO and links with anchor text, but can you foresee any negative aspects if we went ahead and created these child pages, linked to from the top-level keyword's page?
Or do we continue optimising our main landing page for related/synonym keywords?
Thanks in Advance
GREG
The more linking root domains you have, the higher your authority grows.
Obtaining 100 links from the same website is nowhere near as valuable as 100 links from 100 different websites. The latter will improve your Domain Authority.
Greg
I suggest you state your case in a Google Reconsideration request.
Submit all the URLs you removed and detail all the steps you took to fix up the link profile and on-page errors.
Once you've done that, don't spend your time worrying. Start outreaching and get some authoritative links back to your website with your brand name. Just keep at it until you start recovering.
Good luck!
It's pretty big.
Over 1,000 pages in the index, and many more internal URLs to crawl that have a no-index tag (booking forms etc.).
I'll see if we can archive our other campaigns and let Roger crawl our main site properly.
If you're going to 301 the URLs, be sure to 404 them first, otherwise you won't be able to send a removal request in G-WMT.
To remove a page or image, you must do one of the following:
GREG
Hi,
What's a paywall?
If it's a do-follow link, and the page is indexed by Google (not hidden from Google), then it will pass value.
GREG
Hi,
These pages exist and are exact duplicates according to Google. You said you stopped generating these URLs over a year ago?
Why not just 404 all the children URLs?
Once done, you can either request that they be removed from the index, wait for them to disappear, or 301 them if the children URLs are still getting traffic.
Just make sure that you have no internal links pointing to any of the children URLs, and Google should forget about them.
Once you're done with all the above, you can mark the errors as fixed, and Google should stop reporting them.
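On Apache, for example, a single rule can 404 them all (the /children/ path pattern here is purely hypothetical; substitute whatever pattern your generated child URLs actually follow):

```apache
# Serve a 404 for everything under the retired child-URL pattern
# (hypothetical path; adjust to match your real URLs)
RedirectMatch 404 ^/children/
```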
GREG
The plugin just makes filling out the meta titles and descriptions easier, and provides more technical SEO options for your pages/posts.
With regards to your archive pages, it won't change any meta or URL info on existing posts; it will just update the meta and title etc. when you edit the page.
I use Yoast on our WP sites; it makes editing pages for SEO benefits easy.
GREG
Thanks Nakul,
I do a weekly scan with Xenu, which doesn't have a URL limit like SF.
I was under the impression a full scan of the site was done each week, but as you say, it's being scanned in chunks, divided across our 3 other websites.
If this is the case, it would be great to let Mozbot know where to crawl, to avoid unnecessary resources being used up when it could be scanning our most important pages.
Greg
Hi All,
Rogerbot has been reporting errors on our websites for over a year now, and we correct the issues as soon as they are reported.
However, I have 2 questions regarding the recent crawl report we got on the 8th.
1.) Pages with a "no-index" tag are being crawled by roger and are being reported as duplicate page content errors. I can ignore these as google doesnt see these pages, but surely roger should ignore pages with "no-index" instructions as well? Also, these errors wont go away in our campaign until Roger ignores the URL's.
2.) What bugs me most is that resource pages that have been around for about 6 months have only just been reported as being duplicate content. Our weekly crawls have never picked up these resources pages as being a problem, why now all of a sudden? (Makes me wonder how extensive each crawl is?)
Anyone else had a similar problem?
Regards
GREG
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU)
But what if you have lots and lots of JS and you don't want to waste precious crawl resources?
Also, as we update and improve the javascript on our site, we iterate the version number ?v=1.1... 1.2... 1.3... etc.
And the legacy versions show up in Google Webmaster Tools as 404s. For example:
http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the js folder altogether?
Isn't that what robots.txt was made for?
Just to be clear - we are NOT doing any sneaky redirects or other dodgy javascript hacks.
We're just trying to power our content and UX elegantly with javascript.
What do you guys say:
Obey Matt? Or run the javascript gauntlet?
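For reference, blocking the folder would be a one-line robots.txt rule (using the /js/ path from the URLs above):

```
User-agent: *
Disallow: /js/
```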
Thank you Tommy,
I will keep at it until it changes.
Thanks for the advice.
Regards
Greg
Hi All,
Out of all our keywords, there are 3 that are showing our home page in the SERPs rather than the specific product page URL on Google.co.za (Google.com ranks the correct URL).
I'm not sure why this is happening, as most links built using the anchor text are pointing to the correct page. Why would Google prefer ranking our home page in local search and rank the correct page on Google.com? (Only 3 keywords have this problem.)
I have tried to correct this by creating links from strong internal pages with anchor text pointing to the correct URL. I have also concentrated on building links from .co.za domains using the anchor text and correct URL, but to no avail.
It has been 2 weeks now since I tried to sort it out, but I'm not sure what else I can do to tell Google to rank the correct page.
Any ideas?
Regards
Greg
As stated in the question, we have 2 sub-domains that contain over 2,000 reported errors from SEOmoz.
The root domain has a clean bill of health, and I was just wondering if these errors on the sub-domains could have a negative effect on the root domain in the eyes of Google.
Your comments will be appreciated.
Regards
Greg
Thanks chaps for your great responses. I will probably go for a combination of option A and option C. If I can justify redirecting to the home page or another page on the new site I will. Otherwise I'll let the page die a 404 death (rather than daisy-chain redirecting it).
Hi,
I have an old WordPress website with about 300-400 original pages of content on it, all relating to my company's industry: travel in Africa. It's a legitimate site with travel stories, photos, advice, etc. Nothing spammy about it. No adverts on it. No affiliates.
The site hasn't been updated for a couple of years, and we no longer have a need for it. Many of the stories on it are quite out of date.
The site has built up a modest mozRank value over the last 5 years, and has a few hundred organically achieved inbound links.
Recently I set up a swanky new branded website on ExpressionEngine on a new domain.
My intention is to:
Sounds good, right?
But there is one issue I need some advice on...
The old site has about 100 pages that do not have a good match on the new site. These pages are outdated or inferior quality, so it doesn't really make sense to rewrite them and put them on the new site.
I call these my "black sheep pages".
So... for these "black sheep pages", should I (A) redirect the urls to the new site's homepage, (B) redirect the urls to the old site's home page (which, in turn, redirects to the new site's homepage), or (C) not redirect the urls, and let them die a lonely 404 death?
OPTION A:
oldsite.com/page1.php -> newsite.com
oldsite.com/page2.php -> newsite.com
oldsite.com/page3.php -> newsite.com
oldsite.com/page4.php -> newsite.com
oldsite.com/page5.php -> newsite.com
oldsite.com -> newsite.com
OPTION B:
oldsite.com/page1.php -> oldsite.com
oldsite.com/page2.php -> oldsite.com
oldsite.com/page3.php -> oldsite.com
oldsite.com/page4.php -> oldsite.com
oldsite.com/page5.php -> oldsite.com
oldsite.com -> newsite.com
OPTION C:
oldsite.com/page1.php : do not redirect, let page 404 and disappear forever
oldsite.com/page2.php : do not redirect, let page 404 and disappear forever
oldsite.com/page3.php : do not redirect, let page 404 and disappear forever
oldsite.com/page4.php : do not redirect, let page 404 and disappear forever
oldsite.com/page5.php : do not redirect, let page 404 and disappear forever
oldsite.com -> newsite.com
My intuition tells me that Option A would pass the most "link juice" to my new site, but I am concerned that it could also be seen by Google as a spammy redirect technique.
What would you do?
Help
If you are competing with this guy, don't replicate what he has done. Prove to Google that your site is more popular by getting even better quality links. Google won't have a choice but to rank yours if your link profile is more authoritative.
GREG