Thanks for the quick reply Martijn,
I will 301 these back to the homepage. Just strange that Google is reporting these when they do not exist anywhere on the site.
We have recently launched a new responsive website for a client and have noticed 2 "Not Found" errors within Google Search Console for /mobile and /m
Neither of these URLs is linked from anywhere within the site; however, Google is reporting them as being linked from the homepage.
This is not the first site on which we have seen Google report this error; however, the other site was not a mobile-friendly site.
My thought is to 301 them back to the homepage. Does anybody else have any thoughts on this, or has anyone recently received the same errors?
To clarify
It's very difficult to obtain good links these days; generally they come as the result of creating good content, being a good brand, and a lot of hard work.
My suggestion would be to evaluate the sites you are dealing with, see what they are doing that interests people and what they are doing differently, and try to exploit those areas. Promote that content through social channels, e-shots, good old-fashioned word of mouth, etc.
Hopefully if the content is something that people find interesting, they will link to it, retweet and share it.
Admittedly, this is a simplistic overview, but it's the basis of what you should be doing.
If you are looking for further info, there is a ton of resources on Moz and other sites regarding content marketing, promoting content, etc.
Hope that helps and good luck
Justin
If I hear the words Fiverr mentioned in a conversation regarding SEO, I run.
Don't do it....
Hi
It depends on your preferred domain going forward. In truth, 25 linking root domains isn't enough to be concerned about (unless they are from very high-authority sites), so I would choose the variant which works best for you and 301 to that version.
Going forward you should build new links which point to the chosen variant (www or non-www).
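If your site runs on Apache, a minimal .htaccess sketch for forcing the www variant might look like the following (domain.com is a placeholder for your own domain):

# Force the www variant with a permanent (301) redirect via mod_rewrite
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]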
Justin
We are going to (again) request they switch to Universal Analytics, although further tests show that both the google-analytics.com and stats.g.doubleclick.net events are registering in GA.
Therefore I'm more confused than ever as to the huge discrepancies between Taboola and GA traffic; the same sort of discrepancy also shows for referrers such as Facebook.
Any further suggestions/troubleshooting gratefully received!
The key thing here is whether the space is shown in the search results.
I think it's unlikely that it would cause any SEO issues; however, if the space shows in Google results, there is a high chance it would affect your click-throughs, as extra spaces, weird characters, etc. in page titles tend to suggest that the site is less trustworthy (in my opinion).
So if the space is visible in your page title and/or search results, get it sorted. If it is just tabs and line returns in the source code, you don't need to worry about it too much.
Hope that helps.
Hey Mozzers
Recently a client of ours undertook a paid campaign using Taboola, and we are seeing a vast difference between reported clicks and the visitors reported by GA.
On further investigation, the _utm.gif event is sent to google-analytics.com 50% of the time and stats.g.doubleclick.net the remaining 50% of the time.
Is this likely to be the cause of the discrepancy, and if so, is it possible to resolve it without removing the remarketing tag (which seems to be responsible for stats.g.doubleclick.net)?
Many thanks
Justin
I don't think the backlink profile to that page looks particularly natural, and I think that's where your problem lies.
OSE shows 19 backlinks and Ahrefs shows 30, but looking through them, you have two infographics (with no social shares), a couple of links from .ac.uk domains (which are good links), directories, and a number of very questionable-looking links such as www.librerio.com/search/jobs/40
You also have some exact match anchor text
All of the above suggests to me that you have been hit by Panda/Penguin. The loss of links mentioned previously most likely means you had questionable links that have since been removed, or the sites they were coming from have been shut down (again, a good indicator of poor backlinks).
To me, it looks like you are going to have to do some work on building some good links and adding a bit of content to your site to get your rankings back
Also, you are on an exact-match domain, "graduate-jobs.com", which may be contributing.
I'm sure it's not what you want to hear, but I think that's where the problem most likely lies.
I hope that helps
Hi John
It appears that what you are seeing in Open Site Explorer is "1 - 80 inbound links from XXX domains". Is that correct?
If this is the case, you may need to change the filter in the third dropdown from "this page" to "pages on this subdomain", then click Filter.
Now when you look at the links it should say "1-50 of #### external links" and should roughly tie up with the competitive domain analysis links.
Let me know if you get a problem.
Justin
What you are looking to do is reasonably commonplace.
Websites that contain blogs, download sections, etc. are often hosted on separate web servers (and IPs), so what you are looking at doing should be fine.
You seem to have the major SEO consideration covered, which is to use subdirectories on the existing domain (www.oldsiteexample.com/marketing). That way the new part of the site will benefit from the authority/rank of the old site, and any authority generated by the new section will be passed in part back to www.oldsiteexample.com, further boosting its authority.
If, however, you use subdomains (marketing.oldsiteexample.com), you would pretty much be starting your SEO efforts for this part of the site from scratch; furthermore, very little (if any) authority generated by this part of the site would be passed back to www.oldsiteexample.com
I hope that helps
You could probably do something manually on this, although I have never tried it to be honest.
I tend to use the SEOmoz reports as they are easy, insightful and well formatted.
I would think the best way to achieve what you are after is to export the site rankings to CSV and use Excel: set up conditional formatting to highlight a row where the label is equal to "top keywords" or whatever label you use.
If you get any problems let me know and I will send you a quick example in Excel. The only downside is that you will have to run this report manually, as there is no way of automating this info as far as I know.
Hi Daniel
It is possible to add labels to keywords within the SEOmoz PRO App, so you could add in the label "Top Keywords", but as far as I am aware there is no way to specifically highlight keywords or order the report by label.
You can order by top keywords, ranking improvements, declines etc but nothing that would give you the specific report you are after.
It would be a nice feature though; maybe it's worth suggesting through "request a feature".
https://seomoz.zendesk.com/forums/293194-seomoz-PRO-feature-requests
To add to Mike's answer
2: If the page is deleted and isn't coming back, you may want to 301 it to its new equivalent or possibly even return a 410 status code to tell search engines the page has been permanently removed.
For more info on Status codes see the following article
http://www.seomoz.org/learn-seo/http-status-codes
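For example, on an Apache server both cases can be handled in .htaccess along these lines (the paths below are hypothetical):

# 301: the page has moved - passes visitors and most link equity to the new equivalent
Redirect 301 /old-page.html /new-page.html
# 410: the page is gone for good - tells search engines to drop it from the index
Redirect gone /deleted-page.html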
That's interesting, I've not noticed anything on any of my campaigns.
Out of curiosity, does the list of duplicate URLs give any clues as to what may be causing the duplicate pages error? Could Moz have updated their crawler to be case sensitive or something along those lines?
Agree with Tom's comments.
If you want to tidy up this error, go through the site and make the links consistent (i.e. change http://domain.com/page to http://domain.com/page/ throughout) and that should solve the problem. If fixing every internal link isn't practical, a server-side rewrite can enforce the trailing slash for you; see the sketch below.
This is particularly worthwhile if you share the reports with your customers, as customers hate seeing errors.
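A rough .htaccess sketch for Apache, assuming extensionless URLs and that real files should be left alone:

# 301 any URL that doesn't end in a slash (and isn't a real file) to its trailing-slash version
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]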
Hope that helps
Agreed; the point I was intending to make was that the WMT backlink info is not a definitive list, and therefore it is not a reliable resource for all backlinks.
I think it is a common misconception that because WMT is a Google product, the information in it must be 100% accurate, whereas in reality that isn't the case.
Good point about the Majestic Fresh index, btw
I've never found WMT a reliable resource for backlink info; it often does not show all backlinks, and when backlinks are removed it takes some time for them to disappear.
How recently were the Yellow Pages and Citysearch links added?
Definitely run the site through OSE, Majestic and Ahrefs for a better picture, as William suggested.
A couple of other things you could check
Go to WMT -> Health -> Crawl Errors to see if any other pages are not found
Check how many pages are being indexed to make sure there are no sitemap issues etc
Hope that helps
Hi Gerd
Thanks for the reply and suggestions.
My biggest concern is not with the new site; I am more concerned with the impact this might have on the existing site.
My logic is that if we rel=canonical any duplicate content back to the existing site, we would minimize or even eliminate any impact.
Justin
Would re-skinning and duplicating an existing e-commerce website under a new domain name cause any ranking issues?
The plan would be that all product data, pricing info, etc. would be identical; the site would have a minor redesign to change colours, logos, etc., and all duplicate content would be rel=canonicaled to the original site.
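As a minimal sketch of that plan, each page of the new site would carry a cross-domain canonical in its head pointing at the original (both domains and the path below are placeholders):

<!-- In the <head> of the duplicate page on the new brand's domain -->
<link rel="canonical" href="http://www.originalsite.com/product-x" />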
In case you are wondering, the reason for this is that a customer with an existing site wants to try out a new brand without incurring massive development costs.
The majority of traffic would be driven through Google Shopping, a bit of PPC, social, etc.
Is this site duplication likely to harm the original site, or will setting up rel=canonical to point to the original site be sufficient to prevent this happening?
Is there anything else I should consider?
Many thanks for your help
Hi Virginia
I don't believe the URLs on the on-page report are embedded; at least they aren't when I run them.
I would imagine that your PDF viewer is recognising URLs and then making them clickable, so my guess would be that your PDF viewer is causing the problem.
Acrobat is a reader that does recognise URLs; I have just tested with Acrobat Pro and Acrobat Reader on Mac, and the URLs I tried clicked through correctly.
What are you using to view the PDF?
Justin
Hi
It definitely makes sense to clean up your domain list; an extensive domain list is often time-consuming to maintain, and frequently the majority of the domains deliver little or no value.
I would personally take the following approach:
1. Is the domain worth having and does it represent any value to the business?
A: yes - keep it
B: no - go to question 2
2. Does the domain offer any benefit to the site in terms of link juice/domain auth/page auth etc.? (I would check this through Open Site Explorer, Majestic and Ahrefs if you have access to them)
Yes - Keep it
No - go to question 3
3. Is the domain potentially harmful to the site?
.xxx domains are indexed by Google; enter "site:*.xxx" into the Google search bar and you will see plenty of sites. My concern would be that .xxx sites are targeted at the porn industry; if your website has no relation to that industry, Google could decide to penalise you for this in the future, so I would be very tempted to ditch/park all the .xxx domains unless there is a good reason for keeping them.
Ultimately, any decision to ditch domains should be taken with caution, as once you lose a domain, you can't get it back cheaply or easily.
I hope that helps
Justin
What is the server error you are getting with the .htaccess file?
You do indeed need to verify ownership of your site; it's really easy to do and involves adding a small piece of code to your site. See http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35179
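The meta-tag method, for instance, is a single line in the head of your homepage (the content token below is a placeholder; Google generates the real one for you):

<!-- Google site verification tag - goes in the <head> of your homepage -->
<meta name="google-site-verification" content="your-token-from-google" />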
That solution works just fine.
It looks like the one from the SEOmoz "Redirection SEO Best Practices"
I've never had any problems with it.
Also, to keep things tidy, be sure to tell Google Webmaster Tools that your preferred domain is www.domain.com
GWMT -> Configuration -> Settings -> Display URLs as www.domain.com
Totally agree with Igor
Definitely use a subfolder if the intention is to gain backlinks
You might want to check out a Whiteboard Friday that Rand did a while ago that covers the merits of using subdomains and subfolders.
http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
If I understand the question correctly....
Add in a rel=canonical tag to the page which your 'home' tab links to.
I would guess that for this to show as duplicate content, you may have some subtle difference in the URL you are linking to, e.g. www.domain.com, www.domain.com/, www.domain.com/index.html, etc.
Adding in rel=canonical will solve a whole host of issues when linking to the same page using different URLs.
See the SEOmoz best practices guide for a full explanation and examples of how to implement rel=canonical
http://www.seomoz.org/learn-seo/canonicalization
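As a minimal sketch, assuming www.domain.com/ is your preferred version of the homepage, each variant (including index.html) would carry:

<!-- In the <head> of the homepage and any of its URL variants -->
<link rel="canonical" href="http://www.domain.com/" />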
Hope that helps
Justin
It is best to use domain.com/blog, as the link juice and authority picked up by your blog will be passed in part to the root domain, thereby increasing your domain authority, etc.
If you take the subdomain route, the link juice and authority don't get passed, and the blog is treated as an independent site by Google.
Therefore I'd go domain.com/blog every day of the week.
It looks to me as if it is located in the new products panel on the left hand side towards the bottom of the page.
You are effectively linking the page to itself with exact match anchor text "Luxury Bath Pillows with Suction Cups, 4 embroidered designs"
I would assume that is what is causing the problem.
Yes, there is some truth in what your designer says: adding any additional images and media to a page will slow the load time, but that doesn't mean you shouldn't do it.
My advice to you would be this: if adding a slider to an internal page enhances the user experience (UX), then add it to the page; a well-optimised image slider will not impact the load time of a page noticeably to a user.
Once your site is designed, you may want to consider running it through the Google PageSpeed tool; it will score your page and generate suggestions to help you make your page faster.
https://developers.google.com/speed/pagespeed/insights
Domain authority is one of the most difficult metrics to influence.
Good quality inbound links from sites with good domain authority and mozTrust are one of the biggest factors in increasing domain auth, so you should consider the following....
1. Build good content. I'm sure you have read it a thousand times before on here and many other forums, but good quality content and being active through social channels are by far the best and most sustainable ways of building inbound links and ultimately influencing domain auth.
2. Ensure you are listed with the "good" directories such as DMOZ, Yelp, Yahoo, etc. These are nowhere near as powerful in terms of link juice as they once were, but you should still be on them.
3. Embark on a traditional link-building strategy, but ensure you don't build spammy links and that you have anchor-text diversity (i.e. don't try to keyword-stuff anchor text).
4. Use tools such as SEOmoz's newly acquired Followerwonk to find influencers in your industry who are likely to be interested in your site, and connect and interact with them to see if you can get them to link to/retweet your content.
Of the above, the biggest gain for the least amount of work is likely to be item 2, registering with directories. Sadly there is no magic bullet for increasing domain auth; then again, if there were, everyone would be doing it!
You may also want to check out the SEOmoz explanation of "what is domain authority?" for an in-depth explanation of how domain auth is calculated.
http://www.seomoz.org/learn-seo/domain-authority
Hope that helps
Justin
Hi Loughnan
It's a bit difficult to diagnose without seeing the pages in question, so could you post the full URL and also an example of two pages that the SEOmoz web app is seeing as duplicates?
If you'd rather not put the link up on here, feel free to Private Message me.
Justin
The only consequence is that a 301 will pass most but not all of the link juice through to the new page.
It is quite common after site redesigns, etc. to have lots of 301s in place.
You may want to check out the Web Site Migration Guide blog post, it is extremely comprehensive and certainly worth a read before you embark on the transfer process.
http://www.seomoz.org/blog/web-site-migration-guide-tips-for-seos
Personally I would go for option 3, creating a /directory for the Canadian site, as the search weighting your root domain has will in part get passed onto the /directory as it is seen as another page on your root domain.
By going for a new root domain or even a subdomain, you will effectively be starting your SEO efforts from scratch.
That said, there is some argument that a local domain would potentially rank better in local searches. However, if you set up the directory route correctly, ensuring you have the localised address and phone number on the landing page, inbound links pointing to this page, social mentions, etc., that should to a large degree counter that argument.
One other thing to consider is that all your SEO efforts on the /directory will help your domain authority and search weighting for the parent domain.
Rand did a great Whiteboard Friday a while back which explains the pros and cons of the above options, so I would definitely check that out for a more in-depth explanation.
http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
Carl
If you check through Everett Sizemore's pro webinar about e-commerce SEO, he covers the correct way to set up Magento for e-commerce; from memory, he also offers some additional resources you can check out. I think it's around 45 minutes in.
http://www.seomoz.org/webinars/ecommerce-seo-fix-and-avoid-common-issues
Hope that helps
My guess would be that at some point the URL structure of the site was changed and 301s were put in place from the old URL structure to the new one.
Have you checked the .htaccess file on the server to see if there is any redirect code in there?
Another possibility is that the platform could be causing it. What platform is the site built on? If it is an off-the-shelf e-commerce system, then configuration can cause these kinds of issues; I believe Magento is a culprit of this if it's not set up correctly.
Hundreds of redirects are no big deal from a server-load point of view, so personally I would redirect unless there was a good reason not to.
Assuming you believe the ranking factors of the existing pages to be good, I would transfer those factors to the new URLs by using 301 redirects.
Depending on the structure of the redirects you are doing, you may be able to use regular expressions to simplify the process.
You may want to take a look at the SEOmoz best practices article on redirection if you haven't seen it.
http://www.seomoz.org/learn-seo/redirection
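For example, if the old URLs follow a predictable pattern, a single RedirectMatch rule in .htaccess can cover a whole section (the URL pattern below is hypothetical):

# Map every page under /old-products/ to its /products/ equivalent in one rule
RedirectMatch 301 ^/old-products/(.*)$ /products/$1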
There was also the June/July 86-pack released on the 10th of August.
Check out Search Engine Land's overview of the changes; they do a far better job of explaining them than I could.
http://searchengineland.com/googles-june-july-updates-130392
It seems the updates focus on quality; the algorithm has been updated to help users find "high quality content from trusted sources".
That suggests to me that quality of backlinks and possibly social shares may be factors, so in addition to checking the items already mentioned by Oleg and Donnie, I would certainly do some backlink analysis against the sites that now occupy the spots you used to.
Hope that helps
Personally I would set up 301 redirects; the load on the server is minimal even on reasonably busy sites.
A 301 will transfer most of the link juice and authority the pages have.
Whereas if you don't redirect, any links pointing directly to the blog articles using the old URLs will return a 404; in turn those inbound links will be lost, and you will pretty much be starting from scratch with no inbound links to your articles.
Redirect every time for me, unless there is a good reason not to, such as Panda/Penguin issues.
Hope that helps
If you noindex the short pages, Google will (or at least should) ignore them in search.
This is a commonly employed tactic for large e-commerce sites to trim their indexed product pages, allowing them to concentrate their efforts on the more mainstream product lines.
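For reference, the directive itself is a single tag in the head of each short page; "follow" keeps the links on the page crawlable:

<!-- Tells search engines not to index this page but still follow its links -->
<meta name="robots" content="noindex, follow" />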
Everett Sizemore covers this technique in his E-commerce SEO pro-webinar
http://www.seomoz.org/webinars/ecommerce-seo-fix-and-avoid-common-issues
Hope that helps
Your best bet is to do a 301 redirect from the old domains to the new one.
A 301 will preserve most but not all of the link weighting etc from the old domain to the new one.
Before you redirect the domain, I would personally check it through Open Site Explorer; if the domain has a questionable backlink profile, you may want to consider not redirecting it to the new site, as a 301 will carry over the bad as well as the good.
If the site structure and URLs on sub-pages change, you may want to consider doing page-by-page redirects (i.e. olddomain/page1.html -> newdomain/pagexx.html), depending on how important or valuable the existing pages were to you in the SERPs. Page-by-page is the best route, although much more time consuming than domain-level redirects.
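On Apache, a page-by-page migration in the old domain's .htaccess might look something like this (the paths and domain are placeholders mirroring the example above):

# Page-by-page 301s from the old domain to the equivalent pages on the new one
Redirect 301 /page1.html http://www.newdomain.com/pagexx.html
# Anything not matched above falls back to a domain-level redirect
RedirectMatch 301 ^/(.*)$ http://www.newdomain.com/$1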
SEOmoz have a best practices guide on redirects which covers pretty much all you need to know
http://www.seomoz.org/learn-seo/redirection
I hope that helps
It will almost definitely solve the issue.
If it is any use to you, the approach I would typically take is to concentrate on 10 products and make the content changes to those pages.
These can be your top 10 sellers, top 10 products in terms of value, 10 random products, etc. Re-write the descriptions for each of these and make them live.
I would then run a test crawl to ensure the pages are no longer seen as duplicates, then wait for Google to pick up the changes and see what effect it has on rankings and indexing.
Assuming this experiment is successful, start working through the rest. I find that concentrating on 10-20 products at a time works better for me, as it's more actionable and easier to work on a smaller product subset.
I hope that's of some use to you
Yes, I use it quite a lot; it's a fantastic tool.
There are many ways you can use it to grow followers, but a couple of the more straightforward ones are
Analyse your followers against your competitors'; you can see which followers you have in common and, more importantly, the people who follow them and not you.
You can search users' bios based on interests; for example, if you were a clothes retailer you could search for location & fashion to find potential local followers.
There is a whole host of other features that allow you to find how influential your followers are, how many followers they have etc.
So all round it's a really useful tool and it is extremely powerful.
Rand does dig a bit deeper into the features in his acquisition announcement, so that is worth a read if you haven't done so already.
http://www.seomoz.org/blog/seomoz-pro-member-you-now-get-followerwonk-free
Hi Matthew
In Google's eyes these would be duplicate pages, as 99% of the text on the page is almost identical.
Google takes a pretty tough line on this, so re-ordering a few words here and there wouldn't solve the issue; to solve it you really need to write unique content for each product.
You may want to consider including the product description in the area next to the picture, as Google tends to put more significance on text which is higher up the page.
Hope that helps.
You do indeed have 2 rel=canonical links on that page.
It looks as though the Platinum SEO plugin is adding one in (it's situated within the head section of the page).
I think your suspicion about the validity of this link answers your question.
Unless there is a good reason to keep the comment/link, I would delete it, as it all sounds suspicious from what you have described.
Subdomains generally don't pass any authority, link juice etc to the TLD, Rand did a Whiteboard Friday that briefly covered this a while ago (see http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday)
I am curious: if you didn't want user-created sites to be associated with your TLD, why didn't you set up a different domain for them?
I personally think it is morally wrong to try and stop Google indexing them. So, if you don't want these associated with you or your TLD, I would set up a new domain, e.g. yourbreezi.com, 301 any sites that have already been set up to the new domain, and make sure that any new user sites are set up under the new domain.
In truth, I'm not sure it is too much to worry about; after all, WordPress.com uses subdomains for most of its hosted blogs and it doesn't seem to have done them too much harm!
Hope that helps
From my understanding of what you are saying, this is neither a problem with your homepage nor with Open Site Explorer.
Open Site Explorer (OSE) reports the links that it finds to your domain/subdomain; therefore, if an external site has used the href http://www.yourdomain.com/index.php, that is what OSE will report, just as it will report any other variations: yourdomain.com, yourdomain.com/, etc.
OSE is purely showing you the links and hrefs that are pointing to your domain, therefore if you have a 301 or rel=canonical on your homepage you are dealing with this correctly.
So in summary, OSE is reporting the different hrefs people are using to link to your site and your homepage is redirecting any incorrect links to the correct place, so all is ok.
Hope that helps
It may be worth sharing the web address in question and the type of errors SEOMoz is finding.
The Moz community is a thriving place full of SEO experts who are very willing to offer advice to help fix your problems; it might be worth a shot, as it could save you some dollars and also expand your own knowledge.
If you decide to go this way, it would be better to use subfolders instead of subdomains, as subdomains don't pass on link juice to their parent domain, whereas subfolders do.
so redirect to:
http://www.miamicabinets.org
to