That's great - but what if I want to target more than just the country - anyone who speaks English or Spanish, e.g., not just the UK or Spain? Would it still be wise to do it?
Thanks!
What about using <meta name="Language" content="en" /> ? Or "es", as the case may be?
Guys - both of your answers are very helpful.
Our site is translated by humans native to spanish and english - however, our market reach is not just UK or Spain, but any english speaker or spanish speaker - there are so many different countries that speak both.
Thanks again!
I have tried using them and they didn't do anything - furthermore, if you check out this video by Google themselves, you will find that using these parameters is a "hint/suggestion" as opposed to a solid directive.
http://www.youtube.com/watch?v=DiEYcBZ36po
Rel Canonical is also a hint.
But Meta Noindex,follow is a solid directive which they have to pay attention to.
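To illustrate the difference, the two tags look like this (example URL only - substitute your own):

```html
<!-- rel canonical: a hint - suggests the preferred URL, but Google may ignore it -->
<link rel="canonical" href="http://www.example.com/product-a" />

<!-- meta noindex, follow: a directive - the page is dropped from the index,
     but the links on it are still followed -->
<meta name="robots" content="noindex, follow" />
```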
Hope that helps - been there, done it, got the t-shirt, through a lot of pain and frustration!
How does Google view this?
Our current site works like:
www.domain.com/EN - English
www.domain.com/ES - Spanish
All products are the same, just different language and different URL for them - is this good or bad?
I thought of either
Any advice appreciated!
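From what I've read, this sort of language-only targeting (rather than country targeting) can be expressed with hreflang annotations using bare language codes - a sketch with our example URLs, assuming this applies to us:

```html
<link rel="alternate" hreflang="en" href="http://www.domain.com/EN/" />
<link rel="alternate" hreflang="es" href="http://www.domain.com/ES/" />
```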
Hi,
Our ecommerce store has had some duplicate content issues; they have been corrected, but of course Google takes time to pick up on these. Our link profile is very poor, so we won't lose a lot by going to a new domain in that sense.
My question is, in what instances is it worthwhile starting under a new domain? And in which not?
Presumably you can also 301 the whole site - when is it worth doing this or not?
Thanks,
Ben
Hi Ian,
You are right in that Yoast Meta Robots can be cranky - I installed it and had to play around with it to get it working.
However, it does offer a very nice feature that I think is worth it - you can apply various combinations of Meta Robots directives to product pages individually - so this adds more value than just being able to do NOINDEX on reviews, wishlists, etc... pages. But install it on your dev site before trying it live.
So my solution uses both Yoast and my custom code - check the URL for any querystrings, such as ?manufacturer etc..., and apply different logic according to what you want indexed or not.
Feel free to PM me.
This is the robots file line:
Disallow: /index.php/
Now, Magento has custom URL rewriting - so no public URL uses index.php, although this is the file that runs the system itself.
Should I unblock it?
Thanks - I thought so, just wanted to double check . Cheers
Hi,
We run a Magento website - when I log in to Google Webmaster Tools, I am getting this message:
"Severe health issues are found on your site. Check site health."
"Is robots.txt blocking important pages? Some important page is blocked by robots.txt."
Now, this is the weird part - the page being blocked is the admin page of Magento, under
www.domain.com/index.php/admin/etc.....
Now, this message just won't go away - it's been there for days now - so why does Google think this is an "important page"? It doesn't normally complain if you block other parts of the site.
Any ideas?
Thanks
Hi,
I am quite familiar with Magento and struggling with the SEO of this ecommerce mammoth!
As far as I am aware, you should implement the meta tag "NOINDEX, FOLLOW" on those pages that you do not want indexed - as your pages are already in the index, this is the way to go; blocking them in robots.txt does not get pages out of the index if they are already in there.
I suggest you apply some "querystring" logic to your template - you will find the page here:
app/design/frontend/default/YOURTEMPLATE/template/page/html/head.phtml
That way, you can apply the NOINDEX, FOLLOW meta tag depending on the page content.
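A rough sketch of what that logic could look like in head.phtml (the parameter names are just examples - adapt them to your own store's querystrings):

```php
<?php
// Sketch only: emit NOINDEX, FOLLOW for any URL carrying
// faceted-navigation querystrings; the parameter names are examples.
$params = array();
if (isset($_SERVER['QUERY_STRING'])) {
    parse_str($_SERVER['QUERY_STRING'], $params);
}

// Querystring parameters we do not want indexed - adjust to your store.
$noindexParams = array('manufacturer', 'price', 'color', 'dir', 'order');

$blocked = array_intersect($noindexParams, array_keys($params));
?>
<?php if (count($blocked) > 0): ?>
<meta name="robots" content="NOINDEX, FOLLOW" />
<?php endif; ?>
```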
Hope this helps you and let's stay in touch about Magento! (PM me)
Presumably submitting an English-content-based site is pretty pointless?
Hi,
One of my customers asked me to get them included in the major Russian search engines - I am assuming that Yandex is the biggest one, plus Google.ru - any others?
Any pointers on how to get a site included with them?
thanks,
Ben
Hi,
I am just double checking to see if these parameters are ok - I have added an attachment to this post.
We are using an e-commerce store and dealing with faceted navigation, so I excluded a lot of parameters from being crawled as I didn't want them indexed. (They got indexed anyway!)
Advice and recommendations on the use of GWT would be very helpful - please check my screenshot.
thanks,
No probs.
No, the Magento URL Rewrite module will not do the 301s for a mass change like going from category paths to just product paths - so you will be left with a lot of 404s to clear up. The other thing is that your individual product URLs have no keywords in them whatsoever, except presumably the SKUs - so I think changing it may affect you adversely.
How many products do you have? 1200 hits a day is not bad going...
Hi Thomas,
I'm familiar with Magento and familiar with SEO.
If you go to the Magento admin panel, and go to System > Configuration > Catalog > Catalog > Search Engine Optimization, there is an option for:
Use Categories Path for Product URLs (which you have set to YES right now) and also this: Use Canonical Link Meta Tag For Products - which you may have set to yes or no. Enabling this last option tells Google that Product A is available under different URLs but it's the same page. This is how I had my setup. However, the rel canonical implementation is just a suggestion to search bots, not a directive, and from what I have read, when you have the same page appearing under 4 to 5 different categories, it is really down to Google and the other bots to agree with you on the canonical.
So, I decided to set "Use Categories Path for Product URLs" to NO - this means that your product will only be viewable under http://www.electromarket.co.uk/bba0051, which gets rid of the duplicate content issues. You do lose the keyword-rich URLs created by the category paths, but in the end I chose this option.
IMPORTANT: If you change that setting now, it will create a lot of 404s for you, as you probably have those URLs indexed already in Google - this is a decision that in the short term creates a lot of work, but in the longer term I think it solves a lot of problems.
Hope that helps,
Ben
Hi,
Some time ago, a few thousand pages got into Google's index - they were "product pop-up" pages, exact duplicates of the actual product page but as a "quick view".
So I deleted them via GWT and also put a Meta NOINDEX on these pop-up overlays to stop them being indexed and causing dupe content issues.
They are no longer within the index as far as I can see - I do a site:www.mydomain.com/ajax search and nothing appears.
So can I block these off now with robots.txt to optimize my crawl budget?
Thanks
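If it is safe to do, I assume the robots.txt entry would simply be (assuming all the pop-up URLs live under /ajax/, as the site: query above suggests):

```
User-agent: *
Disallow: /ajax/
```

(My understanding is that once robots.txt blocks these URLs, bots can no longer crawl them to see the meta noindex - which should be fine here, since they have already been dropped from the index.)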
Hi,
In the case of a RED/GREEN/YELLOW coffeemaker, for example, I have say 6 pages that are indexed in Google.
Now, I can write very unique content for each, and that gives me 6 pages in the SERPs. Or should I make it a configurable product?
What is best, and how different would the descriptions need to be? My feeling is that just changing the colour word in the text would NOT be enough.
Thanks,
B
Good point Chris... I know, old school - refreshing my skills!
Going to remove the bold text - just needed to know if it could be causing a penalty or not.
Thanks!
It used to work for me on some sites - but maybe it's considered spammy these days?
Any feedback appreciated.
Thanks.
Our landing pages are both - depending on Google, we rank for some category pages and we also rank for products. The content and keywords are sprinkled through the copy without overspamming, so it should make semantic sense what our pages are about.
I think I really need to work on the nav - I think the link juice isn't flowing.
--- your product descriptions are duplicated on other sites and your site has been hit by Panda or the age-old duplicate filter (YES, they are - we are writing new content that is not manufacturer description copies, e.g., our own unique content - in the meantime, I am applying META NOINDEX, FOLLOW)
--- you are working on a nascent site that has no links and that nobody is tweeting, liking or citing (Partially correct - we are tweeting and liking it via various social network accounts of our own)
--- you have a bad link profile and have been hit by Panda
(I can't see any "bad links" from disreputable domains - our link profile is pretty poor, and we only have a couple of external references from other websites, nothing too spammy.)
So, what do you say about the three items above?
Also, those 70 visitors - what are their referring domains, or where are they coming from? We are getting exposure in Google, so traffic mostly comes from Google, but not enough.
If we were hit by a Google penalty, what would it look like?
Hi everyone,
I hope you have a couple of mins to give me your opinion. The ecommerce site has around 2000 products, in English and Spanish, and only around 70 hits per day, if that.
We have done a lot of optimisation on the site - page titles, URLs, content, H1s, etc.... Everything on-page is pretty much under control, except I am starting to realise the site architecture could be harming our SEO efforts.
Once someone arrives on the site, their language is detected and they are 302-redirected to either domain.com/EN or domain.com/ES depending on their preferred language.
Then on the homepage, we have the big MEGA MENU, and we have:
CAT 1
  SubCat 1
    SubsubCat 1
    SubsubCat 2
    SubsubCat 3
Overall, there are 145 "categories", plus links to some CMS pages, like Home, Delivery terms, etc...
Each main category contains the products of everything related to that category - so, for example:
KITCHENWARE
  COOKWARE: SAUCEPANS, FRYING PANS
  BAKINGWARE: BOWLS
Kitchenware contains ALL PRODUCTS OF THE SUBCATS BELOW - so cookware items, saucepans, frying pans, bakingware, etc... - plus links to those categories through breadcrumbs and a left-hand nav, in addition to the mega menu above.
So once the bots hit the site, immediately they have this structure to deal with.
Here is what stats look like:
Domain Authority: 18
www.domain.com/EN/
PA: 27
mR: 3.99
mT: 4.90
www.domain.com/EN/CAT 1
PA: 15
mR: 3.05
mT: 4.54
www.domain.com/EN/CAT 1/SUBCAT1
PA: 15
mR: 3.05
mT: 4.54
Product pages themselves have a PA of 1 and no mR or mT.
I really need some other opinions here - I am thinking of:
But I am willing to hear any other ideas, please - maybe another alternative is to start building links to boost DA and link juice?
Thanks all,
Ben
Thanks Mike!
Would love to fill out that spam report - but I think I'm gonna go with the solution below. The domain in question has a lot of natural links but no keyword-specific links in directories, whilst the other site is built solely on directory links -
Hopefully a few directory links will push the balance in our favour!
Thanks Takeshi!
I guess my next question is how do I obtain authoritative links? Thanks!
Since Google's latest updates, I think it would be safe to say that building links is harder. But I also read that Google applies their latest guidelines retroactively. In other words, if you have built your linking profile on a lot of unnatural links, with spammy anchor text, you will get noticed and penalized.
In the past, I used SEO-friendly directories and "suggest URL" submissions to build backlinks, with keyword/phrase anchor text. But I thought that this technique was frowned upon by Google these days.
So, what is safe to do? Why is Google not penalizing the competitor?
And bottom line what is considered to be "unnatural link building" ?
You probably know what I mean - the report in Google Webmaster Tools > Sitemaps.
So how do I locate the pages that are NOT indexed?
Thanks,
Ben
Hi,
I am just wondering about the accuracy of this report - does it pick up all the duplicate on-page content, or is there a limit?
We have an ecommerce store with a lot of copied-and-pasted descriptions - just wondering if there is a limit on how much the Moz crawler picks up. In other words, once we fix what Moz has detected, will more be detected because the report is limited to displaying, say, up to 200?
Hope you understand what I mean.
Thanks
Hello everyone on the new cool Moz!
I've optimized sites before that are dedicated to 1, 2 or 3 products and/or services. These sites inherently talk about one main thing - so the semantics of the content across the whole site reflect this. I get these ranked well on a local level.
Now, take an e-commerce site - which I am working on - 2000 products, all of which are quite varied - cookware, diningware, art, decor, outdoor, appliances... there is a lot of different semantics throughout the site's different pages.
Does this influence the ranking possibilities?
Your opinion and time is appreciated. Thanks in advance.
OK.
Thank you to everyone who helped clarify this issue.
I thought about becoming an editor, although, from what I read, the SEO value of having a DMOZ link may not be as strong as it once was...
Great info!
Thanks @Baldea - yes, you are correct: as soon as it processes the requests for removal, it allows more to be added for removal.
In "one sitting", I managed to get 1000 URLs removed - within a few hours I could process more.
Hi,
How long does it take for DMOZ to process a suggest url?
Thanks,
Ben
Thanks Baldea....
Yes, I have done all the above, but still have some pages stuck in Google's index that got in there - measures are now in place to stop that happening in the future.
But I am using the batch removal process in GWT to remove the URLs that already got into the index.
So how many URLs can I remove with GWT?
I'm dealing with thousands of duplicate URLs caused by the CMS...
So I am using some automation to get through them -
What is the daily limit? Weekly? Monthly?
Any ideas??
thanks,
Ben
Very good answer Elias. Thank you.
Thank you to everyone that contributed.
@Zeph and @Francisco - I do use Screaming Frog, but correct me if I am wrong: it shows the pages that exist on the site, not a list of the pages Google has already indexed. Thanks anyway.
What I wanted was a way of creating a list of all our pages indexed in Google - not a count.
But thank you all the same!
Hi,
We have an ecommerce site with around 4000 real pages. But our index count is at 47,000 pages in Google Webmaster Tools.
How can I get a list of all the indexed pages of our domain? I am trying to locate the duplicate content.
Doing a "site:www.mydomain.com" search only returns up to 676 results...
Any ideas?
Thanks,
Ben
Hi,
I have a lot of near dupe content caused by URL params - so I have applied:
How long will it take for this to take effect? It's been over a week now; I have done some removal with the GWT removal tool, but still no major drop in indexed pages.
Any ideas?
Thanks,
Ben
Thanks.... was starting to wonder whether I needed a space or something...
I have this without a space: <meta name="robots" content="noindex,follow" />
With a space: <meta name="robots" content="noindex, follow" />
Shouldn't make much difference, but as it's taking time for Google to notice, I'm just curious...
Hi,
Is this syntax ok for the bots, especially Google, to pick up?
Still waiting for Google to drop lots of duplicate pages caused by parameters out of the index - if there are parameters in the querystring, the code above is inserted into the head.
Thanks!
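Presumably the tag in question is the standard robots meta tag (the snippet got stripped by the forum formatting); for reference, the widely documented form is:

```html
<meta name="robots" content="noindex, follow" />
```

Google documents these values as case-insensitive, and the space after the comma makes no difference.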
Hi,
Our on-page optimization, apart from a few dupe content issues, is OK - we have good keyword-rich URLs, titles, H1s and unique product descriptions.
So now I want to look at building links that will boost our DA and PA's.
We have over 2000 products in the store and around 130 categories/subcategories, and I would appreciate any views on where to start -
My initial view is to get backlinks from the relevant manufacturer websites to the "shop by brand" page on our site related to these manufacturers -
What other strategies should I look at?
Thanks,
Ben
Hi,
Any simple methods to boost Domain Authority?
Thanks,
Ben
Hi,
Our sitemap is created by our e-commerce software, Magento.
We are probably going to make a lot of products Meta NOINDEX for the moment, until all the content on them has been corrected - but by default, as they are enabled, they will appear in the sitemap.
So, the question is:
"Should pages that are Meta NOINDEX be listed in a sitemap"? Does it matter?
thanks!
Hi,
So, we know we don't have the best content, and we are hiring writers to create unique content for each product.
What happens if this is then copied by another website? What does Google see? Do they recognize ours as the original content?
Has anyone used DMCA.com ? is it worth it?
thanks,
Ben
Hi,
Our e-commerce store has a lot of duplicated product descriptions - some of them are default manufacturer descriptions, and some are duplicates because only the colour of the product varies - so essentially the same product, just a different colour.
It is going to take a lot of man-hours to get the unique content in place - would a Meta NOINDEX on the dupe pages be OK for the moment, which I can then lift once we have unique content in place?
I can't 301 or canonicalize these pages, as they are actually individual products in their own right, just dupe descriptions.
Thanks,
Ben
Thanks Tom.... Very valuable info.
A bit of background: we are running an e-commerce site - so I'm dealing with various forms of dupe content:
Anyway, I am slowly fixing these - it just seems like Google is painfully slow to respond to our work on improving things. Hence the question...