Large websites use meta description templates for thousands of product pages and change only one or two variables in the sentence, for example product name and category name.
You will be fine with those two similar meta descriptions.
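Just to illustrate, a templated meta description for a product page can look something like this (the placeholder syntax and wording are made up, adapt them to your own CMS):

<!-- Hypothetical template: only the product name and category name change per page -->
<meta name="description" content="Buy {{product_name}} online in our {{category_name}} range. Free delivery and easy returns.">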
According to the updated Google webmaster guidelines (Jan 2016), tabbed or otherwise not immediately visible content will have even less value than before.
"Make your site's important content visible by default. Google is able to crawl HTML content hidden inside navigational elements such as tabs or expanding sections, however we consider this content less accessible to users, and believe that you should make your most important information visible in the default page view."
Summary of changes here: https://www.seroundtable.com/changes-in-the-google-webmaster-guidelines-21551.html
Firstly, use a website crawler (e.g. Screaming Frog) to check that every page on the website has the Google Analytics tag and that it's implemented correctly.
Then I recommend reading this article, which explains the difference in more detail: http://www.lunametrics.com/blog/2015/08/05/google-search-console-clicks-vs-google-analytics-sessions/
Three common reasons why there are more clicks than sessions (copied from the article):
Hope it helps.
Hi Britney
Thank you for your detailed feedback!
I checked the posts you linked and a few other sources and I think the solution will be the following:
I hope it makes sense.
Run Screaming Frog on your subdomains and check the Images tab in the report, then sort by image size and you'll find the large images.
Download Screaming Frog from here: http://www.screamingfrog.co.uk/seo-spider/
The new Moz Keyword Explorer looks good but its search volume is US based and completely useless for non-US websites.
This is from Rand's post: "while the tool can search any Google domain in any country, the volume numbers will always be for US-volume. In the future, we hope to add volume data for other geos as well."
In the Keyword Difficulty tool, Moz shows Google search volume data, which is similar to what I see in the Google Keyword Planner and Google Search Console. For example, keyword X in the Australian search market has 6-7k searches in the Google Keyword Planner and 8k searches in Moz.
The very same keyword has 118k-300k search volume in the new Keyword Explorer!
Obviously this new search volume is not useful in the Australian market. I often used the Keyword Difficulty tool to identify new keyword opportunities but what can I do to complete the same tasks after they retire the tool?
I had similar issues on large e-commerce websites, where these pages were not in the navigation but were still in Google's index, so Webmaster Tools reported tens of thousands of 404s.
Keeping that many 301 redirections would have put a large load on the server, so we made sure that the navigation and site search didn't link to these pages, and Google later removed them from its index.
On the other hand, Peter is right. It depends on the ratio of live and broken pages. If you have 100k pages and 60k of them are 404s, then Google will most likely scale back crawling your website and indexing new pages. You need to have a very dodgy website to trigger a Panda penalty, though. 404s happen all the time and Google is quite patient in waiting for website admins to fix them.
I assume the US site has a target country specified in its Google Search Console. The answer depends on the competition and the SEO budget, but I suspect that you'd have a hard time making those German-language pages rank in google.de.
If you have an international SEO strategy to use one domain for every country (e.g. /us/ and /de/ folders) then it's a different story.
If you'd only have a few German pages on the US website, then don't bother. Launch the German website on a German domain and start a separate SEO campaign for the German market.
According to the official website, AMP is for content producers and not for e-commerce sites.
Two relevant questions I found in the FAQ:
Who will be able to use Accelerated Mobile Pages?
The project is open to all players in the ecosystem - publishers, consumer platforms, and creators.
What type of content will work best using Accelerated Mobile Pages?
The goal is for all published content, from news stories to videos and from blogs to photographs and GIFs
Hi there
How long has it been since you fixed the language targeting? WMT is pretty slow with report updates and it can take weeks for Google to update them.
Check the cache date of the AU pages. Has Google crawled and indexed these pages since the update? If not, try using the Fetch function in WMT.
Hi Brandon
The Moz robot needs to find the pages somehow, and it only crawls HTML links. Wildcards won't work because we can't expect the robot to guess every single character after the /.
Can you create a sitemap homepage that lists all the pages?
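Something as simple as this would do the trick, because plain HTML links are all the robot can follow (the URLs below are just placeholders):

<!-- Hypothetical HTML sitemap page listing every URL as a plain link -->
<ul>
  <li><a href="/page-one/">Page one</a></li>
  <li><a href="/page-two/">Page two</a></li>
  <li><a href="/page-three/">Page three</a></li>
</ul>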
If it can't be done then your best crawling tool is something like Screaming Frog, where you can submit the URLs manually.
+1 for EGOL
I would play with the pricing strategy instead of using noindex and nofollow on my site. These unwanted service pages might have valuable Page Authority and pass link juice in internal navigation, so noindex and nofollow can potentially hurt the overall organic search performance of your site.
If you don't want Google to crawl these pages looking for new information, simply block crawling in robots.txt but leave them in Google's index.
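For example, a robots.txt rule roughly like this stops Googlebot from re-crawling that section while the already-indexed URLs stay in the index (the path is just a placeholder for your unwanted service pages):

# Hypothetical path, adjust to your own URL structure
User-agent: *
Disallow: /unwanted-services/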
Ranking in a city 25 miles away for a new business is quite a challenge, even if you do ongoing local SEO. Competitors in the city are probably doing local SEO too and are more important from Google's perspective. Why would Google show a business from a different region if there are many local, trusted service providers?
Getting a local address and optimising that listing might be a cheaper and more effective solution.
Hi Jacob
I tried to check your product pages but they're loading extremely slowly. I have a very fast connection, so that's definitely a bad sign. Test your product pages with tools.pingdom.com and www.webpagetest.org to find possible page speed issues. If a page load takes more than 20-30 seconds and Googlebot experiences the same issue, it might just ignore your product pages and show the category pages instead.
Hi Will
I use Google Tag Manager to add structured data to the website. With GTM, the structured markup won't be in the static source code, as it's injected by JavaScript, so you won't see it when you view the page source in the browser.
The Google Structured Data Testing Tool picks up the local markup from GTM nicely.
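If it helps, a GTM Custom HTML tag for this kind of local markup can hold a JSON-LD snippet roughly like the sketch below (all the business details are made-up placeholders):

<!-- Placeholder LocalBusiness details, replace with your own NAP data -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "url": "https://www.example.com/",
  "telephone": "+61 2 0000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Sydney",
    "addressRegion": "NSW",
    "postalCode": "2000",
    "addressCountry": "AU"
  }
}
</script>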
Google is pretty "smart" and will understand that it is an e-commerce site with thousands of products, so it won't penalize your site for keyword stuffing. Still, make sure you follow Google's guidelines: category and product pages should have unique page titles, H1 headings, image ALT tags and meta descriptions, and all the other basic on-site elements should make sense.
Also, it's recommended to use structured data for breadcrumbs, so the search robot will understand the information architecture of your website.
Sometimes it's difficult, but it's also good practice to add a few sentences to at least the major category and sub-category pages, e.g. some intro text about the wide range of shirts, brands, etc., so this on-page content will make the category pages more unique and more relevant.
Your link goes to a login page. I think you meant this: http://www.seochat.com/c/a/search-engine-optimization-help/hidden-text-in-websites/
Google is most likely smart enough to know these tricks, so I wouldn't waste time by implementing various CSS layer tricks. Try to follow the webmaster guidelines as much as possible.
I agree with Logan.
If the ratio of redirected or broken URLs is too high in your sitemap XML, there is a chance that Google won't crawl it as frequently as it should because the search robot doesn't want to waste resources on these URLs.
The only time redirected URLs are useful in the sitemap XML is when you're migrating the domain or making IA changes and you want to make sure that the search engine discovers the 301 redirections as quickly as possible.
Hi Rand
What can non-US Moz customers use until you add UK or AU search volumes? We use the keyword difficulty tool's score and search volume reports quite often.
Would you keep the keyword difficulty tool live until you fix the keyword explorer's search volume?
I suppose your top-priority keywords are the head terms (e.g. used car) and brand-related searches that are targeted by category pages, e.g. used mazda cx 5. The "duplicate" pages are the actual car pages that change on a daily basis, every time they sell or add a new car.
Your priorities are the head terms and category-related searches, so having duplicate content on a few dozen mazda cx 5 pages shouldn't be an issue. The website duplicates product descriptions, not articles or blog posts, and I'm pretty sure Google is smart enough to understand the concept.
It's unlikely that the actual product pages will rank in Google, because they are changed too frequently and don't have time to build up page authority.
Make sure you don't have thousands of duplicate page titles by using unique identifiers and even ID numbers in the page titles, e.g. Used [colour] [year] [brand] [model] [mileage].
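For instance, that template could render something like this on a single car page (the car details are made up):

<!-- Hypothetical title rendered from the Used [colour] [year] [brand] [model] [mileage] template -->
<title>Used Red 2014 Mazda CX-5 45,000 km</title>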
Hope it helps.
Hi Joshua
subcategory.htm pages will perform just as well as subcategory/, and having .htm in the URL doesn't affect link juice flow at all. .htm and .html are perfectly valid HTML file extensions; however, some prefer shorter, "nicer" looking URLs. If this is the case and the website is still in the early stages of SEO, then 301 redirect the .htm URLs and make sure every navigation element links to the non-htm URLs in the future.
In some cases, the slash-ending URLs can be considered duplicate pages (even though I'm pretty sure Google will understand the honest mistake), so it's one of the basic SEO recommendations to set up the redirections and make sure the website navigation doesn't mix the two. Also, SEO tools will keep sending you duplicate page title warnings, so it's better to clean it up as soon as possible.
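As a minimal sketch, assuming the site runs on Apache and the trailing-slash URLs already resolve, the 301s could be handled in .htaccess along these lines:

# Hypothetical rule: 301 any .htm URL to its trailing-slash version
RewriteEngine On
RewriteRule ^(.+)\.htm$ /$1/ [R=301,L]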
Hope it helps.
Perfect, I can survive that. Thank you!
PS: the Keyword Difficulty tool's search volume is closer to AdWords than the Explorer's US search volume. In a Q&A post I mentioned a head term that has 6-9k searches in AdWords and the Keyword Difficulty tool, but 118-300k in Keyword Explorer.
I vote for #2. Having brand and model category pages with all the product listings can be more beneficial for SEO, and they can use breadcrumbs with schema markup, which will look nice in the SERP.
brand
  model 1
    product 1
    product 2
    ...
  model 2
  ...
Hi Endre
It's difficult to give you insights without seeing the actual pages, their backlink profile and the keywords the pages are targeting.
Maybe a previous SEO agency link built the hell out of the empty search page and Google hasn't noticed that it has no valuable content and a bad user experience. Maybe that search page had good content previously and Google hasn't noticed the change.
Check the empty page with the Wayback Machine (archive.org) and see what it looked like in the past. Check the link profile of the empty page with more than one tool to find all the links it has (Moz, Majestic, Ahrefs).
I was referring to the navigation breadcrumbs that you can mark up with structured data. Like this: https://developers.google.com/structured-data/breadcrumbs?hl=en
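A minimal JSON-LD sketch of that breadcrumb markup looks roughly like this (the brand and model names and URLs are placeholders):

<!-- Placeholder breadcrumb trail: brand > model -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Brand",
      "item": "https://www.example.com/brand/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Model 1",
      "item": "https://www.example.com/brand/model-1/"
    }
  ]
}
</script>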
Hi,
We will have pages on the website that will display different page copy and images for different user personas. The main content (copy, headings, images) will be supplied dynamically and I'm not sure how Google will index the B and C variations of these pages.
As far as I know, the page URL won't change and won't have parameters.
Google will crawl and index page content that comes from JavaScript, but I don't know which version of the page copy the search robot will index. If we set user agent filters and serve the default page copy to search robots, we might risk a cloaking penalty because users get different content than the search robots do.
Is it better to have URL parameters for version B and C of the content? For example:
The dynamic content comes from the server side, so not all page copy variations are in the default HTML.
I hope my questions make sense. I couldn't find recommendations for this kind of SEO issue.