Software-assisted meta data
-
I was recently contacted by an SEO firm that ran a search on my site and said it had a low index rate: out of 5,000+ pages, only 800 keywords were ranking. They said there is a lot of room for improvement by adjusting my meta data for indexing, and that there is software that does this for you. Does anyone have any experience with this? Does what they are explaining sound true? What is this software and how much does it cost? Any help would be greatly appreciated.
-
I run an e-commerce site, which is why I have so many pages. Each product acts as its own page, with a title (the product name), its own URL, a short description, a long description, and a meta description. Maybe I need to go further into the descriptions, add more content, and double-check that the pages are in fact being crawled by Google, as you mention.
-
First rule of hiring an SEO firm: don't hire a firm that reaches out to you randomly and tells you that you have something broken that they can fix. FYI, I also know a prince in Nigeria who needs access to your bank account and would be happy to pay you for it.
That said, comparing pages on your site vs. pages ranking in Google is a valid measure. If most of the pages on your site are not ranked in Google, that may be an indicator that you have a lot of low-quality pages, and this could be an issue. This type of metric is a good one for checking the health of your site.
You have to figure out whether this matters to your site or not, and it is pretty easy to do. There are various ways to look at this.
Log into Search Console. If you have a sitemap submitted, the sitemaps dashboard will show URLs submitted vs. URLs indexed. At a basic level: how many pages did you show to Google, and for how many of them did Google say, yes, these are good enough to be indexed?
Use ScreamingFrog or the Moz crawl to see how many URLs are found by crawling your website. How does this total compare to what is in your sitemap? Do you have a bunch of pages that are just tag-cloud pages, or pages that are just resorted or printer-friendly versions (i.e., duplicates) of other pages? You probably need to block access to those (and make sure they are not in your sitemap).
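If it helps, the sitemap vs. crawl comparison is easy to script once you have both URL lists. Here's a minimal sketch: the tiny inline sitemap and crawl set are made-up examples, and in practice you'd load your real sitemap.xml and a one-URL-per-line export from your crawler.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the set of <loc> URLs listed in sitemap XML."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def compare(sitemap, crawled):
    """Split URLs into: submitted but never found by the crawler (orphans),
    and crawled but never submitted (possible junk or duplicate variants)."""
    return sorted(sitemap - crawled), sorted(crawled - sitemap)

# Tiny worked example; real data would come from sitemap.xml and a crawl export.
sitemap_xml = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/product-a</loc></url>
  <url><loc>https://example.com/product-b</loc></url>
</urlset>"""

sitemap = sitemap_urls(sitemap_xml)
crawled = {"https://example.com/product-a",
           "https://example.com/product-a?sort=price"}  # duplicate sort variant

orphans, junk = compare(sitemap, crawled)
print("Submitted but not crawled:", orphans)
print("Crawled but not submitted:", junk)
```

Both buckets are worth looking at: orphans suggest broken internal linking, and the crawl-only URLs are often exactly the duplicate/printer-friendly pages you should block and drop from the sitemap.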
ScreamingFrog has an API connection to your GA data, so you can also pull what kind of search traffic these pages get. This gets back to your ranking question: if Google is ranking a page for something, that page should get some search traffic. Depending on how much traffic your site gets overall, you may need to look at a 3 to 6 month window. You can then say: out of all the URLs on my site, here is the group that has not received any Google traffic in the past 6 months, so they are probably not ranking for any keywords. Why is that? Look at those pages. Are they crappy pages with very little or duplicate content? Updating a meta tag will not do anything if you have a crappy page to start with. Maybe you need to noindex them or 404 them and get them off your site. Maybe you need to add content, write more, include some graphics, etc. Maybe you have duplicate or similar pages you need to combine into one super page, redirecting the others to it.
Good luck!
-
Hi Nicholas,
OK, then I wouldn't worry about their comments on your meta data. It's far better to create good descriptions and titles manually than to use automated ones; I'd only ever recommend automation as an interim solution, or where you have too much data to ever invest in manual changes.
There are always improvements that can be made to your SERP listings, but I don't think they have the solution for you. I'd invest in the different areas of SEO, e.g. ensuring your pages are well optimised, checking technical points, and working on usability and great content. You could do this with an agency or SEO consultant, or yourself or a staff member. The Moz Beginner's Guide to SEO is a great place to start!
Hope this helps,
Zoe -
I am using Magento. I have done a decent job adding proper descriptions and meta data. I was just contacted by a company called Grouphigh who says I am missing the boat somewhere. The rep mentioned my SERP presence is growing but has much more potential. He said he used a tool to track the number of keywords I am currently showing up for in organic searches, and that it was very low compared to the number of pages I have. Does this statement hold water? Is there something I should be doing differently to increase this?
-
Hi Nicholas,
I'd avoid any automated programmes like this. Did they explain how it works? Does it just pull the first line of your page text or product description (or something similar) to be your meta description? Lots of software can do this, but it's not necessarily the best way: your meta descriptions should be compelling for users who see them in search, and they don't directly impact your rankings anyway. Your meta titles can also be generated automatically; most CMSs will do this for you unless you override the title, or you'll be able to change your settings or add a module/plugin to do it (e.g. so that a product title becomes the meta title, followed by your brand name). What is your website built on? E.g. WordPress, Magento, PrestaShop?
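For what it's worth, the "product title, followed by your brand name" pattern described above is a trivial rule, not something that needs paid software. A minimal, hypothetical sketch (the 60-character cap is a common rule of thumb for how much of a title search results tend to display, not a hard limit):

```python
def meta_title(product_name, brand, max_len=60):
    """Build '<Product> | <Brand>', dropping the brand suffix
    (and finally truncating) when the result runs too long."""
    title = f"{product_name} | {brand}"
    if len(title) <= max_len:
        return title
    if len(product_name) <= max_len:
        return product_name
    return product_name[:max_len - 1].rstrip() + "…"

print(meta_title("Stainless Steel Water Bottle 750ml", "Acme Outdoors"))
```

In practice a CMS template or plugin (e.g. a Magento setting or a WordPress SEO plugin) applies exactly this kind of rule for you.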
I wouldn't trust this company, though; you'll be able to make such improvements yourself, and in the worst-case scenario they might hurt your rankings.
Hope this helps!
Zoe -
Sounds like bullshit.