Low Google rankings despite being error-free!?
-
Hi all,
I'm following up on a recent post i've made about our indexing and especially ranking problems in Google: http://moz.com/community/q/seo-impact-classifieds-website
Thanks to all good comments we managed to get rid of most of our crawl errors and as a result our high priority /duplicated content decreased from +22k to 270. In short, we created canonical urls, run an xml sitemap, used url parameters in GWT, created h1 and meta description for each ad posted by users etc.
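The XML sitemap step mentioned above can be sketched programmatically. This is a minimal, hedged example (the URLs are illustrative, not the site's real sitemap) of building a sitemaps.org-compliant file from a list of canonical URLs using only the Python standard library:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from canonical URLs."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        url_el = SubElement(urlset, "url")
        SubElement(url_el, "loc").text = u
    return tostring(urlset, encoding="unicode")

# Hypothetical ad URLs, purely for illustration
sitemap = build_sitemap([
    "http://www.mercadonline.es/ads/motos-motocicletas/12162",
    "http://www.mercadonline.es/anuncios-sevilla",
])
print(sitemap)
```

In practice you would regenerate this from the ads database on a schedule and submit it in GWT (Search Console).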
I then used google fetch a few times (3 weeks ago and last week) both for desktop and mobile version for re-approval. Nothing really improves in google rankings (all our core keywords are ranked +50)since months now: yet yahoo and bing organic traffic went up and is 3x higher than google's.
In the meanwhile we're running paid campagins on facebook and adwords since months already to keep traffic consistent, yet this is eating up our budget, even though our ctr and conversion rates are good. I realize we might have to create more content on-site and through social media, but right now our social media traffic is already around 50% and we are using more of twitter and google+ as well since recently.
Our organic traffic is only 14%; with google only a third of that. In the end, I believe this breakdown should look more something like organic 50%-70%, (paid)social,referral and direct traffic. 50%-30%...
I can't believe we are hit by a penalty although this looks like it is the case. Especially while yahoo and bing traffic goes up and google does not. Should I wait for a signal once our site is "approved" again through GWT fetch? Or am i missing something that i need to check as well to improve these rankings?
Thanks for your help!
Ivor
ps: ask me for additional stats or info in a PM if needed!
-
Hi Dirk,
Thanks for your valuable input. We will work on it and get back to you whenever I have additional questions.
Bedankt (thanks) :-)!
Ivor
-
To be honest, I don't think the duplicate pages are the biggest problem at this point; it's rather the on-page optimization. Most of your potential users will search for "<item> for sale in <location>", so you should make sure that these kinds of pages exist, with optimized on-page elements. It will also help to make your site flatter - currently 46% of your pages are 5 or more clicks from the home page.
Apart from that, personal-ads sites tend to get a lot of traffic from long-tail keywords, so the more ads you have, the more potential you have to attract traffic. Unfortunately, if you don't have a lot of traffic, you won't get a lot of ads, so it's a bit of a vicious circle you're in. When I crawled your site, I found about 450 unique HTML pages, which remains quite small.
Not sure how strong the competition is in Spain, but normally this is quite a competitive market, with a lot of players that have been well established for quite a few years. So even if you do everything right, they will still beat you because they have a better link profile and reputation. It could be better to focus on one specific niche (cars, motorcycles, boats, ...) and be really great in that niche, then expand to other niches once the first one gains momentum.
Social factors have some importance, but remember that these links are nofollow, and that Google does not index all of Facebook.
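The click-depth point above (46% of pages being 5+ clicks from the home page) is something you can measure yourself from crawl data. A minimal sketch, assuming you already have a page-to-links mapping (the toy graph below is invented for illustration; a real one would come from a Screaming Frog export):

```python
from collections import deque

def click_depths(links, home):
    """BFS from the home page; returns the minimum click depth of each reachable page."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

# Toy internal-link graph (page -> outgoing internal links)
links = {
    "/": ["/cars", "/motos"],
    "/cars": ["/cars/sevilla"],
    "/cars/sevilla": ["/ads/123"],
    "/ads/123": [],
    "/motos": [],
}
depths = click_depths(links, "/")
deep_pages = [p for p, d in depths.items() if d >= 3]
print(deep_pages)  # pages 3+ clicks from home: ['/ads/123']
```

Flattening the site then means adding internal links (category pages, related-ads modules) so that the deep pages gain shorter paths from the home page.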
rgds,
Dirk
-
Hi DC1611,
Thanks for your answers. It's true that we need to optimize these pages further, although I don't understand why so few duplicates, relative to our overall site, would cause such problems? I'm just wondering when and how we will be recognized with improved rankings, as keyword optimization is not the only factor - social factors, for example, should be of at least equal importance? Given the number of combinations that arise from our on-site queries, it is a huge challenge to optimize for them all... Nevertheless, we are working on updating our major landing pages that have missing descriptions...
Btw, useful tool you mentioned - Screaming Frog gives a thorough overview and is a bit more useful than constantly pulling CSV reports from other crawling solutions.
-
I noticed you're Belgian; you could also check what other sites like Marktplaats, 2dehands, etc. are doing in terms of site structure and on-page optimisation. Crawling them with a tool like Screaming Frog will certainly give you some inspiration. I don't know the Spanish personal-ads sites, but check them as well. If you use SEMrush, you can check which keywords they rank for, what their landing pages look like, and how they are optimised.
-
Hi,
You still have some duplicates - for each page both the https and http versions exist. I would choose the https version and redirect the non-https version to it.
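The http-to-https consolidation is normally done with a server-side 301 redirect (in Apache/nginx config). Purely as an illustrative sketch of the decision logic, here is a helper that computes the redirect target for any incoming URL:

```python
from urllib.parse import urlsplit, urlunsplit

def https_redirect_target(url):
    """Return the https URL to 301-redirect to, or None if already https."""
    parts = urlsplit(url)
    if parts.scheme == "https":
        return None  # no redirect needed
    # Keep host, path, query and fragment; swap only the scheme
    return urlunsplit(("https",) + tuple(parts[1:]))

print(https_redirect_target("http://www.mercadonline.es/anuncios-sevilla"))
# -> https://www.mercadonline.es/anuncios-sevilla
print(https_redirect_target("https://www.mercadonline.es/"))  # -> None
```

Whichever mechanism you use, make sure the redirect is a 301 (permanent) so Google consolidates the duplicate signals onto the https version.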
There are a lot of pages which have no H1, or an H1 that doesn't really add value. The same goes for titles - they don't always cover the content.
Example: http://www.mercadonline.es/anuncios-sevilla
Title: Resultados de la búsqueda - MercadOnline.es
H1: missing
Apart from this - I guess your target audience is looking for quite specific info (like "BMW for sale Sevilla"), and as far as I can see you don't really have optimised pages for these queries. A potential landing page for this query would be http://www.mercadonline.es/ads/motos-motocicletas/12162 - but the Title, H1, and meta description are not adapted:
Title: Motos, motocicletas - 12162 - MercadOnline.es
Meta description: En Mercadonline.es encontrará diariamente una amplia gama de motos y motocicletas - 12162
H1: Vehículos
=> Try to adapt these to make them more specific to the corresponding search query.
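Since these fields come from user-posted ads, one practical way to make them query-specific at scale is to template them from the ad's category and location. A minimal sketch - the field names and Spanish templates are illustrative assumptions, not the site's actual schema:

```python
def seo_fields(category, location, site="MercadOnline.es"):
    """Generate a query-specific title, H1 and meta description from ad attributes.
    Templates and field names are hypothetical, for illustration only."""
    return {
        "title": f"{category} en venta en {location} - {site}",
        "h1": f"{category} en {location}",
        "meta_description": (
            f"Encuentra anuncios de {category.lower()} en {location} "
            f"en {site}: ofertas actualizadas a diario."
        ),
    }

fields = seo_fields("Motos", "Sevilla")
print(fields["title"])  # Motos en venta en Sevilla - MercadOnline.es
```

Templating like this lets every category/location combination get a unique, search-intent-matching title and description without hand-writing each one.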
Hope this helps,
rgds,
Dirk