Domains dominating SERPs w/multiple listings
-
I know Cutts addressed this as a potential future update to the Google algorithm, but it's driving me bonkers. My primary targeted keyword has one of our competitors listed 4 times in a row at the top of page 2. Some of the pages have duplicate page titles and the content is relatively thin. The site has a PR of 2 and a DA of 35. Why on earth are they able to suck up a whole half of a results page?!?!?!
I don't know that there's anything anyone can tell me that will help, but if there's something I missed about this update please let me know.
'snot fair.
-
Okay I love it. This has been a fantastic experiment. I have a good feeling about it all.
-
For the one I originally was posting about, I've been all over the board and still am. Back to 15 today, and was at 10 a few days ago.
I am willing to bet $50 that your rankings for this KW will stabilize on page one after these new pages are in the index for a while.
If one of the new articles is really strong for this KW it might get you a second listing... but you will need a really strong article.
-
Yes, for this keyword I have. For the one I originally was posting about, I've been all over the board and still am. Back to 15 today, and was at 10 a few days ago.
But yes, this keyword historically sticks at 2-3 (the dev added a new page that bumped us down). I guess I was testing to see if I could get multiple pages from my domain listed for this keyword, as we were discussing.
-
Thanks for the report.
What has been your historical ranking for this KW... have you been at #3 for a long time?
-
HILARIOUS.
Just after hitting "Post" I went and checked once more. Boom, there we were sitting at the #3 spot (which is just as good; this keyword is for a software service, and the site beating me out twice is the software developer themselves, so that's fair).
I swear though it had been a good hour and a half before I posted this. I did make one slight change to a page title in that 90 minutes but I doubt that made the difference in such a short time. Oh Google, you sure know how to get my heart-a-poundin' in the morning.
-
Well, my initial test (on a secondary keyword, linking 3 different pages describing three different facets of the service) was working great until today.
Today my page is no longer listed at all in the SERPs for the targeted keyword. Yesterday it was number 2.
I'm hoping this is just one of Google's little fluctuations, but I have a feeling it's bigger than that... Can't for the life of me figure out where I went wrong. Each page is about something different: titles are different, content is different, all pertaining to the same service, sure, but still. Basically it's like this:
1.) main service offering page
2.) blog article on statistics of popularity gaining for this service recently (linked to from high authority sites/socials as well)
3.) blog article on the release of the service's newest version, which happened last week.
All 3 interconnected of course. Any ideas?
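For what it's worth, I also sanity-checked the pages' own HTML for anything that would knock a URL out of the index: a stray meta-robots noindex, or a canonical tag pointing at a different URL. A minimal Python sketch of that check (the URLs and HTML below are hypothetical placeholders, not my actual pages):

```python
# Scan a page's HTML for the two most common self-inflicted de-indexing
# culprits: a meta-robots "noindex" and a canonical pointing elsewhere.
from html.parser import HTMLParser

class IndexabilityChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def check_page(html, page_url):
    """Return a list of likely de-indexing culprits found in the HTML."""
    p = IndexabilityChecker()
    p.feed(html)
    problems = []
    if p.noindex:
        problems.append("meta robots noindex")
    if p.canonical and p.canonical != page_url:
        problems.append(f"canonical points elsewhere: {p.canonical}")
    return problems

sample = (
    '<html><head>'
    '<meta name="robots" content="noindex,follow">'
    '<link rel="canonical" href="https://example.com/main-service/">'
    '</head><body>...</body></html>'
)
print(check_page(sample, "https://example.com/blog/service-stats/"))
# flags both culprits for this hypothetical page
```

In my case all three pages came back clean, which is why I'm stumped.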
-
Nice work!
Let us know if you see more movement.
-
It's an early result, but I've tried doing this with my main targeted keyword, and so far it pushed the offender down and me up. Not sure if it was that which made the difference or all of my efforts combined, but so far so good. I'll keep you posted on my experiments.
-
I doubt that I will ever sell model warships. I would not want to sell them because I don't know anything about them. And, I am getting close to retiring from retail.
I tried Google's DFP ad server and am getting such good results on my AdSense sites that I am moving as much of my time to them as possible.
-
Nice!
Let us know how it works!
-
Okay, EGOL, you big magic wizard you: I've implemented a few changes and added some media pretty much exactly as you described. So we'll be testing it out (for a different keyword, but still highly targeted) and seeing what we can do to grab multiple listings on a SERP.
Just wait 'til my sales team finds out they're now selling brass widgets...
-
I just have to hope EGOL never gets interested in selling model warships, otherwise I'm going to have to really up my game on my website!
-
Pushes competitors out of the SERPs. Pisses them off. Makes them spend time hopping around mad instead of waging war.
Love it!!
-
One of their pages has two sentences on it. Two.
I usually have a lot more content than two sentences, so might not be my site.
Honestly... try something like this in that SERP. Build three pages:
1.) Your standard product page. (Brass Widgets for Sale)
2.) A nice article about how to use that product, with a few photos, maybe a video. (How to Use Brass Widgets)
3.) A short article about how to select the perfect brass widget, with a few more widget photos, maybe another video, and a table that compares colors and sizes. (Brass Widgets: How to Select the Perfect One)
Then, put obvious links connecting each of these three pages. If you have a competitive site in this niche, could be kickass. All of that info might make you look like the brass widget professional and boost your conversion rate.
This is what I do for lots of products.
Pushes competitors out of the SERPs. Pisses them off. Makes them spend time hopping around mad instead of waging war.
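To make the "obvious links" step concrete, here's a tiny sketch of the trio as a link graph (the slugs are made up for the brass widget example); the check just confirms that every page in the cluster links to the other two:

```python
# The three-page cluster modeled as page -> set of internal links.
# Slugs are hypothetical placeholders for the brass widget example.
cluster = {
    "/brass-widgets-for-sale/": {
        "/how-to-use-brass-widgets/",
        "/how-to-select-brass-widgets/",
    },
    "/how-to-use-brass-widgets/": {
        "/brass-widgets-for-sale/",
        "/how-to-select-brass-widgets/",
    },
    "/how-to-select-brass-widgets/": {
        "/brass-widgets-for-sale/",
        "/how-to-use-brass-widgets/",
    },
}

def fully_interlinked(graph):
    """True only if every page links to every other page in the cluster."""
    pages = set(graph)
    return all(graph[p] == pages - {p} for p in pages)

print(fully_interlinked(cluster))  # True for the cluster above
```

If that check fails, one of the pages is an orphan within the cluster, and you lose the "these three belong together" signal the whole tactic depends on.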
-
HAHA!
"EEEGGOOOOOOOOOOOOOOOOOLLLLL!!!!!" (yelled like Kirk at Khan)
Ugh. One of their pages has two sentences on it. Two.
-
Why on earth are they able to suck up a whole half of a results page?!?!?!
Maybe that is one of my sites.
Because instead of fearing keyword cannibalization, they attack it.
Works great in low to moderate competition.
I have positions 1 through 4 for some nice keywords.
'snot fair.
Go Cannibals!