Muna, I see these pages indexed:
Are there other pages that don't show up?
That's not a very hard hit for a Penguin penalty. It seems more like the natural course of things in the SERPs. If you keep going down, that would be a bigger problem, but if you hold steady in the 3-4 range, just keep working on your content--magic is a great subject around which to create interesting, engaging content.
Go through the process of claiming the second and third listings. Once you've claimed and verified them and they're in your Google Places dashboard, you should be able to delete them.
All that publishing should help support the keywords you've chosen for your primary domain. If you are publishing pithy news that's thematically relevant to your main keywords, you could (sparingly) link from those articles to your news categories, in order to give them greater relevance. If your news articles are thin and not in any way relevant to your primary terms then by all means, host them on another domain.
Bob,
Real reviews will happen as a result of people's interaction with your company, and they will inevitably put them where they think best. Note that, in order to write a review on a company's Google+ page, the reviewing customer must also have a G+ account--just having a Gmail account isn't enough anymore. Those without a G+ account will have to write their reviews elsewhere.
Reviews aren't the same as citations. Citations typically refer to your name, address, and phone number, and those you really need to get from a wide variety of sources to strengthen your G+ Local business page visibility. You can start finding out about and populating those sources at getlisted.org.
Here's a good article by Mike Blumenthal on encouraging customer reviews and for good measure, here's what Google says about making sure your reviews on G+ are useful, honest, and written by real people.
Having thousands of links from a handful of sites that are exactly the same can hurt you.
Yes, press releases can help with SEO--to a small degree. Don't build your back link strategy around them, though.
My bad--I misread your question Gael. Miriam's right about that.
I'd focus on local search for them. Make sure to set up a Google+ account for the company and set up a G+ Local business page for each location. Then I'd focus on networking with any local neighborhood groups in each of their areas.
I don't think /...-reviews is ranking higher because of the content so I'd bet that putting the content from /...-review onto /...-reviews would only help /...-reviews to rank better. You could make both pages exactly the same and rel=canonical /...-reviews to /...-review . Or you could simply revise /...-reviews and take the other page down.
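If it helps, the canonical option is just a link element in the head of the page you want to defer--a sketch, where example.com is a placeholder for your own domain and the /... paths stand in for your actual review URLs:

```html
<!-- Placed in the <head> of /...-reviews, the page deferring to the other -->
<!-- example.com is a placeholder; substitute your real domain and paths -->
<link rel="canonical" href="http://example.com/...-review">
```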
You can't get there from here.
As far as I know, we don't yet have a high-quality dynamic content creation tool that populates the content areas you speak of with good enough content to make it worth your while--not with today's Google.
I think it's a great idea to make a better article; working to top it in the search results, I would say, is a different matter. I'm not sure which keyword you're going after, but you're likely going to spend all your time and effort trying to make that page rank, and it's going to take forever--if you can do it at all. Instead, use your time to write a blog and to work the social networks to find, reach out to, and interact with those potential clients. Google+ is probably where you can get the most traction the fastest.
Noindexing them is a good way to go--especially if you have a high number and high percentage of pages with duplicate content.
No, you don't have to rel canonical one page to another if you are already 301ing that same page to the other.
What's wrong with the content?
The meta noindex will prevent the pages from showing up in search results and from impacting search results if they've got duplicate content--regardless of whether they're in the sitemap or not. If you've got a lot of spammy outbound links, think about nofollowing, too.
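For reference, a sketch of what those tags look like in the head of each duplicate page (use one or the other, not both):

```html
<!-- Keeps the page out of the index while still letting bots follow its links -->
<meta name="robots" content="noindex">

<!-- If the page also carries spammy outbound links, add nofollow -->
<meta name="robots" content="noindex, nofollow">
```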
I always just ask my webmaster friend at the NYT to sprinkle in a few links when my clients need them. ; )
Basically, though, I wouldn't count on using any redirects to get you around a Penguin penalty or shorten its duration. Typically, those who get out of Penguin do a lot of work on the links combined with a lot of work on content--so much so that it seems to take a total commitment to the domain to make it happen.
It sounds like you may be preparing for the worst but that you haven't been impacted yet. If that's the case, I'd just wait it out, at this point, and if you get hit, make a commitment to a domain and stick with it.
You should go ahead and 301 redirect your different home pages to a single one. Otherwise, your problem of links being split across all three may not cease. I'm not sure, however, how OSE handles canonicalization if the pages are not redirected. Anyone else out there know?
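A minimal sketch of that redirect in Apache's .htaccess, assuming mod_rewrite is available--index.html and home.html are hypothetical home-page variants, so substitute your own:

```apache
RewriteEngine On
# Match the literal request (not internal rewrites) to avoid a redirect loop
RewriteCond %{THE_REQUEST} \ /(index|home)\.html[?\s]
# Send both duplicate home-page URLs to the root with a permanent redirect
RewriteRule ^(index|home)\.html$ / [R=301,L]
```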
Make those pages co-citation worthy and worth linking to from local directories and other local resources, just like you would do for a home page if you were trying to get that to rank for local terms. Be sure to use your geo-terms on those pages and use those terms in your internal linking.
Sounds like that will likely get what traffic you have currently going to www over to the new domain but I don't think it's going to help you recover in any way. You're either going to have to work hard at eliminating the links to the www site and/or work hard to build up authority for the new domain. If you feel reviving the www site is out of the question, this could be the way to go as I haven't heard of anyone speaking of repercussions from 302ing visitors from one domain to another due to Penguin.
Micheal,
You can cut down on your time by closely investigating which pages have links going to them and which pages are bringing in search traffic and focusing your redirects primarily on those pages. If current pages are not linked to or bringing in search traffic, there's not a strong reason to have to redirect those.
Jodi, the report you're looking for in SEOmoz is found under Campaign Overview | Manage Keywords | Find New Keywords. This shows you which keywords are currently bringing you traffic and how many visits you're getting from them. Be sure to connect your Google Analytics to your SEOmoz campaign to get this info.
Words only provide relevance to the search term, they don't provide authority or trust. Generally, removing errors ensures that relevance is recognized by Google and that the authority and trust from external sources can be spread correctly to pages on the domain.
It may be time to regroup and develop a greater understanding of authority and trust, as applied to links and as applied to social networking and engagement.
Thinking in terms of this specific title, I'd remove:
De, national, collection
Another thing to think about: if a word in your title isn't important enough to use in the item's description in the body of the page, it's nowhere near important enough to use in your title.
Wouldn't a noindex meta tag on each page take care of it?
Unless you have an established relationship with the competitor, you're communicating with them directly, you're the one discussing the guest post with them, and there's mutual understanding of what value you each should get out of this guest post arrangement, I wouldn't do it. In the end, it's not going to be worth anyone's time.
Don't put the link on all the pages. Put a link on the home page and on a few other popular pages on the WordPress site and leave it at that.
Would it work to have two home pages--each with primary navigation to the current season's site and footer navigation to the other season's site--and then switch those home pages out as the seasons change?
But, if the content is the same between the two domains, why have two domains in the first place?
Sorry, it's ahrefs.com.
Are you sure that that particular page has back links? If you are, what is the quality of the pages those links are on?
Just like your browser knows when all the components of a page have finished loading, it's my understanding that the bot knows too. How long is it taking, and are there any other widgets you could use in its place? Has anyone else commented on the slowness of that widget?
Have you tried majesticseo.com or ahrefs.com to see if they can provide back link info for that page?
If you're thinking in those terms, you're likely to be going down the wrong road. If it smells like link building at this early stage of the game, it's going to smell more like link building in the future.
But to answer your question: think of how someone reading a well thought-out article you've published--say, about how a specific feature of your software helped a small company become more successful, or about how a client was able to significantly reduce the turnaround time of replacing a valuable employee with your software--might share that information with a friend or colleague, and then calculate from there what percentage of the time it might simply be "HR Software".
James,
Taking into account that the numbers you get in the tool are far from exact, and that the traffic estimates were extrapolated from AOL click-through data released in 2006, your numbers could certainly be considered right on the mark. It's very surprising to see the difference in traffic between just a couple of places in the search results, isn't it?
WM, here's the most recent report on ranking factors put out by SEOmoz: http://www.seomoz.org/article/search-ranking-factors. Generally speaking, when you're just getting started, you're better off applying what you learn to your own site than spending time on how it applies to your competitors. Keep working--you'll get there.
You won't be penalized for that, for that alone. There's no reason why that would seem inappropriate if someone had their images turned off in their browser and those words were visible in place of your logo.
Yeah, that's not good. Do you have your redirect set up improperly in your .htaccess file? Take a look at this info to figure out what you may need to adjust.
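In case it's the common www/non-www mixup, here's a hedged sketch of a loop-free .htaccess redirect--example.com is a placeholder for your own domain:

```apache
RewriteEngine On
# 301 every request on the www host over to the bare domain
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```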
Hard to say Karl. How strongly the duplicate content is impacting the canonical content can differ greatly by example. Make sure you document the strength and rankings of your canonical content and once your changes have percolated through the algorithm, you'll be able to see what impact they may have had. If this site is being SEO'd for the first time, be sure you're also looking for other issues besides just duplicate content.
In addition to what AndieF said, there's the possibility it could be seen like this, too: why should Google give you any value for a backlink from a site whose credibility you're not willing to attest to (via a dofollow link)?
If your client wants to maximize their SEO dollar, they need to dig deep and divine what really, really makes them unique in their market in order that they're able to present a worthwhile value proposition to visitors (prospective customers). For many, if not most, small companies that's not an easy task in itself. If they're really working hard to network and provide sharable content based on their brand, their value proposition, and their prospects' needs, they're not going to have the time or resources to do it for multiple sites--and they won't need to.
If you let them think short term and only about content based on keywords, your value to them is going to be minimized, the value they get out of their domain(s) is going to be minimized, and a year from now, neither of you is going to be able to look back with pride at the SEO work that was done on this project. [end proselytizing]
As an aside, by the time you get to renewal time for next year's listing in Yahoo, you may want to have already looked at other ways to spend that 300 bucks.
I haven't heard of anything with all those features but screaming frog will crawl the site and give you the links to and from a page. It might be a start for you.
The goal is to identify the audience, learn who their influencers are, create content that meets their needs as prospective customers, and which promotes sharing. You have to be tuned in to the audience and you have to be tuned in to the business objectives and you have to be tuned in to the product to get the most out of content.
It should be "Journeyperson" if we're going down that road. Otherwise, SEOmoz would be in the position of having to assign a gender bias based only on a name and a picture. I don't think that's enough these days.
Remember, Madlena, everyone says "It's all about good content," but what it's really all about is the social engagement that happens when good content reaches the right audience. If you're not spending as much time working to get your content in front of the site's specific audience as you are writing the articles, you'll not achieve the results you're looking for out of that content.
You may want to give Screaming Frog or Xenu's Link Sleuth a try. They should be able to crawl your site and give you the info you need.
Hanzo, what is it that you need to do that the crawl test isn't helping you with?
It might likely play out sorta the way links do--deep content/less linked-to content/old content equals less-crawled content. Maybe you could have 1,000,000 authors on a site but the ones that derive the most value from it are the most recent, the ones on the most active pages, the ones on the most linked-to pages.