OK, for the first one, or both, you can easily get away with doing:
RedirectMatch /blogs/fleet-management-and-tracking(.*)the-easy-way/ http://www.newplace.com/here-it-is/
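One thing worth noting about the line above: without a status argument, RedirectMatch issues a temporary (302) redirect. For SEO you almost always want a permanent (301), which is one extra word:

```apache
# In .htaccess -- "permanent" makes this a 301 so engines transfer
# the old page's value; without it RedirectMatch defaults to a 302.
RedirectMatch permanent /blogs/fleet-management-and-tracking(.*)the-easy-way/ http://www.newplace.com/here-it-is/
```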
Can you post examples of the URLs, and what you have in your .htaccess? That would make it much easier to diagnose.
I think I understand what you're doing about hiding part of the text. I also agree with EGOL: if that is what people are looking for, just give it to them.
Will it hurt your SEO efforts? I would say no, it will not. You are not intentionally hiding text for the sake of cheating the system; you are doing it to help the user experience, something I have heard Google employees speak of many times. What's good for the user is good for SEO.
I would test this in one of the many ways possible (heatmaps, click tracking, etc.) to see what percentage of people are revealing the hidden text. If it's more than, say, 40%, I would show more, simply because the other 60% might be interested but not willing to go to the extra effort, and may be going somewhere else.
Hello Sophy.
A few comments.
In regards to the canonical: even if visitors come in with an affiliate link, this will reduce or eliminate your duplicate page issue. If there is any way for a page to have duplicate content, use a canonical; if a page is generated dynamically, figure out the best URL for it and create a canonical for it.
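For anyone unfamiliar with the tag, a canonical is a single line in the page's head; the URLs here are placeholders:

```html
<!-- In the <head> of every variant of the page, e.g. one reached
     via an affiliate link like /product?aff=123 -->
<link rel="canonical" href="http://www.example.com/product" />
```

Every variant points at the one URL you want in the index, so the affiliate-tagged versions stop competing with it.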
If you're having a high bounce rate, I'd look at the site and see what people are doing.
Consider using heatmaps, or one of the survey sites that will send 30+ people to your site and fill out a form stating what they liked, didn't like, couldn't figure out, etc.
As a web developer, I would have (and have) coded a blog system myself. However, about a year ago I started working with WordPress and find that it is much easier to use, and I don't have to code, or design, 98% of the things I want it to do.
I can also install and configure WordPress in under 5 minutes; I certainly can't code it that fast. And all the little nuances, like text formatting, special characters, security, etc., are already built in.
Save yourself the coding time and use WordPress, then use all the spare time creating a great blog people want to visit.
It sounds like you are referring to this mozinar:
http://www.seomoz.org/webinars/conversion-optimization-for-local-businesses
I just went through it pretty quickly and noticed that:
You don't have an XML sitemap, usually named sitemap.xml.
You are using JavaScript for your drop-down menu items.
The two factors above will keep engines from being able to crawl your site fully.
What I would do is just create a sitemap in the footer so that there is a plain HTML link to each page that is not JavaScript-based.
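A footer sitemap can be as simple as this (the page names are made up for illustration):

```html
<!-- Footer sitemap: plain <a> links the crawler can follow,
     unlike the JavaScript menu items -->
<div id="footer-sitemap">
  <a href="/">Home</a>
  <a href="/services/">Services</a>
  <a href="/about/">About</a>
  <a href="/contact/">Contact</a>
</div>
```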
Personally, I would do a clean start on the new domain and 301 or canonical the current pages. If you have a new version, why even bother seeing if there are problems with the old one? One thing I have noticed when setting up a new domain is that the first time Google indexes a page it 'seems' to keep that page longer, so the older page would remain in the index even when you have a new, revised page. If you just go live with the new version, block the search engines with a robots.txt, check it, test it, check it again, then remove the block and forward, you could be up and running faster, in my own experience.
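The temporary block I'm describing is just a two-line robots.txt at the site root, removed once testing is done:

```
# robots.txt -- block all crawlers while the new site is tested
User-agent: *
Disallow: /
```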
There are a handful of other ways, one of them being software packages. I know there are others out there, but the only one I can comment on is 'Link Assistant'. It is part of the SEO Power Suite, written in Java, can find possible link partners, etc., and is fairly easy to use. I don't want to sound like I'm selling it because I really don't use it; I just have it installed and have played with it a few times.
Aside from that, you could write, or have someone else write, a database application so the data would be stored somewhere other than your desktop, and automate some of the tasks, like link checking, PR & PA checking, etc.
If it doesn't work like you need, let me know; you might need to do it in scripting.
Wow, nice follow up Daniel. I posted it early in the morning while doing my rounds and should have given a more detailed post like you just did. Thank you for expanding on it for everyone.
I agree it is labor intensive, but as you mentioned it can pay off.
EGOL listed a few items that I tend to overlook because I am not doing as much content distribution as I should.
Redirect doesn't allow for (.*).
What you're looking for is either a RewriteRule to show the other page (which it doesn't sound like you're looking for):
RewriteRule /category/diamond-pendants/nstart/1/start/(.*) /category/pendants/nstart/1/start/$1 [QSA,L]
Or the rarely used RedirectMatch:
RedirectMatch /category/diamond-pendants/nstart/1/start/(.*) http://www.provada.com/category/pendants/nstart/1/start/$1
Yeah, the backlinks are a little excessive; I wonder who they are really competing against. Without spending time on it, I'm betting there is another competitor that is also running a massive backlinking campaign.
As Cyle stated, look at the different tools SEOmoz offers to do a full analysis of it if you plan to compete.
Here is one possible way to handle the situation:
http://www.seomoz.org/blog/tracking-traffic-from-google-places-in-google-analytics
Business directories work wonders here.
Because of the 5,937 incoming links, compared to the 6 that the next-highest has.
In that case I would be curious what the answer is as well, because I have the same situation with pages named "Example Title | Page 1 | Company Name", "Example Title | Page 2 | Company Name". I'm assuming the algorithm looks at the percentage difference to decide whether a title is different, not the actual difference.
The above would drop over 500 Duplicate Page Title errors if Page 1 & 2 were considered actually different.
Did you look at the pages themselves that they are showing the link to? I had the same problem when I started, and it was because of dynamic linking. RogerBot was able to crawl through pages that were auto-generated; for example, a previous-page link would let you go back 6 years by stepping through each month. I figured no human would follow that link when it said "There is no data for this or any previous month", but a bot would just keep going and see duplicate content pages.
Note that there are "Duplicate Content" and "Duplicate Title" items, what you explain with the company title would be a duplicate title issue.
Try this: http://www.stevenferrino.com/scripts/google-api-search.php
And would someone please tell me how to add links that open, like Barry's Soap example?
Moved over to: http://code.google.com/apis/customsearch/v1/overview.html
If you can't find an example that helps let me know.
Not a function, and I didn't mean to state that you had to use PhpED to see the problem, just pointing out how I did find it.
With any page loaded, on the right hand side you will see a handful of tabs, next to that is a pinkish bar that will have a Red circle, indicating that there is an issue.
OK, you haven't stated how big the site is. As I already stated, Google will not show you everything it has in its index; Yahoo will give 1000; SEOmoz might have additional ones. Also check your Google Webmaster Tools (if you have that set up).
The second thing to keep in mind is incoming links from other places. It sounds like there was no housekeeping before the restructure, so I would keep an eye on the web server logs, analytics, etc. and add 301s for anything else that comes in that doesn't exist.
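Those housekeeping 301s can go straight into .htaccess as you spot them in the logs (the paths here are invented examples):

```apache
# One-off 301s for old URLs still showing up in the logs
Redirect 301 /old-page.html http://www.example.com/new-page/
# Pattern-based cleanup for a whole retired section
RedirectMatch permanent ^/old-section/(.*)$ http://www.example.com/new-section/$1
```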
It's not just about Google; it's also about the user experience. Landing on a non-existent page can give the impression that whatever they are looking for is no longer mentioned on your website, which potentially loses customers.
Well here it is for those paying attention to this thread:
http://www.stevenferrino.com/scripts/redirect-parser.php
Not sure if posting a link will work; they tend not to for me. You can always copy and paste.
I'm considering the YOUMoz addition and already sent you an email, Jennifer.
I just copied the HTML and threw it into PhpED and noticed right off the bat that you are missing a quotation mark for the gb_styles.css.
I agree. I do it for the user, not for Google, which I think is what Mr. Cutts says to do all the time.
There might be an unclosed tag somewhere that is causing the program to choke. Can you provide a link?
If they are staying on your site, it is not a bounce. In essence they are going from one page of your site to another, the fact that it is being referred from inside an iframe shouldn't matter.
Completely agree with Damien. If they don't exist but Webmaster Tools is showing them, 301 them; there has to be a link somewhere on the internet that is making it think they exist. I would also go through the server logs to see if there is any additional information, like a referring page, for the non-existent ones.
Another reason for doing it is the user experience: they no longer have to scroll back up to the top to change pages. The one problem I ran into was having too many links on one page by doing this; I fixed it by using an iframe instead. It looks the same and still provides the linking to the user that I wanted.
Looking over this really quickly and running the two sites listed, yours and Wholesale Flights, through Open Site Explorer, I see that yours mostly has img alt links titled "Lets fly cheaper"; some of your other high-PA links are titled "Ramon Van Meer" and "Cheap International Flights", none of which are your target keyword mentioned above.
Your Wholesale Flight competitor has links titled "Discount business class tickets", your desired keyword.
You would be better off with text, not img alt links. See if you can get those changed out.
You're welcome. If that fully answered your question, please mark it as answered.
As Google will not show you everything, even using the site command, I use Yahoo SiteExplorer:
http://siteexplorer.search.yahoo.com/search?p=seomoz.org&bwm=i&bwmo=d&bwmf=s
and wrote a PHP script to take the TSV it exports and create a line for each page. I could probably make that available for use on one of my sites.
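My script is PHP, but the idea is trivial in any language. Here's a rough sketch of the same thing in Python, assuming the page URL is in the first column of the TSV export (adjust the index if Yahoo's layout differs):

```python
# Sketch: turn a Site Explorer TSV export into one line per page.
# Assumes the page URL is the first tab-separated column.
def tsv_to_lines(tsv_text):
    lines = []
    for row in tsv_text.strip().splitlines():
        columns = row.split("\t")
        url = columns[0].strip()
        if url and url.lower() != "url":  # skip a header row if present
            lines.append(url)
    return lines

sample = "URL\tTitle\nhttp://example.com/a\tPage A\nhttp://example.com/b\tPage B"
print(tsv_to_lines(sample))
# prints ['http://example.com/a', 'http://example.com/b']
```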
I had the same problem on http://www.tokenrock.com because I was doing a lot of URL rewriting (it's a CMS system I wrote, but the same issue applies). I went from 7,000+ errors according to SEOmoz, and I'm down to 700. Here are a few things I did:
Use canonicals on everything you possibly can.
Redirect 301 the items in the SERPS that are identical.
I'm not familiar with Magento enough to help you work through that side of it.
Having a link like: domainname/leather-chairs-244-16-price-1.html would work much better.
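A friendly URL like that maps back to the real dynamic page with one RewriteRule; the script name and parameter names (cat, id, page) here are invented for illustration:

```apache
# Map the friendly URL back to the real query-string page.
RewriteEngine On
RewriteRule ^leather-chairs-([0-9]+)-([0-9]+)-price-([0-9]+)\.html$ /product.php?cat=$1&id=$2&page=$3 [QSA,L]
```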
The ones you have listed are there because somehow, somewhere, you (the site) have a link to each of them.
Unfortunately, some CMSs are written by developers who don't fully understand SEO and why the '?' in URLs is a bad thing.
I went through both sites and noticed that you don't have canonicals on any of your pages. I also don't see any XML sitemaps.
If you can only show the businesses that are in that particular area, and remove the ones that are not I would probably go for that, unless you are doing Local Search as well.
I would be curious to see the example of canonical issue you mentioned.
Are these Joomla sites? I'm just asking because there are no description tags on anything except the home page.
I agree with EGOL. Are the pages being generated? How many search queries will end with the same information?
It looks like you're doing URL rewriting now, since the URL:
http://www.businessbroker.net/State/California-Businesses_For_Sale_test.aspx
shows the same information. Start adding canonicals to all pages so that the above link will not be seen as duplicate content.
Without actually seeing the site, I would say that if you have product pages that are identical, canonical them to one main place. If they are listed in the SERPs and you are removing them, 301 them. Even having a few different words can still show up as duplicate. The last CMS option might work, but again, without seeing the site it is hard to judge.
Since I do a lot of URL rewriting I ran into a similar issue; I did 301s and canonicals to the pages I wanted to show, and things are much better.
The few things I would point out, which aren't so much WordPress issues, but are how this WordPress installation is configured, are:
Cheers.
Aside from checking out the Yoast plugin, make sure your sitemap is up to date. There is a reason the bots are going to those pages, the question is why.