I don't foresee this being a problem.
At all.
And I know what you mean about being scared of your own shadow
Since sometime around 2009, Google has replaced your URL in the SERP with a breadcrumb if it feels that adds a better user experience.
In itself this isn't an issue.
That it takes you to the wrong page is odd, but I've noticed it happens to a couple of your other pages as well. At a guess, I'd say that if you complete the breadcrumb so that the last section points to the URL that is being displayed (see this page for more detail), it will solve your problem.
Hope that helps.
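As a sketch of what I mean, here's a hypothetical completed breadcrumb trail (example.com standing in for your domain). The key point is that the final crumb links to the URL Google is displaying:

```html
<!-- Hypothetical breadcrumb for a page at /products/widgets/blue-widget -->
<a href="http://www.example.com/">Home</a> ›
<a href="http://www.example.com/products/">Products</a> ›
<a href="http://www.example.com/products/widgets/">Widgets</a> ›
<a href="http://www.example.com/products/widgets/blue-widget">Blue Widget</a>
```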
Yeah, that's a big steaming pile of lies.
If your site is particularly slow then expect to have your rankings reduced; however, 1.5 seconds is a very fast page load time.
If anyone has evidence to the contrary then I'd love to see it.
Have you recently made any changes to an affiliate program or banner ads campaign?
An extra 100,000 links is actually pretty easy to come by - it could be one site with dynamically generated pages linking to you. I wouldn't be too worried about this but you may want to look into it for your own peace of mind.
With regard to your linking root domains dropping - do you have any records that show what type of sites have been removed? It could be that you had a lot of directory links which Google has removed from the index. If this drop in LRDs hasn't already led to a drop in rankings then, again, I wouldn't worry about it but you may want to look into it further for your own peace of mind.
Google doesn't really look at the meta description anymore so I wouldn't stress about it.
Although personally I'd not include the domain name in the meta description, as you're probably mentioning the brand name in the title tag and it's going to appear directly above the description in tiny green writing anyway. So why not fill your meta description with things designed to encourage a high click-through? Things like "Free shipping on orders over $50. Widgets available in 30 colours including popular colour 1, popular colour 2 and popular colour 3."
I'd say do your best to get the links removed/fixed at the other end but other than that just leave it.
So long as you're returning a proper 404 you're doing the right thing (on paper).
Bloggers and influencers - yep. You want these people to share your content if at all possible. I've found that some bloggers won't link to your content if it's on a commercial site however a lot of them are willing to Tweet it if it is great content.
Have you considered paid exposure? Facebook ads are pretty cheap and a small Google PPC budget could be used to target interested searchers.
If it's a city guide have you considered getting in touch with whichever municipal body is in charge of tourism in that city? If it really is the best content out there then you might be able to convince them to share it.
Is it going to take longer to clean the site up than to start a new one?
If so, I'd bite the bullet and start a new one.
You could choose a single version of the product page to act as the main version and implement rel=canonical tags.
The downside to this is that (in theory) only that version of the page will rank.
If it's really important to have the different page variants rank for different search terms then you'll have to start rewriting the product copy.
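To sketch the canonical approach (example.com and the URLs here are hypothetical stand-ins for your product pages): each variant page carries a canonical tag in its `<head>` pointing at the main version.

```html
<!-- In the <head> of a variant page, e.g. /widget?colour=blue -->
<!-- http://www.example.com/widget is a hypothetical URL standing in
     for whichever version you choose as the main product page -->
<link rel="canonical" href="http://www.example.com/widget" />
```

Search engines should then consolidate the variants onto the main version, which is exactly why only that version tends to rank.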
I'm assuming that the term you're targeting is "fotograf i odense".
I'd recommend running your homepage through the SEOMoz on page optimization tool.
You're showing up within the top 100 which is a good start. If you can build some links to the site you should quickly rise up to the top 50.
Although I should warn you that until you get to page 1 (and maybe page 2) you won't really see a large increase in organic traffic.
That's a good indication that they're not worth buying. If you're going to be buying links then you might as well pay for a guest posting service to get some good links. I don't use one myself but I know people who do and it seems to work.
No.
However if all or most of your links come from resources/links pages then I would hazard a guess that you will leave behind another footprint for which you may get penalised.
As a rule of thumb you should try and be on well curated resource pages. So if the webmaster will link out to anyone for a small fee then they're probably not worth getting a link from. If on the other hand it's pretty hard to get featured on their resources list then there's more of a chance that the link will actually mean something.
Hi Karen,
8 months is a long time when you're in the throes of Google. Your problem might not revolve entirely around your navigation.
However in answer to your question - are they the right subpages?
Personally I'd do a card sorting exercise to determine which sub categories your users are expecting to see. You can normally back this data up with some keyword research to see which sub category type terms are generating a lot of search volumes.
For example if you've got a page called "English Literature 1800-1850" and more people are searching for "Romanticism period in English literature" then maybe you've got the wrong subpage.
It is a bit of work, especially joining up the keyword research with the card sorting but at least you'll know that you've got the right sub categories.
Hope that helps at least a little.
It is possible to redirect users based on UA (user agent). However, Google now recommends using responsive design, so if possible you'll (probably) want to do that instead.
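For illustration, a minimal responsive setup serves one URL to all devices and adapts the layout with CSS. The class name and breakpoint here are hypothetical:

```html
<!-- One URL for all devices; the viewport meta tag tells mobile
     browsers not to zoom out to a desktop width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Hypothetical breakpoint: hide the sidebar on narrow screens */
  @media (max-width: 600px) {
    .sidebar { display: none; }
  }
</style>
```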
I'd work at him from the angle that it is possible to be both professional and personable. If you pull out editorial from a high quality newspaper or even cite some influential blogs in a space he follows you should be able to put that point to him pretty convincingly.
Alternatively I think Will Critchlow recommends hiring people on Amazon Turk to answer a Panda-style questionnaire. You could compare the current text with that written in the first person and see which one real people prefer. Or pitch the two versions of the text head to head in an A/B test and see which has higher user engagement statistics.
If you're only doing the blog to generate links/shares etc. and he insists on using his own writing style then it sounds like it would just be a waste of his time.
One sitewide link probably isn't going to hurt you too much. On the other hand it's probably not going to help you too much either.
It is worth pointing out that removing the nofollow from the banner would be a violation of Google's guidelines regarding paying for links to manipulate PageRank.
If it's just the title tag on the pages then you might as well try swapping out one or two and testing your changes.
If those titles affect the URL of the page or of any other pages then you'll need to think carefully before making changes. Changing a URL is the same as creating a totally new page, so obviously this is a bad thing.
If it is literally just a title tag change...I'd try it, you can always roll it back after a week.
1. Changing the information architecture could cause issues - I'm assuming that you're not doing it lightly and that you're making changes for the best. So long as you're aware of what you're doing then you should be fine.
2. Tons of new content is good- if it's good content.
3 & 4. Sounds exciting!
Generally speaking most of the data SEOMoz has about your campaigns is stuff that a determined competitor could find out anyway (and probably for cheaper than bribing Rand).
Plus if Matt Cutts wanted to get data on you he wouldn't have to go through a 3rd party - he'd just send around a Street View car.
So is example.com ranking for terms that example2.com used to rank for? If you check your analytics you should be able to see if those keywords are driving traffic.
I would doubt that it's solely the domain name causing this.
How far have the rankings slipped for example2.com? Have all rankings dropped by the same amount? Are head terms affected more than long-tail terms? Does it still rank for its brand name?
Structured data (microdata etc.) is markup that specifies what a section of a web page is about. So, for example, you can mark up a review so that Google can identify the star ratings and knows that the product got a 4/5.
This type of data is part of the semantic web - a WWW where bots like Google stop seeing sites and start seeing entities. For example, in the current web a search engine might see links to a site with the anchor text "shoes" and interpret that to mean the site is relevant for "shoe" based queries, but if those links went to that brand's Facebook page the connection with shoes would be a lot weaker. In a semantic web the search engine would be able to tell very easily that a brand's Facebook and Twitter pages are part of the same organisation, and that links to either count the same as links to the main site.
That's a pretty crude example (and to some extent search engines are already doing this) but you can see how it can affect SEO. That's not to mention the benefits that you can be getting right now from having rich snippets (Google them, they're cool).
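To make the review example concrete, here's a rough sketch using schema.org microdata (the review text, rating and author are all made up):

```html
<!-- Hypothetical product review marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="name">Great widget</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rated <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
  by <span itemprop="author">Jane Doe</span>
</div>
```

The `itemprop` attributes are what let a search engine pull out the 4/5 rating rather than just seeing a blob of text.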
It depends on the purpose of the video. I'd put informational videos onto Youtube just because of the traffic floating about on the Youtube platform.
I'm not an expert on video SEO but if you read this then you will be.
If you export to .CSV and put it into Excel the referring page (referrer) for the 404 error can be seen in column AM.
That should be enough to fix it although if it's a dynamic site it can end up being more complex than that.
I've got a couple of recommendations.
First off, you can demote a sitelink in Google Webmaster Tools. You can find the option under Configuration > Sitelinks. Although Google is obviously assuming that the pages with current sitelinks are important somehow, so it might be a good time to recheck your site architecture.
Secondly, you should probably remove the nofollow attribute from internal links on your homepage. They won't be having a beneficial effect and could negatively impact how search engines view certain pages on your site.
Hope that helps.
Hi Dana,
I wrote the following after assuming, for no reason at all, that you didn't know much about SEO. However, having looked at your profile I realized that I was wrong and that my tone is probably a little patronizing. That being said, it's 1am over here and I really don't want to rewrite it, so please accept my apologies.
If I had to guess (and it is a guess as I'm not technical) I would say it was some badly formed links.
You know how some of your error pages have an Origin parameter (like this one) that says where the page was generated? Well, these URLs follow the same format as the error pages that you're finding. It looks like rather than using an absolute link (like http://www.ccisolutions.com/page) the onclick action is actually generating a relative link (so just /page).
When you use a relative link, the browser adds the partial URL (/page) onto the end of your domain to give a full URL (http://www.ccisolutions.com + /page = http://www.ccisolutions.com/page). It looks like you're using relative links as if they were absolute ones, which is why you have "www.ccisolutions" in each URL twice.
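Here's a hypothetical illustration of the failure mode, using example.com in place of your domain:

```html
<!-- Correct: an absolute URL, scheme included -->
<a href="http://www.example.com/page">works as expected</a>

<!-- Broken: the "http://" prefix is missing, so the browser treats the
     whole string as a relative path and resolves it against the current
     page, producing http://www.example.com/www.example.com/page -->
<a onclick="window.location='www.example.com/page'">doubles the domain</a>
```

If your IAFDispatcher (or whatever builds those onclick URLs) is concatenating the domain without the scheme, you'd get exactly this doubled-domain pattern.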
If I had to blame anything it would be whatever is powering your IAFDispatcher however as I haven't been able to replicate your problem I couldn't be certain. If you can track how these URLs were generated by looking at the preceding pages that are sending traffic/bots to them then you should be able to narrow it down to which links are broken.
Write unique content for your top 100 pages. Wait a month to see the results and then decide whether it will take too much time to write all those pages.
It's counting ALL the links.
Reducing the number of links on the page is best practice but I don't think there's a penalty for having too many links.
If you're struggling to rank the issue is probably elsewhere.
Unless there are other factors limiting you I would probably bring all of the variables onto a single page and treat size as a variable similar to the way you treat colours.
You don't currently have separate pages for Burgundy and Caribbean Blue covers, so why not do the same for size? You'd want to adjust the copy to indicate that the product is available in different sizes, and obviously this would require reworking the page, but if you're going to create a parent page anyway then you might as well.
Hope that helps.
200 points in a month gets you a free month of the Pro service.
500 points total (no time limit) gives you access to the Q+A section without having Pro access.
At 200 points you get a followed link in your profile.
At 1000 points you get a t-shirt.
At 2000 points you get a trophy.
#1 for everything is always a start.
Increasing long tail traffic is always an option (assuming you don't already have plenty of longtail traffic).
Depending on the industry it might be an option to increase traffic from referring sites or social media not to mention affiliate marketing.
Twitter. You only need to find one decent copywriter as they tend to hang around in groups.
And depending on subject matter you can also try hiring bloggers.
As part of a balanced campaign I don't see a problem with getting links from dog blogs.
In your shoes I'd make sure I had other more relevant links as well but if the dog blogs are easy enough to get and you just want a few links then go for it.
I use BuzzStream to track link building in the way you mention but there are other services out there. Even a simple Excel spreadsheet can do the trick.
It's more of a copy issue as you mentioned. So long as the content on the pages is unique you shouldn't have a problem.
They're two different URLs.
If the URL changes but the content stays the same then it's classed as duplicate content.
I feel your pain though - the number of duplicate pages I've ended up with just because copywriters like to capitalize their words...
You'd be hard pressed to find any automated directory submission that's white hat. In fact auto directory submissions are pretty much the definition of crap hat SEO.
This soon after Panda you'd have to be particularly brave to try en masse directory submission. I'm not saying that it won't work, or that it will have a negative impact on your site, but I certainly wouldn't try it.
In answer to question 1 - yeah, it happens all the time. They're spammy sites that are scraping the SERPs to get some relevant content. There's nothing you can do about them, and personally I wouldn't worry about them too much.