Posts made by Mulith
-
RE: Did the SERP Overlay CSV export disappear?
Seems like that's the end of that then. Could it be because of the same pressure from Google applied to Raven regarding their Google SERPs feature?
-
RE: Multi country targeting for listing site, ccTLD, sub domain or .com/folder?
Yes Tom, thanks, that has helped.
I wonder whether a mixed strategy might be more effective by:
-
Using ccTLDs for competitive markets
-
Using subdomains or folders for non-competitive markets that can be swayed by DA
When a market starts to increase in competitiveness, a shift to a ccTLD can then be made for greater targeting.
Would this make sense from an SEO point of view?
-
Multi country targeting for listing site, ccTLD, sub domain or .com/folder?
Hi
I know this has been covered in a few questions, but I've seen nothing recent that takes into account changes Google may have applied.
We would like to target multiple English-speaking countries with a new project and I'm a little unsure as to whether ccTLDs, subdomains or subfolders are the best way to publish country-specific information.
Can anyone shed some light on this?
-
After Penguin 2.0, a 20-25% drop sitewide but no Google unnatural links message. What could be causing it?
Hi,
Since Penguin 2.0 we've taken a 20-25% knock but not received an unnatural link message from Google. After sending a bunch of removal requests, I decided to submit a disavow file anyway two weeks ago and tried to make sure I rooted out some links that were built way back when our site started and link building best practice was a bit shadier.
Analysis of our backlink profile points to about 40-50% of links coming from general directories, so I'm wondering if perhaps their weight has been adjusted and this is why the drop occurred. Having said that, we have some high quality links from government sources and highly trusted sites, so the profile is not too spammy.
Can anyone shed some light or offer suggestions?
Thanx
-
RE: Website pages missing from seomoz crawl
Is there any chance you could email me your sitemap as produced by WordPress? info[at]pathfindermedia[dot]co[dot]uk - I'll take a closer look at what's being excluded.
-
RE: Website pages missing from seomoz crawl
Another possibility could be your robots.txt file: is it blocking some directories?
-
RE: How to handle long dynamic meta tags?
Looks like the only downside is Google trying to algorithmically determine a more suitable title, which isn't that bad as these are not really high-priority ranking pages.
Thanks for the help.
-
RE: Meta keywords and meta news keywords
Hi Irina,
-
Meta keywords are unlikely to cause a penalty and are simply ignored. So you can add them, but it's probably a waste of time.
-
news_keywords only works if you are an accepted news source in Google News, and it's likely that this will only affect news results. The tag was created to give journalists more freedom with their headlines so they don't have to keyword-stuff the headline. It's meant to lead to a more natural-looking article while still giving an option to indicate the theme with some keywords.
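For reference, the tag itself is just one line in the page head. A rough sketch of generating it (the keyword values here are made up purely for illustration):

```typescript
// Rough sketch: render a news_keywords meta tag for an article page.
// The keyword values below are made up purely for illustration.
function newsKeywordsTag(keywords: string[]): string {
  return `<meta name="news_keywords" content="${keywords.join(", ")}">`;
}

console.log(newsKeywordsTag(["business sales", "small business", "UK economy"]));
// -> <meta name="news_keywords" content="business sales, small business, UK economy">
```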
Hope that helps.
Mulith
-
RE: How to handle long dynamic meta tags?
Yeah, I suppose. But what happens when this is on a large scale? Would that have a punitive effect?
-
RE: Redirection Of Mobile Traffic
Might I suggest using screen size or user agent to offer the user the option, with something like:
"We've detected that you may be using a mobile device. Would you like to go to our mobile friendly site instead?"
Giving folks a choice generally results in a better experience. There's nothing more annoying than not having the choice, especially considering the wide array of screen sizes now in use. The live site may be more user friendly on some devices, and you'll also get some valuable feedback on how popular your mobile site is.
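Something along these lines on the client side would do it; just a rough sketch, with the mobile URL and breakpoint as placeholders:

```typescript
// Rough client-side sketch: offer (rather than force) the mobile site.
// The mobile URL and breakpoint below are placeholders, not real values.
const MOBILE_SITE_URL = "https://m.example.com";

function looksLikeMobile(): boolean {
  const smallScreen = window.matchMedia("(max-width: 480px)").matches;
  const mobileAgent = /Mobi|Android|iPhone/i.test(navigator.userAgent);
  return smallScreen || mobileAgent;
}

if (looksLikeMobile()) {
  // Ask instead of redirecting automatically, so the user keeps the choice.
  const wantsMobile = window.confirm(
    "We've detected that you may be using a mobile device. " +
      "Would you like to go to our mobile friendly site instead?"
  );
  if (wantsMobile) {
    window.location.href = MOBILE_SITE_URL;
  }
}
```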
-
RE: Are Blog Comments now useless?
Yes, I'm afraid they are useless when it comes to incorporating them into your SEO strategy.
Better to take the energy you would have expended tracking down dofollow comment opportunities and spend it creating rich content like posts, guides, infographics (high quality only) and useful widgets.
Yes, it's harder now than pre-Panda, but that's good in a way because the barrier to entry is that little bit higher, and that gives you an advantage over new starters who want to cut corners and steal your customers.
Good luck, you're going to rock!
-
RE: Website pages missing from seomoz crawl
Hi Ovieira,
This is not necessarily an indication that pages are hidden from crawlers; the missing pages could simply be low priority for the moment, or could have been created after the initial crawl took place.
The best way to check is to run a crawler like http://www.xml-sitemaps.com/ and that will give you a better idea. If the generated sitemap contains your full complement of pages, then it's probably just a case of waiting until the next Moz crawl.
Mulith
-
How to handle long dynamic meta tags?
Hi All,
I have a site that has upwards of 40,000 pages and I'm redeveloping it, so I really want to get some SEO elements spot on for the new development. How do I go about handling the following:
-
The user creates a title for their advert, which I use as the meta title. The problem is that titles are quite often longer than the accepted lengths. How should I handle this? String manipulation down to the desired size, leave it as is, or is there another solution?
-
The meta description is pulled from a summary they created as part of their profile. Is this the right way to do it?
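Here's roughly what I had in mind for trimming them, just as a sketch; the length limits below are my guesses:

```typescript
// Rough sketch: trim user-supplied text to a length limit at a word boundary,
// for use as a <title> or meta description. The limits here are guesses.
function trimForMeta(text: string, maxLength: number): string {
  const cleaned = text.trim().replace(/\s+/g, " ");
  if (cleaned.length <= maxLength) return cleaned;
  const cut = cleaned.slice(0, maxLength);
  const lastSpace = cut.lastIndexOf(" ");
  // Cut at the last full word that fits, then add an ellipsis.
  return (lastSpace > 0 ? cut.slice(0, lastSpace) : cut) + "...";
}

// Example with made-up advert data:
const advertTitle = "Well established coffee shop for sale in a busy town centre location with flat above";
const profileSummary = "Family-run business established in 1998, offering a wide range of commercial property services across the UK.";

const metaTitle = trimForMeta(advertTitle, 65);
const metaDescription = trimForMeta(profileSummary, 155);
```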
Any advice would be appreciated.
Ross
-
RE: Competitors Dodgy Link Profile - How to handle?
No, I meant point them to the competitor's site.
-
Competitors Dodgy Link Profile - How to handle?
Hi All,
One of my competitors doesn't seem to mind collecting links with no relevance, and their over-optimisation of anchor text also leaves me scratching my head as to how it's possible for them to still get the rankings they do.
My question is how do I handle them from a competitive point of view? Do I pay for a batch of poor links to topple this competitor? Or do I bide my time?
Thanks
-
RE: Sounds too good to be true?
It's so tricky though, because on one end of the spectrum you have companies promising the world, and on the other end you have companies building 10-15 links/month for the same budget.
Is there a good resource somewhere for finding an SEO business partner? That sort of incentivised approach is starting to appeal to me more than paying a company with nothing to lose.
-
RE: Sounds too good to be true?
Yeah, they did ask about target keywords and promised that they can get our site business4sale.co.uk to No.1 for "sell a business" in Google UK within 6 months.
The current number one has huge DA, so I was also a bit sceptical.
-
Sounds too good to be true?
Hi all,
I'm speaking to an SEO company at the moment about doing some link building for me, but I just can't shake the suspicion that they are a bunch of cowboys.
My budget is £1000/month and they are promising 500-1000 high quality links/month. Common sense dictates that this would surely trigger an unnatural link building pattern, and at £1-2/link it doesn't sound like they are going to be quality.
Is there any scenario where these figures might stack up? Personally I think it's bullshit, but thought I'd check it out before telling him to piss off.
Thanx
-
Should I remove paid links?
I recently added about 20 paid links from directories but have since seen a 10% drop in traffic. I did also delete about 1000 pages of content that had no inbound links and were duplicated on other sites on the web, and replaced that content with new content supplied by a client (still duplicated on other sites on the web). The old URLs are no longer valid or linked to, and the new content is on new URLs.
Assuming the drop in traffic had nothing to do with the content change mentioned above, should I remove the paid links in an attempt to recover? I don't think the old content was bringing in much traffic, as it appeared elsewhere on sites more authoritative than mine.
-
RE: Why did my rankings drop?
A closer look at the Analytics: Yahoo traffic seems to be down by 60%, especially on long tail terms. The shift happened on the 4th of August. In the 10 days prior I had added about 10 paid submissions, but I'd also updated a client's listings, which meant dropping about 1000 pages of older content and replacing it with fresh listings supplied by the client. I wonder!
Overall there is a slight reduction in most keywords, but as you say this could be because competitors are building links too.
-
Why did my rankings drop?
Hi all,
In July I started to re-energise my link building efforts by getting a proper campaign together to build links.
Despite building about 20 new links my traffic has actually fallen. Here's a breakdown of what happened:
1) In late June I noticed my toolbar PageRank up at about PR4 which, despite only being a small part of the algo, was nice to see.
2) In early July I started my link building campaign by getting together a massive list of potential link partners by using Open Site Explorer on my competitors' sites.
3) Because I'm a bit pressed for time I decided to go for the easier links first. I sorted my link list by Domain Authority and started to list on high DA directories used by my competitors. I listed on about 20 of these directories. I also livened up an old links page I'd previously hidden from the SE's because I was planning to do a bit of link exchanging too.
-
A few days after I started building links from these directories I noticed my traffic start to drop off gradually. I also noticed the toolbar PR go down to PR3.
-
I decided to stop at 20 submissions because it looked like this was affecting traffic. I also removed the links page I'd livened up, which produced a temporary improvement in traffic, but it's since gone on to get a bit worse.
Traffic is now down by about 10% on when I started buying submissions to directories. I must add that during this period we have also been taking on new clients which, as a real estate listing site, means we put loads of content on our site for the client. That content is also on the client's website and on other competitors' sites, so there would be lots of content that appears elsewhere on the net.
Not really sure which of the two has caused the problem and not really sure how to progress. Do I remove the links on the directories? Do I wait for this newly added content to bed down so that fresh content can take its place in the results we rank for?
Any help would be appreciated.
-
RE: How to manage duplicate content?
The results pages do have unique meta tags that are dynamically constructed (due to the large number of pages) for on-page SEO, and the results pages are rewritten to static URLs for indexing.
My results pages actually don't do too badly considering, but I'm not sure if the dup content negatively impacts the whole site by way of some sitewide ratio of unique vs. duplicate content or something.
I encourage users to create a unique 200-character summary for the search results, which does help, but with over 10,000 listings I think it may be a challenge to get a copywriter to cover it all. Another downside is taking on new clients who may have a portfolio of hundreds of properties. To get them on board we either create a crawler to retrieve listings from their site or use an XML document generated by them and distributed to our site and those of our competitors.
I'm hoping we won't be punished for the dup content and will simply not rank for it, which is fine, but that's just a guess.
We can write unique content through our copywriter, but would it not be better to create a few paragraphs of unique content for each results page? Granted, it will take a long time to cover all the pages, but the focus would be on improving the ratio of unique content vs. dup content.
-
How to manage duplicate content?
I have a real estate site that contains a large amount of duplicate content. The site contains listings that appear both on my clients' websites and on my competitors' websites (which have better domain authority). It is critical that the content is there, because buyers need to be able to find these listings to make enquiries.
The result is that I have a large number of pages that contain duplicate content in some way, shape or form. My search results pages are really the most important ones because these are the ones targeting my keywords. I can differentiate these to some degree, but the actual listings themselves are duplicate.
What strategies exist to ensure that I'm not suffering as a result of this content?
Should I:
-
Make the duplicate content noindex. Yes, my results pages will have some degree of duplicate content, but each result only displays a 200-character summary of the advert text, so I'm not sure if that counts. Would reducing the amount of visible duplicate content improve my rankings as a whole?
-
Link back to the clients' sites to indicate that they are the original source
Any suggestions?
-
RE: Exact match domain marketing?
I suppose if you keep it away from the main site then you've got some form of insulation against a penalty. Buying links like that surely is quite expensive?
BTW noticed you're from SA too, small world!
-
RE: Exact match domain marketing?
Buying links - no offence intended, but I'd have to be pretty desperate for that to happen, and reciprocal links aren't likely to pass much juice despite the exact match anchor text.
The domain is: http://tinyurl.com/3wohp5p
Using google.co.uk and the SERPs for "sell a business" and "selling a business".
-
RE: Use of + in url good or bad?
Yup, better to use "-".
The + sign, along with the & and % characters, can be problematic for spiders and in some cases browsers too.
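For example, something along these lines when building the URL (a rough sketch):

```typescript
// Rough sketch: build a URL slug using hyphens instead of spaces, stripping
// characters like +, & and % that can confuse spiders (and some browsers).
function slugify(title: string): string {
  return title
    .toLowerCase()
    .replace(/[+&%]/g, " ")        // drop the problematic characters
    .replace(/[^a-z0-9\s-]/g, "")  // keep only letters, digits, spaces and hyphens
    .trim()
    .replace(/\s+/g, "-");         // spaces -> hyphens
}

console.log(slugify("Hotels & Guest Houses For Sale")); // "hotels-guest-houses-for-sale"
```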
-
RE: Exact match domain marketing?
Thanks Janml for the extended comment.
If I'm honest the strategy is short term and allows me to generate the income needed to get link building done for the main site. So although it may seem short term, there is a long term goal in all of this.
I'm already 3rd-4th for the keyword, but the real conversions happen at the top. No.1 is a very high DA site and, if I'm completely honest with myself, despite all the best content in the world it will take me months, if not years, to get there. I'm banking on the 0.25 correlation between exact match domains and No.1 rankings, combined with a quick link building push, to get to the top.
Since employing link builders costs money this seems like the best way to get the dosh together.
Pity link builders don't accept equity based on the traffic or market share they bring in. That would remove the need for doing this the long-winded way.
-
Exact match domain marketing?
Hi All,
I've managed to secure a couple of exact match domains that are closely related to my main site but I'm a bit unsure of how to make best use of them.
My plan is to generate revenue by ranking high for the exact match terms, moving the visitor onto the main site to take their details and payment.
Naturally I want this satellite site to be very similar so that I can keep the brand continuity going and maintain a smooth transition to the main site. My main concerns, however, are:
1) Will I get punished for using the same navigation menu, logo and code as my main site? The navigation menu will link to pages on the main site, and this satellite site will most likely only consist of an index page and about 5-10 individual content pages.
2) I want the branding on this satellite site to be the same as my main site. Will I be punished for the difference in branding vs domain name? I know the SE's are unlikely to pick it up because the logo is an image, but is there a risk of human editing?
3) I want to link from the main site to the satellite site to give it a bit of a boost and to get through a possible sandbox. If the satellite site has numerous links to the main site and the main site is linking to the satellite site, will there be any benefit?
Never tried to create a site like this so a bit nervous about impacting the main site.
Thanks in advance.
-
RE: Is it better to drip feed content?
Thanks guys for your help. I think I'm going to publish it all at once. I was originally in agreement with Bill, but after doing a bit of reading it's probably safe to say that the SE's prioritise good content over content age. I've noticed blogs having slightly inflated PR because of the regular content, but it's unlikely I'll be able to keep up regular posts, and as a result any benefit derived from drip feeding would fall away when I run out of articles. If it doesn't work I'm calling my lawyer on you guys, hehe kidding :)))))
-
Is it better to drip feed content?
Hi All,
I've assembled a collection of 5 closely related articles, each about 700 words, for publishing by linking to them from one of my pages, and would appreciate some advice on the roll-out of these articles.
Background: My site is a listings-based site and a majority of the content is published on my competitors' sites too. This is because advertisers are aiming to spread their adverts wide in the hope of generating more responses. The page I'm targeting ranks 11th, but I would like to link it to some new articles and guides to beef it up a bit. My main focus is to rank better for the page that links to these articles, and as a result I write up an introduction to each article/guide which serves as my unique content.
Question: Is it better to drip feed the new articles onto the site, or would it be best to get as much unique content on as quickly as possible to increase the ratio of unique content vs. external duplicate content on the page that links to these articles?
Thank you in advance.
-
Will swear words present on my pages affect my rankings?
Hi There,
I am in the process of formulating a listing policy for my site and I'm not sure whether I should add something in there for swear words.
My site is an adult site and swear words come with the territory, unfortunately. Will user generated content with swear words affect my ranking?
Thank you
-
RE: How to get subdomains to rank well?
Hi Kayden,
Perhaps it's a better idea to get to the top first and then roll out the subdomains. The main drawback here is loss of traffic when you do the 301 redirect.
Ross
-
How to get subdomains to rank well?
Hi All,
I am setting up a new site and I want to make use of subdomains to target multiple countries as follows:
etc.
Now I know what you're all going to say: why not use folders, as they are more effective? Well, I did think of this but decided against it because I would like to make the best of a low competition industry.
I want to push my competitors as far down in the SE's as possible, and I plan to do this by targeting generic, non-locational search terms with both sites so I can hog the top 4 spots, as follows:
uk.mydomain.com/keyterm-in-the-UK
What steps can I take to ensure rank passes to my subdomains?
Is it better to start the site with folders like www.mydomain.com/us/keyterm and then 301 them to subdomains at a later stage, or should I start with the subdomains?
-
Duplicate content handling.
Hi all,
I have a site that has a great deal of duplicate content because my clients list the same content on a few of my competitors' sites.
You can see an example of the page here:
As you can see the search results are on the right. A majority of these results will also appear on my competitors sites.
My homepage does not seem to want to pass link juice to these pages. Is it because of the high level of dup content, or is it because of the large number of links on the page?
Would it be better to hide the content from the results in a nofollowed iframe to reduce the duplicate content's visibility, while at the same time increasing unique content with articles, guides etc.? Or can the two exist together on a page and still allow link juice to be passed to the site?
My PR is 3, but I can't seem to get any of my internal pages (except a couple of pages that appear in my navigation menu) to budge off the PR0 mark, even if they are only one click from the homepage.
-
RE: Pagination and links per page issue.
But would rel canonical make listings on page 2 and above unindexable?
Listings are added chronologically on my site and I still want crawlers to be able to reach adverts created years ago; these listings could be on page 100. Surely rel canonical tells engines not to crawl the page as it is not the canonical version?
-
Pagination and links per page issue.
Hi all,
I have a listings based website that just doesn't seem to want to pass rank to the inner pages.
See here for an example:
http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK
I know that there are far too many links on this page and I am working on reducing the number by altering my grid classes to output fewer links.
The page also displays a number of links to other page numbers for these results. My script adds the string " - Page2" to the end of the title, description and URL when the user clicks on page two of these results.
My question is:
Would an excessive number (200+) of links on a page result in less PR being passed to this page (looking spammy)?
And would using rel canonical on page numbers greater than 1 result in better trust/ranking?
Thanks in advance.
-
RE: Link building maximum to different sub domains?
Hi Aaron,
I was hoping to offer free advertising in exchange for a link, which is quite common in my industry.
Could you expand on what would be a poor way to build links, so I can make sure I steer clear?
Ross
-
Link building maximum to different sub domains?
Hi All,
I'm launching a new website with a number of country-specific subdomains and I wanted to know whether Google will count new links against the root domain or treat each subdomain separately.
For instance, if I built 50 links per month to each of my five proposed subdomains, would Google see it as 250 links built to one root domain (and penalise me as a result), or will they view these subdomains independently and accept 50 links per month as an acceptable amount per subdomain?
Thanks in advance.
Ross
-
RE: Maximum links per month?
Sorry, one other thing. Links would come from sites with no more than 100 pages each, so few major sites with high DA.
-
RE: Maximum links per month?
Hi Marcus,
I don't want to go into exact specifics, but the site is in the adult industry and links are obtained by giving free advertising in exchange for a homepage link. There are normally about 5 outbound links to other similar sites on these pages. Some are dofollow banners (no anchor text), some are text based.
I will be writing a script to alternate anchor text between different target phrases and pages so that I can avoid looking like an affiliate link. One thing that does worry me is that a big chunk of these links are reasonably poor quality, as SEO is not really used by my potential link partners.
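Roughly what I have in mind for the rotation, just as a sketch; the phrases and URLs below are placeholders:

```typescript
// Rough sketch: rotate anchor text / target page pairs so links don't all
// look identical. Phrases and URLs below are placeholders, not real values.
const anchorPool = [
  { text: "keyword phrase one", url: "https://www.example.com/page-one" },
  { text: "keyword phrase two", url: "https://www.example.com/page-two" },
  { text: "keyword phrase three", url: "https://www.example.com/page-three" },
];

// Each new link partner gets the next pair in the rotation.
function nextAnchor(partnerIndex: number): string {
  const { text, url } = anchorPool[partnerIndex % anchorPool.length];
  return `<a href="${url}">${text}</a>`;
}

console.log(nextAnchor(0)); // <a href="https://www.example.com/page-one">keyword phrase one</a>
console.log(nextAnchor(1)); // <a href="https://www.example.com/page-two">keyword phrase two</a>
```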
My focus would probably be first quality then quantity though.
-
Maximum links per month?
Hi All,
I am setting up a new website with a fresh domain on an existing server. I can foresee that linkbuilding is pretty easy in this sector and it wouldn't be that difficult to build as many as 200 links a month.
What I wanted to ask was whether it is possible to put a figure on the maximum number of links I could build without incurring a spam related penalty?
Could I, for instance, start with say 50 links/month and then increase that by 25 links/month for the next couple of months to settle at something around the 100 links/month mark?
Thanks in advance
-
Can I use the same source for two different websites?
I have developed a successful portal based website but would like to grow my portfolio of sites by expanding into new niches and sectors.
I would like to use the same source code to fast-track new sites, but I'm not sure of the dangers involved. Content, meta details etc. will all be unique and the only similarity will be the HTML code.
Another example of how I want to use this is that my current site targets the UK but I want to target a global market with a .com domain and this would involve using the same source.
Is this possible without a penalty or am I overlooking something?