I recall seeing a question you asked about Google+ a day or so ago, but I did not know the answer to it. Sorry.
I would suggest you at least provide a link to the question; otherwise no one will find it.
DA is a very useful metric, but the tool's limitations must be understood. There are several:
The main thing to recognize when using DA is that it is based on the Linkscape crawl of the web. SEOmoz is working to improve the tool, but there are growing pains. For example, take a look at: http://www.micrositez-seo.co.uk/
If I recall correctly, that site had a DA of 8x. Due to issues with the last crawl, the site's records were not captured correctly and it presently shows a DA of 1. Any issue with Linkscape will cause an issue with DA.
Also understand that Google crawls every site, but Linkscape only crawls the top 25% of web pages. Google will almost always show more links than are reflected in your DA.
Another important note: DA is mainly a measure of link counts weighted by each linking page's DA/PA. As you pointed out, Google will discount spammy links and eliminate their value. Linkscape has no way of knowing which links Google, Bing, Yahoo, etc. have discounted, nor which sites have been penalized. The DA will continue to show link value.
The outcome of your test was predictable. If you had added 100 links from JC Penney at the time the site was under a penalty, the DA of the site receiving the links would still have improved. The DA does not use "spamminess" or penalties as ranking factors.
In summary, yes, DA is a very valuable tool for analyzing websites. Like most tools, it is important to thoroughly understand how the underlying calculations are performed and then how the tool itself works. Using DA in combination with that knowledge and experience can make your job a lot easier. If you try to base your SEO decisions blindly on DA without the required understanding, you won't be happy with the results.
There is no published standard for how frequently Google will discover new content on your site. There are too many factors, ranging from your DA (which is not actually a Google metric) to how deep the page is in your site and how well linked the page is, both internally and externally.
It could be that Google was planning to visit your site, you post an article, and they find it right away. It could also happen that Google just left your site, you post an article, and they may not find it for a week or more.
The normal rules of SEO apply. If your article is important then either add a snippet to your landing page or at least link directly to it from your landing page. Work immediately to earn links to the article, tweet it, etc. Send Google an updated sitemap with a link to the article. All of these activities can indicate to Google your content is important and should be indexed.
This is normal Google behavior. If you attempt to view a page's cache using the "cache:" prefix and Google has the page indexed but not cached, they will show the normal search result instead.
When Google crawls a page it is immediately added to the index, assuming they decide to index the page.
Want proof? When I look at this Q&A I see "Posted by Atul Sharma about 1 hour ago". When I search Google for this Q&A title I find this result.
Keep in mind SEOmoz has a DA of 87 which is why Google crawls this site so frequently.
I am not aware of a tool which does what you want. You can take the Open Site Explorer link report from both sites and download it to an Excel file. In Excel you can compare the fields to determine which ones are unique.
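If you prefer to script the comparison rather than eyeball it in Excel, a few lines of Python can diff the two exports. This is only a sketch: it assumes each export has been saved as a CSV with a column named "URL" (the actual column name in an Open Site Explorer export may differ, so adjust `url_field` to match your file).

```python
import csv

def unique_links(file_a, file_b, url_field="URL"):
    """Return the links that appear in file_a's export but not in file_b's."""
    def load(path):
        with open(path, newline="") as f:
            # Build a set of the URL column so membership tests are fast
            return {row[url_field].strip() for row in csv.DictReader(f)}
    return load(file_a) - load(file_b)
```

Running it both ways (A minus B, then B minus A) shows which links are unique to each site.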
This is a basic configuration issue. I would advise you to contact your sitemap software vendor or whomever wrote the code. You need your sitemap software to view the "www" version of the site, as those are the pages which display the links.
It is correct that DA, PA, depth of pages, etc. are all factors in determining which pages get indexed. If your site offers good navigation, reasonable backlinks, anchor text, etc., then you can get close to all pages indexed, even on a very large site.
Your site map should naturally include a date on every link which indicates when content was added or changed. Even if you submit a 10k list of links, Google can evaluate the dates on each link and determine which content has been added or modified since your site was last crawled.
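As a sketch of what that looks like inside the sitemap file (the URLs here are hypothetical), each entry carries a lastmod date the crawler can compare against its last visit:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/older-article</loc>
    <lastmod>2011-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/new-article</loc>
    <lastmod>2011-08-15</lastmod>
  </url>
</urlset>
```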
The action I would recommend would depend on two factors: are there backlinks to the page, and how frequently are the links used (check Google Analytics).
If you have a page without any backlinks that is rarely used, then I would allow the 404 to occur. If you have a page that has link juice and is frequently used, I would make an effort to ensure users are happy with what they find when clicking through to the page; otherwise they may bounce off the site. In this case I would rebuild the content.
404s are a natural part of the internet. Content is moved and removed over time. It is not necessarily a bad thing for a 404 to occur if your site is prepared to handle it (i.e. your 404 page is helpful and friendly). The only point I am making is that some webmasters view 404s as an issue with their site that must be resolved, and that is not the case. You should be fully aware of any 404 links and be able to adjust where necessary.
You are seeking to have your client's site be considered relevant for multiple geographic areas. A few suggestions:
1. Seek links from those areas. Receiving multiple links from a website or page associated with a specific area can help your page be considered relevant for that area.
2. Allow User-Generated Content on your site: forums, responses to blog articles, etc. Users will often add plenty of content, and that content frequently includes geographic references.
3. Add testimonials from users in those areas. "Service was great! John - Beverly Hills, CA". The addition of the location to the testimonial helps to establish relevancy for that area.
4. If there are only a few areas, then feel free to add a list to your content. "We service the Los Angeles, San Diego and Beverly Hills area" is fine. When you list a dozen areas, then it would probably seem spammy to add a list.
what could cause one page to drop so much so quickly whilst other pages improved their rank?
Without knowing the page URL and keyword, we are left to guesswork. As Sean suggested, the three most likely issues are:
1. Google has made an algorithmic adjustment which was unfavorable to that page.
2. That page has been penalized in some form. It could be there is an issue with the page's content, or perhaps a link to the page has issues and has been devalued.
3. There is a crawl or other issue preventing page rank from flowing properly to that page.
SEOMOZ use a different method to assess an external link, then no problem
SEOmoz does use a different method. The links you see are based on the Linkscape crawl of the web which is updated approximately once per month. The Linkscape database is based on the top 25% of internet pages. If your links are from pages which have low PA/DA, then they will not appear in Linkscape and therefore, they will not show in the SEOmoz tools.
The Linkscape index is normally pretty good but you should understand it can take two months for a link to appear depending on when a page is crawled versus when a link is added. Also know there were some issues with the last crawl which should (hopefully) be resolved with the next results.
If your SEO company can provide you a list (Excel?) of the 331 links, you can compare them to the links from Yahoo and determine the discrepancy.
Your sitemap should include every page of your site that you wish to be indexed.
The idea is that if your site does not provide crawlable navigation, Google can use your sitemap to crawl your site. There are some sites that use Flash, and when a crawler lands on a page there is absolutely nowhere for the crawler to go.
If your site navigation is solid, then a sitemap doesn't offer any value to Google other than an indicator of when content is updated or added.
The amount of link juice you will receive is the same in any of those examples. Your question is similar to "which weighs more, a pound of feathers or a pound of lead".
A link's value is determined primarily by the linking page's DA, PA, or PR (however you measure link value) and the number of links on the page.
There are other factors; for example, a link in the content area of a page offers more value than a link in the footer. Based on your question, it seems the link will appear in the content area of your page.
We know Google associates anchor text to a link's value, but I am not aware of that text influencing the amount of value that is passed.
Make sure your site's 404 page is friendly and helpful. The message should be something along the lines of "we are sorry, the page you are looking for cannot be found", along with links to your home page and other helpful pages. A search box should ideally be provided. Additionally, consider a panel which displays links to your site's top pages or most interesting content.
Your options are:
1. If there are links to the outdated show pages and the pages aren't accessible, you may want to restore the old pages. This will keep users and search engines happy. The links will continue to offer your site value.
2. If there are links to the outdated show pages and the pages have been deleted and are not recoverable, you can 301 redirect the pages to the most relevant page on your site. Perhaps another show from the same director, or a show of the same type. This will allow your site to retain the link value within its DA.
3. If there are not links to the outdated show pages, I would recommend allowing the pages to 404.
If you perform any action other than the first, then the rankings #3-6 will disappear. The search engine crawlers will recognize the pages are no longer available and remove the results.
There is not any "acceptable" number. You would write the minimum number of rules possible. That number can range from 1 to the total number of pages you currently have on your old site.
As Gary suggested, moving your blog MAY be beneficial depending on the circumstances.
If your site focuses on a specific topic, and you wish to add a blog to talk about that topic in further detail, the blog should definitely be on the same domain as the main site. On the other hand, if you wanted to add a personal blog to talk about various topics unrelated to your main site, then that blog should ideally be on a subdomain or separate site.
Many people get confused with the concept of subdomains. A subdomain is a separate website. Think of wordpress.com. It offers thousands of subdomains all managed by different people. The subdomains have no relation at all to each other nor the main site.
Adding a blog to your site can definitely help increase traffic if you add quality articles to the blog. Adding it to a subdomain instead of your main domain is not going to offer any additional traffic benefits. In fact, it will most likely be less, because your subdomain will start with a DA of 1 rather than inheriting the main domain's DA. The articles will rank lower in SERPs and receive less traffic.
If you are going to change URLs, there are two preferences:
1. To change URLs in such a manner that your URLs can map from the old URLs to the new ones in as few rules as possible. If you have 5000 URLs, you would strongly prefer not to write 5000 rules. Ideally you create some logic to map the old URLs to the new ones. For example:
Old URL: mysite.com/category/product-id/product-name
New URL: mysite.com/product-name
A single rule can be written to transform the old URL to the new one.
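On an Apache server, that single rule might look like the following .htaccess sketch. The pattern is an assumption based on the example URLs above (it captures the last path segment of any three-segment URL), so test it against your real URL structure before deploying:

```apache
RewriteEngine On
# Permanently redirect /category/product-id/product-name to /product-name
# [^/]+ matches one path segment; the final segment is captured as $1
RewriteRule ^[^/]+/[^/]+/([^/]+)/?$ /$1 [R=301,L]
```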
2. You want your new URLs to be "future-proof". We know that technology changes so remove any .html, .php or other extension from the end of the URL. They are not helpful to users and will cause a redirect to be needed any time technologies change. GOOD URL: mysite.com/product, BAD URL: mysite.com/product.html
Also take some time to really think about your URL structure. Anything that changes over time should ideally be removed from the URL structure.
Your current sitemap can be seen here: http://www.vacatures.tuinbouw.nl/sitemap.xml
There are dates as recent as today in the sitemap. I did a search of the sitemap for "medewerker" which is part of the URL in the example you shared. There is a link with that term, but it is not the same link. There is clearly an issue with the configuration of your sitemap software.
A possible cause of the issue is your site is set up as both the "www" and non-www version. The pages on the two versions are not identical.
http://vacatures.tuinbouw.nl/vacatures/
If you look at this version of the URL you shared, the same page appears except it does not contain all the links at the bottom. Perhaps you configured your sitemap to view the non-www version of your site which does not show those extra links.
Very nice article from Danny Sullivan you offered, Anthony. Thumbs up for sharing it.
No.
Any ad links should be nofollow'd. Google certainly follows the rules with their own ads.
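For reference, nofollowing an ad link is a one-attribute change (the URL here is hypothetical):

```html
<!-- Paid/ad link: rel="nofollow" tells search engines not to pass link value -->
<a href="http://advertiser.example.com/" rel="nofollow">Advertiser</a>
```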
What is the topic of your site? Are there any real-world events going on related to your site that could explain this change in traffic?
Are there any seasonal factors for your site? Any sales or discounts? Is your site in any way related to "back to school" activities?
Yes, tiny URLs such as bit.ly are basically 301 redirects. They pass most of the PR, as would any other 301. This has been tested, confirmed, and is otherwise thoroughly understood.
Some directories offer a clear deal: pay us and we will list you. That deal is a violation of Google's terms. Google offers a site so users can report any violations: https://www.google.com/webmasters/tools/paidlinks?hl=en
Some directories offer a less-clear deal: pay us to review your site. If we approve your site you are listed, and if we don't approve your site then your money is refunded. Google clearly understands this deal was primarily designed to bypass their policy and views these directories as paid links. Use of these sites can lead to penalties.
There are a few high-quality directories which have legitimate standards and charge a fee for a site review. If you pass, you are listed and if you do not pass, you are not listed. The fee is gone either way as it was for a review of your site, not a listing. Sites such as Yahoo Directory and BOTW are the two most prominent directories of this nature that come to mind. There are many niche directories that offer a similar process.
In short, paying for directory links is a bad idea. If a client has funds and is starting a site then I can see the value of listings in Yahoo and BOTW initially. Niche directories can offer value. Otherwise, the only way I would suggest paying money to be in a directory is if you felt the actual traffic you would receive from the link was worthwhile. If your client sells wines and you were gaining a listing in a wine directory that was actually used by people interested in purchasing wines, then that listing could pay for itself.
Another point would be that certain groups such as the BBB may offer value for being a member. The BBB badge on a site is a trust symbol which has been proven effective at increasing conversions.
For the domain name, www.hotelsinboston.com would be the preferred choice.
For the URL, you are correct if you had mysite.com then the preferred page link would be mysite.com/hotels-in-boston
The reason has more to do with users than with rankings. If you say "go to hotelsinboston.com", then close to 100% of users will understand your request and land on the correct site. If you say "go to hotels hyphen in hyphen boston.com", you will lose a certain percentage of users who will forget or not understand the hyphens.
It's the exact same idea as why a .com address is preferred over a .net or any other TLD. If your site is hotelsinboston.net, a certain percentage of your clients will wind up on the .com site. It's just more natural for people to add .com to any business name.
The official answer I provide is to check the SEOmoz resource directory.
Based on a bit of feedback I have received, it seems many are not happy with that direction. I understand, as it is basically a listing of thousands of SEO companies without any particular endorsement.
In an effort to not ruffle any feathers I have been offering the recommendations in e-mail but I will go ahead and make one attempt at offering these publicly. In no particular order:
1. Distilled: SEOmoz partners with Distilled for SEO consulting.
2. Keyphraseology: Lindsay used to lead the SEOmoz consulting team.
3. Alan Bleiweiss: Alan was doing SEO before Al Gore invented the internet. He often shares his knowledge and experience here at SEOmoz, at various SEO conferences, and on his blog.
As for me, yes I am presently booked up. I expect to have limited availability beginning in September.
There are simply too many factors to evaluate in a general Q&A. A professional SEO would be able to perform a complete evaluation of your site, your on-page SEO, your competitors, your keywords, etc.
I understand your frustration. I don't think our taking guesses will offer you any assistance. In case I am wrong, a few more ideas are:
Take every step to improve your on-page performance. Improve those "B" grades to "A"s, build your internal links, add relevant sidebar links to similar topics, etc.
Let the world know your articles exist: keep social buttons near the articles, accept user-generated comments on your articles, etc.
Perform a crawl test of your site. Make sure your navigation is solid, you don't have duplicate content, etc.
EDIT: I just noticed the link to your site. There are plenty of improvements which can be made to the site.
1. Home page title: "The Catholic store for First Communion Gifts, Confirmation Gifts, Catholic books, Catholic bibles and rosaries." It reads more like a meta description than a title. Pick one or two keywords to target. A preferred title would be something like "Catholic Store | Aquinas and More".
2. Your URLs are long and unfriendly. Example: http://www.aquinasandmore.com/category/2467/fuseaction/store.BrowseCategory/productsperpage/24/layout/grid/currentpage/1
A preferred URL for that page would be http://www.aquinasandmore.com/ebooks/
3. You are using 302s on an internal link. Don't use any 302s! If you ever choose to use a 302, it is meant for very temporary use, such as 7 days or less.
4. You have multiple redirects. I chose "ebooks" and it was first 302'd, then 301'd, then it landed on the final page.
Your site needs proper SEO attention from a professional. If you understand enough about your site's software and HTML to make these changes, you can tackle them on your own as well, but there is a lot of work to do.
It sounds like you have taken many positive steps towards improving your site. The question is, how well have you executed on these steps?
We wrote over 300 articles on our site
What kind of articles? Did you write about a topic of interest to you? Or did you research keywords to determine which phrases were generating the most traffic and write about those topics?
What quality of articles? Were they your average "let's write a 500-word article to gain traffic for my site" kind of average internet articles? Or did you set out to write "best on the web" type content? Do you use anchor text in your content to link to various pages of your site from within each article? Have you used any form of evaluation tool on the articles, such as Paper Rater?
You can write 5 "best on the web" articles which get links from authoritative sources and generate more value than 300 of what I call "internet-filler" articles.
We got on Facebook and started posting regularly.
How popular is your Facebook page? Are you actively encouraging users to Like your Facebook page? Many sites encourage readers to Like an individual web page, but not their Facebook page. Liking the Facebook page is much more powerful, as you establish a lasting link with your readers.
We got on Twitter and tweeted several times a week both about our niche topic and our store
That's great, but how many people follow you on Twitter? Do you tweet to general hashtags and @ addresses which are viewed by many people with a related interest in your site?
The ideas expressed above apply to your product reviews, blogs and YouTube videos as well. Creating content is easy. Creating high quality content that readers WANT to find and will gladly link to, that is the challenge. It's like the difference between going to school to get a passing grade, or going to school with the desire to graduate top of your class. You have competition and you need to step up to beat them.
You can use the SEOmoz crawl tool anytime.
If the site is under your control, Google Webmaster Tools would be ideal.
If the site is not under your control, Alexa is a tool to measure traffic. The higher ranked the site, the more details are captured. If a site is not ranked in the top 1 million for website traffic, there is not a great deal of detail offered.
How does a canonical work? Does the robot read the canonical and immediately go to the canonical URL or does it continue to read past the canonical tag and get to the no index, follow tag if there is one present?
The first thing to understand is the canonical tag is a suggestion, not an order. While a search engine will usually honor the canonical tag, there are instances where Google or other SEs may determine the canonical tag is not being used correctly so they disregard the canonical tag. Based on this understanding, yes the robot will read the entire page regardless of the canonical tag status.
Is it necessary to have both a canonical tag and no index, follow tag in place? Or should the canonical tag be sufficient to avoid duplicate content?
The two tags you mention conflict. You would never use both tags on the same page.
Noindex means you do not wish the page to appear in the search index. The canonical tag means you do wish the content to be included in the search index, but use the canonical URL in the index.
if both a canonical tag and no index, follow tag are in place, should they be in a specific order?
The order of meta tags does not matter. If a page was marked with both a canonical tag and a noindex tag, the noindex tag would take effect and the page would not be indexed, so the canonical tag would not have any effect.
In short, you want to use the canonical tag to resolve duplicate content issues, not the noindex tag.
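To make the distinction concrete, here are the two tags side by side. You would place one or the other in a page's head, never both (example.com is a placeholder):

```html
<!-- Option 1: keep the content in the index, credited to the canonical URL -->
<link rel="canonical" href="http://www.example.com/preferred-page" />

<!-- Option 2: keep this page out of the index entirely (links still followed) -->
<meta name="robots" content="noindex, follow" />
```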
I am not aware of any solid tool that provides this information. You may find a tool which estimates or otherwise provides a link value, but the challenge is that guesses are being stacked upon other guesses.
If someone responds "yes, try the Link Valuation Tool from Company X" my questions would be:
What metric is being used to value the link? PA? DA? PR? If PA/DA are being used, then those metrics are limited by the Linkscape crawler and the various factors concerning its use (i.e. 1-2 months behind, issues mentioned by Carin, etc.). If PR is being used, then the tool's PR is a guess and may be quite different from Google's PR.
How is decay being handled? Is the PA/DA/PR being fully distributed? Or is the natural decay being calculated, and if so how? It's another guess factor.
How is the weighting of links being handled? The SEO consensus is that links in content are given more weight than links in footers and other site-wide links.
There are other factors such as multiple links to the same domain, multiple links to the same page, etc. I feel there are too many unknowns for a tool to provide a meaningful link valuation. I would love to be proven wrong. Such a tool would clearly offer great value to SEOs.
How fast can Domain Authority be established?
The answer to your question has a bit of complexity to it depending on the result you are seeking.
If you are referring to the DA you can actually see in your toolbar, that is based on the Linkscape crawl of the web which is updated approximately once per month. Depending on various timing factors you could obtain a new link today and it can take 2 months before you see the DA increase in the toolbar.
With respect to the actual benefits of DA (i.e. your site improving ranking results), those are obtained much faster. Google will crawl some sites multiple times each day and others only once per month, depending on the site's DA, links, and other factors. If you obtained a link from the New York Times today, then later today your site would receive the benefits of that link.
With respect to growing DA, a logarithmic scale is used. The higher your DA, the harder it is to improve the numbers.
Another point to consider is quality vs quantity. You can have 10k links but if they are all footer links from a single domain which has a relatively low DA, then you will not receive much benefit. If you have a single link from Time magazine, Harvard University or an authoritative site then you will receive a significant boost.
I know you are reaching for a specific answer, but it doesn't exist. It's kind of a "how many licks does it take to get to the center of a Tootsie Pop" type of question.
Hi AJ.
I was able to verify your site is indexed in google.co.uk. I disabled the flash on my browser and can see your html code appearing normally.
It looks like you have a WP site with both Yoast and the All in One SEO plugins. There are some conflicts you should resolve. Currently your page has two meta descriptions and three canonical tags. These extra tags are not the root cause of your current issue, but they should be cleaned up.
With your site indexed and your home page not showing up in SERPs for even targeted phrases this leads me to consider that your site may be under a penalty. I wish I could look at your site further but an issue just came up and I need to leave my office.
Due to the penalties we have been considering moving everything under one umbrella and manage local sites in directories e.g. .com/es/keyword1 .com/de/keyword2 - however until the penalties hit the url approach has worked very well for us. Any thoughts?
Generally speaking, you will achieve the best results by consolidating your sites under one domain with a dedicated folder for each country as you described. I would recommend delaying the move until you are sure your sites are not under any penalty.
The advantage you will receive with a single root domain is the consolidation of your Domain Authority. It sounds like your sites were doing well before the penalty. The higher DA can help even further.
The internationalized sites are each hosted with a different root domain keyword1.es keyword2.de - are you still confirming that this should not be causing duplicate content penalties?
Correct, as long as the sites are properly set up to target their respective countries. Sites which are dedicated to a specific locale and language would not normally compete in SERPs with other sites that offer similar content in another country and language.
Does your company have that experience and do you provide such services?
While I appreciate the inquiry, my resources have already been dedicated for the remainder of this month. You could take a look at the SEOmoz directory. Please note that anyone can list their company in the directory. A listing is not an endorsement.
If you desire a further recommendation you can send me a message on SEOmoz and I will respond. I can share a few names of SEOs whom I have confidence in based on their Q&A responses, blogs and reputation if that would be helpful.
@Devin, I love your ideas. Some inspiring creativity!
@The Search Guys, does this site have any content which others would WANT to link to? If the answer is yes, then leverage that content. If the answer is no, then someone, somehow needs to create that content.
Lead generation is an extremely social business. I have difficulty understanding the desire to avoid social media with this type of business. If I accept the "no social media" rule, there needs to be regularly generated articles with exceptional content which shares knowledge and offers advice that people will want to know. Then it would be your job to ensure people do know about the content and earn the links.
Is there something like an IP specific Google penalty that can apply to web properties across an IP or can we assume Google just picked all pages registered at Google Webmaster?
Think of Google as an intelligent business. They have processes which algorithmically penalize websites. They also have systems which flag sites for manual review. When a penalty is deemed appropriate it is possible for it to be applied on any number of factors such as an IP address, a Google account, a domain, etc. It depends on how widespread of a violation has occurred.
What is the most likely cause for our penalty given the background information? Given the drops started already in November 2010, we doubt that the Panda updates had any correlation to this issue.
You mentioned a few points which can potentially lead to a penalty. I am not clear from your post, but it sounds like you may be linking to casino and gambling sites. While those sites may be legitimate, many have a reputation for using black hat SEO techniques.
If you want to remove a penalty, be certain that you do not provide a followed link to any questionable site. When you provide a followed link to a site, you are basically saying "I trust this site. It is a good site and I endorse it". If you are found to offer a link to a "bad" site, your site can be penalized.
What are the best ways to resolve our issues at this point? We have significant history data available such as tracking records etc. Our actions so far were reducing external links, on page links, and C-class internal links
Hire a professional SEO to review your site. You want to review every page to ensure your site is within Google's guidelines. I am highly concerned about your site's links to external sites. I am also concerned about the automated link building that your current SEO has been doing. A professional SEO company should not lead your site to incur a penalty. I am having difficulty understanding how this happened in the first place, how it has not been fixed in almost a year, and how this SEO company is building links for you. Frankly, it's time to consider a new SEO company.
Translating content to other languages is fine. You can take the exact same article and offer a translated version for each language, and even country. For example, you can offer a Spanish version for your Spain site and a different Spanish version for your Mexico site. As long as these sites are targeting specific countries, there are no duplicate content issues.
After all this time w/o resolution, should we be moving to two new domains and forwarding all content as 301s to the new pages? Are there things we need to try first?
The penalty would follow to your new domain.
The external linking structure of the pages is very keyword and main-page focused, i.e. 90% of the external links link to the front page with one particular keyword
Not good at all.
Summary: your site needs careful review by an SEO professional who adheres to white hat techniques. Every day your site is penalized, you are losing traffic and money. The cost you pay to fix this issue may be extremely small in comparison to the amount of revenue you have lost.
If this were my client, I would leave the content as-is. There will be no benefit to moving the content other than a cleaner URL. Yes, the cleaner URL is preferred, but based on the backlink situation you described, it is not worth the move.
If you were to undergo a CMS change or another situation which required a URL change, it would make sense to move the content at that time. If you didn't have many backlinks, or if you only had a small amount of content, it could also make sense to make the move.
If you have a blog with one quality article, I would move it to the main site if the article's topic was in alignment with the main site. Any articles moved should be 301'd to the main site. Any backlinks which you have influence over should be updated to point directly to the article on the main site.
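If the blog lives on its own host, the 301s described above are commonly implemented in an Apache `.htaccess` file. This is only a sketch; the domain names and article paths (`blog.example.com`, `/seo-basics/`) are placeholders, not details from the question.

```apache
# Hypothetical example: permanently redirect moved blog articles
# to the main site. All domains and paths are placeholders.
RewriteEngine On

# One rule per moved article (mod_alias):
Redirect 301 /seo-basics/ http://www.example.com/blog/seo-basics/

# Or forward every article on the old blog host to the same path
# under /blog/ on the main site (mod_rewrite):
RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/blog/$1 [R=301,L]
```

A 301 (permanent) redirect, rather than a 302, is what passes the old article's link value to the new location.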
By merging the blog into the main site, the DA of both can increase and your content will have an opportunity to rank better in SERPs.
Should I worry about this, or will the other one simply overcome it in time?
You have properly 301 redirected the non-www version of your page to the www version, so it will fix itself in time. Your site has a very low DA, so Google may only visit you once a month.
If you would like to resolve the issue faster you can log into Google Webmaster Tools and set the preferred version of your site as "www". It is not necessary to make this change if you properly redirected your site, but the results would update faster.
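For reference, the non-www to www redirect described above is commonly done in Apache with mod_rewrite. A minimal sketch, assuming an Apache server; `example.com` is a placeholder for your own domain:

```apache
# Canonicalize the host: 301 any non-www request to the www version,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```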
The only thing internal site links can realistically do is adjust how PR flows within your own site. If you wanted to stretch for a corner case, you could have an island page or deep page that was not indexed or seen previously, but once you created a link to that page it would be seen. If that page had issues with its content or links, then a penalty could be involved.
Based on your replies and the small amount of ranking drop, I really don't think a penalty is involved. The maximum it would be is a discounting of links.
If not, do you know of any algorithm changes in the past week?
With around 500 algorithm changes each year, there seem to be changes every week. I am not aware of any major changes but minor ones happen all the time.
The advantage of guest blogging is you are exposing your site to an established audience.
For any given topic there are going to be power users who subscribe to RSS feeds from all the authoritative sources and visit many sites; however, most people will have one or two favorite sites and stick with them. By adding an occasional guest blog post to a site, you are presenting yourself to that site's audience. Those who find your article helpful or interesting may follow a link back to your site. That is what I consider to be the primary advantage of guest blogging. If that link is followed, then you have also diversified your link portfolio a bit as well.
There is no apparent reason for the drop. Based on the information you shared, the only reasonable explanation is that Google made a change to their algorithm which benefited the other sites, or acted to your disadvantage. Another, less likely possibility is that two or more competing sites made SEO improvements.
I can only suggest you continue building links and perform normal SEO improvements for your site.
I see your site as #7 for "trampoline pads" and #4 for "trampoline pad" in Google.co.uk SERPs.
Google can recognize your new links very quickly. The exact time depends on the popularity of the sites and pages from which you acquired the links. We won't be able to see the links in OSE for 1-2 months. Is there anything questionable in any way about these new links which could have drawn a penalty? Are any of these reciprocal links?
Your home page seems to be executing many of the basics of on-page SEO quite well. Looking at your older links, it appears all the followed links use identical anchor text. I would recommend varying the text a bit; otherwise it appears quite unnatural.
Is this an effect of the link building, and will it bounce back in time?
Rankings do not drop after link building unless you did something to incur a penalty. I would not expect your site to bounce back unless you resolve the issue for which you were penalized.
At a quick glance, you are #4 for "trampoline pad", and the three results ahead of you are eBay's two listings and another site with a trampoline page. All those sites have a higher DA than your site, and the PA of the #1 site is significantly higher than yours. The rankings don't seem unreasonable at all.