Thanks a lot, Matt
Posts made by AndreVanKets
-
RE: Content Duplication for Job Posting
Perhaps advertise the job opportunity on the "other" sites with an "enquire" or "find out more" button that links to the original job ad on your website.
The "view or apply to job" button links back to the original advert.
As long as you are linking to the original source, you are safe.
Greg
-
RE: How does Google know if a backlink is good or not?
Relevance is important, but so are domain authority and trust.
Take a link from a news website, for example. It's unrelated, but it can provide a powerful boost to your website's "trust".
Your main objective, in my opinion, should be to get links on relevant websites; however, you can mix the two by getting links from high-authority, unrelated websites as well. There are a few guest bloggers on SEOmoz who link to their companies' websites, which are totally unrelated to SEO.
Let's say you already had a mix of relevant and unrelated high-authority links, and you had to choose between a DA 80 site (unrelated) and a DA 40 site (related).
I would go with the more powerful site, the DA 80.
Greg
-
RE: Hard Lessons Learned... What's yours?
Bummed!!
We uploaded over 150 new product pages, and after two weeks Google had still not indexed them. After a few hours of digging, we found that all the new pages had a canonical link to the parent page (they used the parent page's template).
We removed the canonical tag, but we are still waiting!
Reminder to self: check all the technical stuff when publishing new pages!
-
Site command / Footprint Question
Hi All,
I am looking for websites with keywords in the domain and I am using:
inurl:keyword/s
The results that come back include sub-pages, not only domains with the keywords in the root domain.
Example of what I mean:
What I want displayed only:
www.keyword/s.com
Does anyone know of a site command I can use to display URLs with keywords in the root domain only?
Thanks in Advance
Greg
-
RE: Content Strategy - High Value on Lots of Sites vs. High Value on One Site
It really depends on what your intentions are.
If you look at it technically, I believe 20 links from 20 different domains will be more beneficial in terms of improving your link profile and rankings.
On the other hand, if the audience on that one website is closely related and you think you'll get a lot of referral and click-through traffic from the 20 different articles, then that could be an option.
Weighing up the pros and cons against the intent of your promotion will give you your answer.
If you're looking at improving your link profile (rankings), then I would go with 1 article on each of 20 sites rather than 20 on 1.
Greg
-
RE: When are the Mozcon 2012 videos going to be released?
Agreed. I'm keen to watch the vids as well.
-
RE: Home page deindexed by google
You can't have both versions indexed in Google. The www and non-www versions are two separate pages and will be seen as duplicate content if you use both.
As Google has already indexed the non-www version, I would 301 redirect all your www pages to the non-www versions to avoid duplicate content.
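If the site runs on Apache, the redirect could look something like this in .htaccess (a sketch, assuming mod_rewrite is enabled; the rule is generic rather than specific to your site):

RewriteEngine On
# Send any www request to the same path on the non-www host
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]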
-
RE: Wordpress.com content feeding into site's subdomain, who gets SEO credit?
My understanding is this:
If it's a duplicate, as in a copy-and-paste article, then Google will eventually de-index the duplicate and keep the original article.
In your client's case, though, they are providing the source links, so Google doesn't label it as duplicate content but sees it as syndicated content.
Look at news sources, for example. The same article syndicated on multiple sites is indexed and stays indexed on all of them. (This is the case for your client's site.)
What I would tell your client is that fresh, unique content on their site is key for SEO. Syndicating articles doesn't provide any SEO benefit in terms of unique and fresh content, so the operation is pointless unless it's for user experience only.
Give them an example: say it's the same as giving away articles to other websites and then reusing them on their site as "second-hand" articles. Just because it's WordPress doesn't mean it's any different from any other website out there.
Good luck!
Greg
-
Hcards - Business or Personal?
Hi All,
To my knowledge, hCards for rich snippets are generally used by bloggers etc.
I would like to use the business logo rather than a photo of myself, as well as the business Twitter account, Google+ Local profile, and a bio for the business.
The author page will be the "about us" page, with info about the company.
Is this OK? Or does it have to be personal?
-
RE: Duplicate Content
You should only no-follow your tags and archives and not your categories...
In the plugin settings, under Permalinks, there is an option: "Strip the category base (usually /category/) from the category URL." This will just stop the duplicate pages from appearing. Blocking the categories must have caused the drop.
Greg
-
RE: Duplicate Content
Well, the duplicate content alone is causing issues. Google does not like duplicate pages at all...
If you select your primary pages and tell Google to ignore the rest, it can only help your rankings.
With the Yoast SEO plugin, all you need to do is set tags to nofollow and noindex, and also strip the category base from the URL (it redirects automatically as well).
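Under the hood, that setting outputs a robots meta tag in the head of each tag page, something like this (a sketch of the typical output, not necessarily the plugin's exact markup):

<meta name="robots" content="noindex,nofollow" />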
Greg
-
RE: Cross-linking domains dominate SERP?
I wasn't implying that he should make a network of sites; I meant the links he gets should be better. My bad.
-
RE: Cross-linking domains dominate SERP?
Just to add to that: these websites have shown you their strategy. Instead of admiring them, replicate what they have done, but do it even better.
With original content, as EGOL suggests, and even more relevant, stronger links, you'll no doubt be a strong competitor.
Greg
-
RE: Duplicate Content
WordPress does this when you use tags...
Essentially, the tag pages display the exact same content as the original URL, so the pages are identical but the URLs are different.
Two options that I can think of:
1.) Remove the tags, strip the category segment from the URL, and stop using them in future. This will require redirects from the duplicate URLs to the main article (this takes planning, a lot of time, and is quite complicated).
2.) If you want the tags and categories for user experience, install the Yoast SEO plugin, which allows you to insert a canonical URL on the duplicate category pages. This tells Google where the original page can be found (see the snippet below). Tags are only there for user experience, so you can set these to nofollow and noindex.
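For reference, the canonical tag the plugin inserts in the head of each duplicate page looks something like this (the domain and path here are placeholders):

<link rel="canonical" href="http://www.example.com/original-article/" />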
Greg
-
RE: How to rank for difficult terms
You could optimise the website for the location in the UK where the business is situated. This will give it some priority over other websites when people search from the same location as the business.
Do some research on Local SEO and make the most of it for your client.
Greg
-
RE: Quick question about country specific organic results
Yes.
Since the Venice update, Google has been giving preference to businesses located in the area that people are searching from. Do some research here on SEOmoz for info on how to get local sites properly optimized:
http://www.seomoz.org/blog/local-seo-checklist-for-new-sites-whiteboard-friday
http://www.seomoz.org/blog/the-basics-of-local-seo-whiteboard-friday
Greg
-
RE: Redirection - Seo trick?
People buy expired domains and redirect them to their website to pass on the link juice and PR.
It's a grey-hat technique that I don't have any experience with myself, but I do know that this is what people do.
There is a PR 7 website I found with only one link to it, and I assume this was accomplished by redirecting authoritative domains to the new website.
My 2c
Greg
-
RE: Satellite Sites ?
Yes, they can. It's just more expensive to host on different servers.
-
RE: Web 2.0 seo
Web 2.0 sites are the lowest-value links you can get, IMO, and a waste of time.
Remember: site.wordpress.com and website.wordpress.com are both on the same website, that being WordPress.
No matter how many links you get on these blog sites, they still only count as one linking root domain, which doesn't help SEO very much.
Don't waste your time spinning articles to submit to Web 2.0 sites (SEnuke X blasts, etc.). Rather, spend your time creating unique articles and publishing them on different REAL websites. This is the difference between "building" links and "earning" links. (Which do you think Google prefers?)
Greg
-
RE: Will our PA be retained after URL updates?
Sorting out the canonicalisation was a good move, and lower-case URLs are generally best practice, but there is no negative SEO effect from capital letters in the URL. (Let's hope the PA wasn't too high, as the new URLs are essentially new pages.)
The link juice will flow via the redirect, but I don't think Open Site Explorer will count the redirect as a link that contributes to PA.
Perhaps ask your SEO to update the URLs on links that have been built to these pages. This will definitely bring the PA for these pages back up.
Greg
-
RE: Changing URL's for a website redesign
Will the URL be the only change on the page?
If so, then in the case of redirecting www.website.com/page1.html to www.website.com/page1/, a simple 301 redirect from URL A to URL B will be fine.
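On an Apache server, that could be a single line in .htaccess per page (a sketch, using the example URL above):

# Old flat URL to new directory-style URL
Redirect 301 /page1.html http://www.website.com/page1/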
You may see fluctuations in rankings for a while, but things should settle down close to where you were previously.
If you are changing keywords in the URL, then you might see more fluctuation in rankings.
Also, spend some time contacting webmasters who are linking to your old URLs and ask them to point to the new ones. Google passes link juice via redirects, but it would be worth your while to get as many links as possible pointing directly to your pages without having to redirect.
Greg
-
RE: Robots.txt issue - site resubmission needed?
I agree with Michael.
I have also seen a WordPress site that had blocked robots from the entire site for a week.
After allowing the robots back in, we saw the rankings improve within a few days.
Don't stress; just resubmit the sitemap, or create a new one with the affected URLs.
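For reference, the difference between the broken and fixed states is tiny. The first robots.txt below blocks the whole site; the second allows everything and points the bots at the sitemap (the sitemap URL is a placeholder):

# Blocks every crawler from the entire site:
User-agent: *
Disallow: /

# Allows everything and advertises the sitemap:
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml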
Greg
-
RE: De-indexed Link Directory
I would email each webmaster and save the correspondence so that you can at least prove to Google that you have tried to remove the links when you send your next reconsideration request.
We weren't hit by any penalties, but after running the detox tool, we found a few sites that had been de-indexed, and we got a decent response rate when asking for the links to be removed.
If we ever get hit by a dodgy link penalty, we can at least provide some proof that we tried to clean up our link profile.
Greg
-
RE: Thinking about deindexing 200,000 pages
If the pages are there for user experience only and you don't expect any of them to rank, I would block Googlebot from crawling the pages/categories you want to remove using robots.txt, as well as setting the headers on all these pages to noindex (for good measure).
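A sketch of that setup, assuming the unwanted sections live under paths like /booking/ and /media/ (hypothetical paths, based on our example below):

# robots.txt - stop the bots crawling these sections
User-agent: *
Disallow: /booking/
Disallow: /media/

And in the head of each of those pages, for good measure:

<meta name="robots" content="noindex" />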
Once you have set this up, you can request that the URLs be removed in your Google Webmaster Tools account.
We had loads of booking pages and media directories indexed, and this is what we did to get them out of the index. I'm not sure how much it improves rankings, but it definitely tells Google which pages are most important and prevents Google's resources from being spent on pages that are not.
Hope that helps?
Greg
-
RE: Do shady backlinks actually damage ranking?
It's all speculation at the moment, so the fear-mongering is rampant.
In short, yes, bad backlinks can have a negative effect on your site, but it depends on how well established your site is. Negative SEO is real, but depending on the severity of the dodgy links, Google often just discredits them and asks you to get rid of them rather than slapping you with a penalty.
When you say you are afraid of "going out there", what is it that you are afraid of? Just don't be manipulative; get links on relevant websites and you'll have nothing to worry about. Forget about trying to manipulate rankings and build relationships with other webmasters instead. After all, one great link is better than 50 suspect, low-value links that take hours to "build".
-
RE: Exact Match Domain + shorter permalink vs. longer permalink?
www.acupunctureintakeform.com/template/ is a much better idea.
There will be no SEO benefit in repeating the keyword in the URL.
As for the future pages, you could always do the following:
www.acupunctureintakeform.com/template/
www.acupunctureintakeform.com/template-with-diagrame/
www.acupunctureintakeform.com/template-with-pictures/
etc
Hope that helps.
Greg
-
RE: Can someone tell me what ∞% trending upward in my keyword report means?
The change % represents the difference in visibility for the keyword you have tracked in the report. An ∞% change usually just means the keyword had no recorded visibility in the previous period (a zero baseline), so any upward movement computes as an infinite percentage increase.
-
RE: DA/PA against PR
I have come across this many times.
Have a look at this one.
www.becitywise.com/ PR7 DA6!
I tend to trust Open Site Explorer much more when comparing the two.
Greg
-
RE: 301 redirect www.brandname.com to www.brandname-keyword.com
How old is the existing www.xxxx.com website? How many pages are indexed?
If it's a new site with only a few pages indexed, and you are thinking of using www.xxx-toys.com instead, then it will be a fairly hassle-free process.
When you're dealing with many pages, it can be a bit more time-consuming and complicated.
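On Apache, the whole-domain move might look like this in the old site's .htaccess (a sketch, using the placeholder domains from above):

RewriteEngine On
# Send every request on the old domain to the same path on the new one
RewriteCond %{HTTP_HOST} ^(www\.)?xxxx\.com$ [NC]
RewriteRule ^(.*)$ http://www.xxx-toys.com/$1 [R=301,L]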
Greg
-
RE: Similar sites on same IP address
Having each site on a unique IP isn't necessary.
The only real benefit of having your sites on separate C-class IPs is to break the relationship between them. In your case, sorting out the duplicate content issue should be enough to resolve the penalty.
The only downside, IMO, would be the low link juice value of linking between all three sites.
Greg.
-
RE: Caps in URL creating duplicate content
www.url.com/abc and www.url.com/ABC are two completely different pages according to Google.
I would redirect any and all pages with capitals to the corresponding lower-case URLs.
Don't worry about the link juice, as it will pass over via the redirect. It will also be much better than having two identical pages competing with each other (in Google's eyes).
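For a handful of pages, a one-off rule per URL in .htaccess does the trick (a sketch, assuming Apache, using the example URLs above):

Redirect 301 /ABC http://www.url.com/abc

Redirecting capitals in bulk needs a RewriteMap defined in the main server config (it can't be declared in .htaccess), so ask your host if there are hundreds of them.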
Greg
-
RE: Duplicate Page Content Report
Are all 2,000 pages 404s?
Do they all have unique URLs?
If all 2,000 are 404s, then Mozbot would pick these up as duplicate content as well as 404s; but if it's only reporting duplicates and not all of them are 404s, then other pages must also be duplicates.
1.) Confirm all duplicate pages are 404s.
2.) Do a scan using Xenu's Link Sleuth and see which pages are linking to 404s.
-
RE: Duplicate Page Content Report
Do you have any 404 errors in your report?
Usually 404 pages are seen as duplicates, so fixing the pages that link to 404s will fix the duplicate issues as well.
Greg
-
RE: Merchant´s data feed for affiliates is the same content as their own website...
It would be great if there was a plugin or feed setting that automatically added the canonical tag for you, but I'm not sure about that.
If they are scraping the content via feeds, then you could include a link to the same page in the content.
Create a link in the content pointing to the page's own URL, so that when people scrape the content, the link will still appear on the other websites, pointing back to the original source. (Do this creatively; perhaps hyperlink the page title in the article, as in the sketch below.)
Your site won't get any penalties as long as it was indexed first.
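Here's a sketch of the page-title trick (the URL and product name are made up):

<!-- The article title doubles as a link back to the original source -->
<h2><a href="http://www.example.com/blue-widget-3000/">Blue Widget 3000</a></h2>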
Greg
-
RE: Merchant´s data feed for affiliates is the same content as their own website...
Hi
You have two options:
1.) Add a rel=canonical tag on their pages pointing to the URL of the original content on your website.
2.) Link to the original content via a URL or text link on their pages.
From Matt Cutts:
We've had a lot of interest in these meta tags, particularly in how the syndication-source tag relates to rel=canonical. After evaluating this feedback, we've updated our system to use rel=canonical instead of syndication-source, if both are specified.
If you know the full URL, rel=canonical is preferred, and you need not specify syndication-source.
If you know a partial URL, or just the domain name, continue using syndication-source.
We've also had people ask "why metatag instead of linktag"? We actually support both forms for the tag, and you can use either. However, we believe the linktag form is more in line with the spirit of the standard, and encourage new users to implement the linktag form rather than the metatag form we originally proposed.
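For reference, here's what the two forms look like in the head of the syndicated page (example.com stands in for the real domain):

<!-- link tag form: preferred when the full URL of the original is known -->
<link rel="canonical" href="http://www.example.com/original-article/" />

<!-- meta tag form of syndication-source: for when only a partial URL or the domain is known -->
<meta name="syndication-source" content="http://www.example.com/" />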
Greg
-
RE: Great Content
In layman's terms, yes, useful, great content is all you need to get links naturally.
That's how search engines work: they expect that if people like a website or an article, they will link to it. If you can satisfy your visitors by giving them what they want, they will link to, like, and share your content (creating brand exposure and more links).
Obviously, it's not as simple as posting a "great" article on your site and then forgetting about everything else.
You still need to promote the article in social channels, blog comments, etc. to spread the word.
In time, as your website/brand grows in authority (by building relationships with related webmasters, bloggers, etc.), Google will rank your articles without you having to do anything, as your domain will have the authority and trust to back them up.
Publish your great content, but then also promote and share it among the influential people in your niche, and over time the links will come.
Just my opinion on the matter.
Greg
-
RE: Not able to find Do follow Link as shown in Seomoz Toolbar
There are 3 external links on the example page.
Two are the FB and G+ links.
The other is the question mark in the CAPTCHA box pointing to Google.com. It's followed, and external.
I hope this solves the mystery?
Regards
Greg
-
Page Rank 7 - Domain Authority of 6 (How does this happen)
I've never seen this before, but what do you guys make of it?
According to Google, it's a PR 7 website.
SEOmoz is reporting a DA of 6 (one link to the domain).
http://www.prchecker.info/check_page_rank.php
Any thoughts as to how or why Google has assigned such a high PR?
According to Whois, the domain was registered on 16/07/12, so it's brand new...
The funny thing is, many people purchase links based on PageRank, so this guy must be making a fortune. (The "Friends" links, I assume, are all paid links.)
Greg
-
RE: Has my 301 home page really worked?
Hi,
I can confirm that your 301 redirect is not working.
However, because http://www.davidclick.com/index.htm has a rel="canonical" tag pointing to http://www.davidclick.com/ it's not causing any duplication issues.
That said, I would recommend you get the index page redirected to the correct home page anyway.
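If the site is on Apache, a sketch of that redirect in .htaccess (the condition on THE_REQUEST avoids a redirect loop when the server maps / to index.htm internally):

RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.htm [NC]
RewriteRule ^index\.htm$ http://www.davidclick.com/ [R=301,L]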
Regards
Greg
-
RE: Google Dropped 3,000+ Pages due to 301 Moved !! Freaking Out !!
If every single page was set to 301 back to your home page and Google has subsequently removed all other URLs from the index, then disable the 301s, make sure the pages are working, and resubmit the sitemap to Google.
Do a scan with Xenu's Link Sleuth to confirm all your pages are working correctly.
Greg
-
RE: Manual Penalty Removed - Recovery Times...
Great news!
As they said, give it a month or two and you'll start seeing your rankings improve.
They probably apply these penalties in bulk and then manually review when reconsideration requests come through. Whoever analysed your profile saw nothing majorly wrong, so they lifted the ban.
My 2c worth.
-
RE: Duplicate Content - That Old Chestnut!!!
Hi Craig,
Search this in Google: "keyword" + "guest bloggers wanted" OR "guest blogger wanted"
Then analyse the authority of each site, and choose the best 10 to submit your articles to.
You could also join the guest posting community at myblogguest.com
Good luck!
Greg
-
RE: Duplicate Content - That Old Chestnut!!!
Do both.
Google wants to see fresh content on your site, but you also need contextual backlinks from other sites.
Do 10 guest posts, and publish 10 on your website.
Greg
-
RE: Redirecting over-optimised pages
Hi,
How bad was the effect on rankings?
Is your objective to get both websites ranking?
If not, why not redirect the old site to the new one?
Redirecting the spammy/duplicate content to the clean pages won't be an issue, in my opinion (a bad link neighbourhood would be a different story), but I'm not sure if this will help lift the penalty.
Perhaps someone else can comment on that?
Greg
-
RE: Website Hit by Penguin Update Rising Back From The Dead?
Link Research Tools by Christoph Cemper has just released a "Link Detox" tool that you can use once in exchange for a tweet.
It highlights the most suspect links (based on a number of factors), making it "quick and easy" to identify dodgy pages with links to your site.
Our websites haven't been hit by any penalties, but there are over 100 suspect links reported that I will manually go through to confirm before sending removal requests where necessary.
http://www.linkresearchtools.com/news/link-detox-clean-backlink-profile/
-
RE: Website Hit by Penguin Update Rising Back From The Dead?
What have you changed on the website other than removing the bad backlinks and getting rid of the porn directory?
Did you send a reconsideration request to Google?
Have you been updating the site with new content?
Has the SEO company been building more links?
From what I have read, it looks like you are starting to recover. There is no way to know for sure unless you give it a month or two. Just keep getting great backlinks with brand terms and check your progress. If all looks good, I would say it's safe to assume the penalty has been lifted.
-
RE: Holy Redirects
It sounds like the pages are in a redirect loop...
Run a crawl using Xenu's Link Sleuth, identify the URLs that are redirecting, and take it from there.
Perhaps do the scan and then report back with an example of why they are redirecting, and we can be of better assistance.
Greg
-
RE: Could google recognize urls in data-attributes?
I loaded the code into a program that renders HTML visually (for HTML noobs like me). Here's a link to the program: http://www.pagebreeze.com/
If you edit the HTML source to include that link and then view it, nothing is displayed.
I'm not that clued up on HTML, but from what I can tell, Google won't see that as a link.
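To illustrate the difference (example.com is a placeholder): the first element below stores the URL in a data attribute, which isn't rendered or treated as a hyperlink, while the second is a real, crawlable link.

<span data-href="http://www.example.com/page/">Read more</span>
<a href="http://www.example.com/page/">Read more</a>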
Greg
-
RE: Could google recognize urls in data-attributes?
After testing that code in an HTML viewer, I would say no, Google won't see that as a link.
Could you give me a real example of the link?
Greg