If the new website is really new, I suggest testing to see what happens when you redirect from the older site. If the older site's pages don't have rankings or any authority, I wouldn't bother either way.
Posts made by rishil
-
RE: Redirect issue launching duplicate product categories on another TLD
-
RE: Google Manual Action Disappear
I have filed many recons at once for a domain without losing the button. If your manual penalty check shows no manual actions, I wouldn't be worried. I haven't heard of them doing that, and I have dealt with a lot of penalties.
Like I said - keep checking. If they WERE annoyed with your multiple submissions, give it 2 weeks - the normal timeframe they ask you to wait before resubmitting a reinclusion request.
-
RE: Inbound links acquired link to some domain prior to redirecting back to us - is it worthy ??
That answers a fair bit.
Those posts are probably paid for, and/or the agency wants to simply track the traffic it drives through those links - those are affiliate tracking parameters with SEO-friendly redirects, and probably a risk to your business, in my opinion, the way they are set up, because affiliate-type links are kind of against Google guidelines... You can see some affiliate services offering the same solution here: http://www.postaffiliatepro.com/features/affiliate-link-styles/
I would ask for them to be changed to normal links if it were my business.
-
RE: How to Interpret Rankings Decline
Have you manually verified the rankings drop where AWR shows visibility is down?
-
RE: Google indexing
Frankly, I wouldn't worry about it. Unless your journal entries are really poor and not unique in any way, I would just carry on. If you ARE really concerned, you can noindex/follow them or, better still, move your journal to an external blogging platform if it doesn't fit the theme of the site - something like Tumblr might work well for you, by the sounds of it.
-
RE: No follow links on a blog
Tags are basically lists of posts with a common theme - some sites automatically create tags, but commonly bloggers would add these into the CMS.
Ideally, tags are kind of an extension of "categories" and are often presented similarly, but while posts appear in one or two categories, tagged posts can appear on many tag pages. As such, tag pages can result in a lot of duplicate content on a blog, and for many years now I and others have been advocating noindexing tag pages (good SEO plugins will do this for you) and nofollowing links to tags - which sounds like the type of implementation you have.
Unless you are 100% sure that your tag pages don't create too much duplicate content, I would leave them nofollowed, as I have seen them do more harm than good.
-
RE: Inbound links acquired link to some domain prior to redirecting back to us - is it worthy ??
It turns out that the anchor text from those articles (<a href=""></a>) is linked to a subdomain (complex long text URL) of that agency's site before redirecting to us.
Typically, if they paid for those links and they were done purely for PR purposes, then this is not a bad way to do it, as long as the links are either nofollowed or pass through a noindexed tracking script (which is what this sounds like).
If those links WERE gained through legitimate PR and were for SEO, then this is fairly odd behaviour - BUT if the links aren't nofollowed and the redirect from their tracked URLs is a legitimate 301 redirect, not a noindex/nofollow URL, then you are likely getting some benefit.
I am put off a little bit now and want to know if that was a rip-off. Could anyone tell me if those links are worthy for SEO for us?? What should I do now?
Without seeing the URLs it's hard to say, but it sounds like you had those links built for SEO reasons, in which case I would find the current method fairly odd, to say the least. Like I said, there is a way these can pass authority, but I would need to check the exact URLs to determine that. I would suggest asking them the question first and maybe posting their response here?
-
RE: Google Manual Action Disappear
I have seen this ONCE only, and at that time the penalty was revoked without a message. Keep an eye on the traffic and see if it increases, and periodically check the manual actions checker.
-
RE: How to Interpret Rankings Decline
Have you broken down your analytics by the same regions?
Do you have GWMT set up? If you do, then you can also check GWMT impressions by region to see if there is a real dip.
There are other tools on the market that may measure your visibility better, but without looking at the site, there could be a host of explanations.
-
RE: URL Help
www.example.com/?key=value1&key2=value2
www.example.com/?key2=value2&key=value1
As long as the URLs can be reached with a status 200 on either permutation, each would be seen as a distinct URL.
From an SEO standpoint, it depends on the semantic setup and content variation.
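A minimal sketch of the point above in Python (the URLs are the hypothetical examples from this post): the two permutations are distinct strings to a crawler, but sorting the query parameters gives you one canonical form you could reference in a rel canonical tag.

```python
# The same parameters in a different order produce distinct URLs unless
# you normalise them (e.g. when generating a canonical URL).
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def normalise(url):
    """Sort query parameters so all permutations map to one canonical URL."""
    parts = urlparse(url)
    params = sorted(parse_qsl(parts.query))
    return urlunparse(parts._replace(query=urlencode(params)))

a = "http://www.example.com/?key=value1&key2=value2"
b = "http://www.example.com/?key2=value2&key=value1"
print(a == b)                        # False - crawlers see two URLs
print(normalise(a) == normalise(b))  # True - one canonical form
```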
-
RE: Google reconsideration request processed - but same story.
Any chance you can screenshot and redact the domain - I am not sure that this is being read right...
-
RE: Quickview popup duplicate content
Disallow in robots.txt would work - especially if the JS is actually rendering a pop-up PAGE, not a snippet existing on the page. If possible, I would prefer the use of rel canonical on those pop-up pages - Google is funny sometimes and indexes URLs even if you disallow them in robots.txt.
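If it helps, here is a minimal sketch of how you might verify the rel canonical on those pop-up pages, assuming Python with the requests and beautifulsoup4 libraries installed (the pop-up URL is hypothetical):

```python
# Check whether a quick-view pop-up URL declares a canonical back to the
# main product page.
import requests
from bs4 import BeautifulSoup

popup_url = "http://www.example.com/quickview/product-123"  # hypothetical URL
html = requests.get(popup_url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
canonical = soup.find("link", rel="canonical")
print(canonical.get("href") if canonical else "no canonical tag found")
```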
-
RE: Multiple 301 redirects for a HTTPS URL. Good or bad?
My personal rule of thumb - as few redirect jumps as possible. Three main reasons:
1. User journey + browsers - sometimes when there are too many redirects taking place, some browsers find it difficult to follow through and will simply not load the page. Also, even if there were only 2-3, the page may load, but users on slower connections may find it tiresome waiting for content.
2. As ThompsonPaul highlights, you COULD lose some link value due to dilution through 301 redirects.
3. Multiple 301 redirects are often used by spammers, and I foresee these causing a lot of ranking headaches in the near future. The older the site, the longer the chain might end up - for example, imagine you had a product at:
https://domain.com/product1
Links to that page exist at domain.com/product1. The journey would be: domain.com/product1 > http://domain.com/product1 > https://domain.com/product1
Now imagine a year down the line, product 1 is discontinued and you decide to redirect https://domain.com/product1 to domain.com/product2.
Imagine your journey now:
domain.com/product1 > http://domain.com/product1 > https://domain.com/product1 > domain.com/product2 > http://domain.com/product2 > https://domain.com/product2
This could carry on indefinitely over the lifetime of the site...
Best solution: decide which version of the site you want to use and try to use only one redirect, not a chain. Periodically check for chained redirects and resolve them as you go along. (I try to do this bi-annually - see the sketch below.)
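For the periodic check, something like this works - a minimal sketch assuming Python with the requests library (the product URL is the example from this post):

```python
# Follow redirects hop by hop and print the full chain, so chains longer
# than one hop can be collapsed into a single 301.
import requests
from urllib.parse import urljoin

def trace_redirects(url, max_hops=10):
    """Return the chain of URLs reached by following redirects manually."""
    chain = [url]
    for _ in range(max_hops):
        resp = requests.get(chain[-1], allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            break
        # Location may be relative, so resolve it against the current URL.
        chain.append(urljoin(chain[-1], resp.headers["Location"]))
    return chain

for hop in trace_redirects("http://domain.com/product1"):
    print(hop)
```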
-
RE: Parent pages and seo
Frankly, it depends on the depth you want to go to, your CMS, and its automation features.
So take for example your page: www.mysite.com/dog-training/whistle-training/
An automated title tag and meta description setup would take its data from the URL, page titles, and categories:
Whistle Training for Dogs > Dog Training Guides > MySite.com
Whereby you have the rules:
{Page Title} for Dogs > {category URL} Guides > Mysite.com
In such instances it is better to use clever setups that work with most keyphrase combinations; the added "Dog Training" in the title, URL and page description helps strengthen the overall site or section focus when it comes to the primary keyword, such as "dog training".
So in the example Richard has given above:
Title: Whistle Training for Dogs > Dog Training Guides > MySite.com
Title: Triball Training for Dogs > Dog Training Guides > MySite.com
Title: Fetch Training for Dogs > Dog Training Guides > MySite.com
This shows that you have plenty of "Dog Training" pages in that category and you can start to see how the repetition would help the section on the site.
However, if you DON'T have automation, I prefer a flatter hierarchy:
You can still have the same title tag set up to strengthen the relevancy.
The reason I prefer the flat URL hierarchy is that in your case the nested hierarchy works, but on many sites a page could belong to two categories, which can cause havoc or multiple URLs when you try to fit the category into the URL.
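To make the automation rule concrete, here is a minimal sketch in Python - the template string and field names are hypothetical stand-ins for whatever your CMS exposes:

```python
# The title rule from above: {Page Title} for Dogs > {Category} Guides > MySite.com
TEMPLATE = "{page_title} for Dogs > {category} Guides > MySite.com"

pages = [
    {"page_title": "Whistle Training", "category": "Dog Training"},
    {"page_title": "Triball Training", "category": "Dog Training"},
    {"page_title": "Fetch Training",   "category": "Dog Training"},
]

for page in pages:
    print(TEMPLATE.format(**page))
# Whistle Training for Dogs > Dog Training Guides > MySite.com
# Triball Training for Dogs > Dog Training Guides > MySite.com
# Fetch Training for Dogs > Dog Training Guides > MySite.com
```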
-
RE: Product titles
A clever design team can help neaten up the look, but I would try more descriptive text for product titles where possible, without going overboard and keyword stuffing.
-
RE: Is there a Penguin Time Limit
No, it doesn't. You need to disavow and remove as many bad links as possible. Unlike a manual penalty, which has a time-limit reset factor, Penguin is algorithmic, so the filter will keep your site down if you don't do anything about it.
-
RE: Significant organic traffic increase from outside of my service area
If you switch the page then there is a slight risk, although you are doing it for the right reason. One of the better and safer ways of doing this is to serve an image or a small block of text that directs users to the better-matched section, and only geo-serve that content block, not the whole page.
-
RE: April Google Update?
So here is your issue:
Run a review and start removing and disavowing all the bad links. A lot of penalties went out in April, which is why MozCast was so volatile; there wasn't a major update IMHO.
-
RE: Google Places/Affiliate/Partner Site
If you decide to go down the route of creating a fictitious address, you may find it hard to get a real one approved if you get caught out. They are getting stricter. In the past you could use a mailbox if you didn't have a real place.
-
RE: April Google Update?
Have you checked the penalty check tool in Webmaster Tools?
-
RE: SEO: .com vs .org vs .travel Domain
Personally I prefer the .com versions as they can be future-proofed. It's hard to rank a dot travel domain. Dot orgs rank OK, but I think they lose their branding value. If you are building a long-term, pure white hat strategy, I would focus on a decent dot com. And register the dot org and dot net versions too.
-
RE: Product titles
Do you mean on-page product titles?
It may look nicer, but you lose keyword real estate. Where possible, I would still look to use keywords.
-
RE: Alternative Link Detox tools?
If you are looking specifically for link analysis tools then a pretty good alternative is http://linkrisk.com/
I have managed to get many penalties overturned based solely on using them as an analysis tool.
-
RE: Local site under generic domain
You could get the original ccTLD back up with a single page asking people to visit the .com - linking to the .com version with a nofollow instead of the 302.
I have seen penalties pass through redirects, and ideally I would not redirect a penalised domain to your main site without some in-between padding to halt the flow of the penalty.
-
RE: Redirect issue launching duplicate product categories on another TLD
So my question is, should we redirect all the old product categories that we are shutting down to the new website on another TLD where we are opening them again and the same for the products (e.g. superstar.dk/garmin -> xxx.com/garmin)? Or would it be better to keep the redirects within the same website/TLD (e.g. superstar.dk/garmin -> superstar.dk)?
Historically my answer would have been straightforward: redirect to the new site. However, with the changing rules in search engines I would advise caution. Does your new website have stable rankings? Is it worth the risk of 301s from another site?
My opinion is that using 301s to rank across domains borders on a grey area and could lead to a penalty in the future.
-
RE: How can it be possible
Looks like it's part of a dropped domain network - if you look through the history, you will see the trend in its backlinks. It's not a particularly difficult keyword to rank for, to be honest. I think 10-15 decent backlinks would get you there.
-
RE: Duplicate exact match domains flagged by google - need help reinclusion
They have been attacked time and time again - it's not happened yet, and MC keeps saying they are looking at it. Quite a few took a bump when I put that comment out, but weeks later most were back in the index.
-
RE: Google changing case of URLs in SERPs?
Two questions: do the lowercase URLs work? (i.e. have you set up rules to allow them on your system?)
Could you ping me an example? This is a policy of Google AdWords; however, I have yet to come across one in SERPs...
-
RE: How much pain can I expect if I change the URL structure of the site again?
If you are seriously looking at changing CMS, then wait till you are closer to implementation and run it all in one go...
-
RE: 301 redirect while keeping OLD domain for branding
A rel canonical will solve the duplicate content issue - telling Google that the "real" and "original" content sits on A, not B.
-
RE: 301 redirect while keeping OLD domain for branding
Read up on rel canonical - that will help with:
Keeping the old site live and giving direct-load users the same experience instead of being redirected elsewhere, while transferring all the inbound link juice to the main site... (it also takes care of cross-domain duplicates)
Thats just one option of course...
-
RE: What is the full User Agent of Rogerbot?
Best way to get that info is to hit up Site Support - via http://www.seomoz.org/dp/rogerbot
-
RE: If Google turns down the weight of keywords in domains then what will they be turning up?
I don't think that they will specifically turn anything "up". However, the things to look out for in terms of ranking factors with increasing signal weight are the building of "Brand Perception" and Social Signals...
-
RE: Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I always advise people NOT to use robots.txt to block off pages - it isn't the best way to handle things. In your case, there may be two options that you can consider:
1. For variant pages (multiple parameters of the same page), use rel canonical to increase the strength of the original page and to keep the variants out of the index.
2. A controversial one this, and many may disagree, but it depends on the situation - allow crawling of the page, but don't allow indexing: noindex, follow. That would still pass any juice, but won't index pages that you don't want in the SERPs. I normally do this for search result pages that get indexed... (a quick way to verify this is sketched below)
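A minimal sketch of that check, assuming Python with requests and beautifulsoup4 (the search-page URL is hypothetical):

```python
# Confirm a page carries the "noindex, follow" meta robots directive.
import requests
from bs4 import BeautifulSoup

def robots_directive(url):
    """Return the content of the page's meta robots tag, if present."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "robots"})
    return tag.get("content") if tag else None

# e.g. a search results page you want crawled but kept out of the index:
print(robots_directive("http://www.example.com/search?q=widgets"))
# Expected something like: "noindex, follow"
```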
-
RE: 28,000 links - How to analyse sensibly
Extract the links and use Excel - pivot tables are your friends: http://www.powerpivot.com/
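If you'd rather script it, here is a minimal sketch using Python and pandas, assuming a links.csv export with hypothetical columns source_url and domain_authority - the same pivot logic as Excel, just faster on 28,000 rows:

```python
# Pivot 28,000 links by referring domain: link counts plus average
# authority, sorted so the extremes surface first.
import pandas as pd
from urllib.parse import urlparse

links = pd.read_csv("links.csv")  # hypothetical export of your link data
links["domain"] = links["source_url"].map(lambda u: urlparse(u).netloc)

summary = (links.groupby("domain")
                .agg(total=("source_url", "count"),
                     avg_da=("domain_authority", "mean"))
                .sort_values("total", ascending=False))
print(summary.head(20))
```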
-
RE: Tips For Video Optimization
The best guide out there IMHO is by Yoast http://yoast.com/video-seo/
-
RE: Robots.txt File Redirects to Home Page
1. I wouldn't advise redirecting the robots.txt to the home page. It seems that they have a dynamic 404 redirect system, where a URL that doesn't exist gets redirected to home. There are good and bad points to this strategy; however, I would prefer NOT to do it. (A quick way to check what your robots.txt currently returns is sketched below.)
2. Re getting the site indexed - no, it wouldn't hurt them, but it would give you much less control over the robots directives in case you want to add custom instructions. If Google's crawlers can't get to it (as in, it's not user-agent cloaked to allow Googlebot), you will not be able to do so (e.g. excluding pages from being indexed via robots won't be possible).
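A minimal sketch of that check, assuming Python with the requests library (swap in the real domain):

```python
# Confirm robots.txt is served directly with a 200 rather than being
# swallowed by the site-wide redirect-to-home behaviour.
import requests

resp = requests.get("http://www.example.com/robots.txt",
                    allow_redirects=False, timeout=10)
if resp.status_code == 200:
    print("robots.txt served directly:")
    print(resp.text)
else:
    print("robots.txt returned", resp.status_code,
          "->", resp.headers.get("Location", "no Location header"))
```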
-
RE: 40,000 High Value Links - Sold?
DM me and let's talk privately - I don't want to out anyone in a public forum, and I am glad you didn't. But let me give you a few insights - seems like the company you are talking about is a UK one, by the way...
-
RE: When did the New Google Algorithm Come into Force in the UK
Hi Mathew, a bunch of UK SEOs are keeping an eye out for the UK rollout of the Panda update; however, there isn't any hard evidence to prove that it has been rolled out. In fact, since Mahalo still ranks in the UK, I would say that it definitely hasn't been rolled out. I would look beyond the algo change to investigate your traffic loss, UNLESS your traffic loss was from outside the UK?
Have you run an analytics segment analysis to see where your traffic normally comes from and whether it has dropped regionally?
-
RE: Designed new website with new domain has more than 24 million links over night on seomoz, how has this happend?
That definitely seems to be an issue - as daft as this may sound - could you double-check the URL?
If that's OK, try creating a new campaign and see if the issue replicates - if it does, it may be a bug with the campaign tool and one for the Moz staff to look into.
-
RE: How to get a quick idea of competition for large numbers of keywords
This is my personal opinion, and others may disagree, but I find the more competition on a PPC keyword, the harder it is to rank organically. The other piece of info you get is that the more competition, the better the potential return, as most competitors are honing in on their most profitable keywords.
-
RE: Paul and Angela Style Backlinks
Totally depends on the niche, I am afraid. However, for small businesses I always advise taking the manual route - although old, the strategies here still work:
http://www.seomoz.org/blog/small-business-link-building-part-a-analysing-opportunities
http://www.seomoz.org/ugc/small-business-link-building-part-b-grabbing-the-bull-by-the-horns
-
RE: Designed new website with new domain has more than 24 million links over night on seomoz, how has this happend?
Could you add a screenshot? I may be being thick, but I still don't get where you are seeing this info - a screenshot may help a bit.
-
RE: How to get a quick idea of competition for large numbers of keywords
SEMrush is a pretty good tool outside the SEOmoz and Google toolset...
Alternatively, run a PPC campaign for a day or two on the full set - it will probably give you the most accurate data, instantly...
-
RE: How do you find out what (and when) new websites are linking to your site?
If you are using WP then it's fairly easy, as you get inbound pings.
However, for non-blogging platforms it's usually a good idea to set up a referring sites filter in your analytics; you should be able to spot referrals from sites that link to you, even if it is just a single referral... It doesn't catch them all, but it does catch a fair few...
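For sites where you have server log access, here is a minimal sketch of the same idea in Python - the log path is hypothetical and assumes the common combined log format:

```python
# Scan an access log for referrer domains you haven't seen before;
# a brand-new referrer is often a brand-new link.
from urllib.parse import urlparse

known = {"www.google.com", "www.example.com"}  # domains you already expect

with open("/var/log/nginx/access.log") as log:  # hypothetical path
    for line in log:
        parts = line.split('"')
        if len(parts) < 6:
            continue
        referrer = parts[3]  # the second quoted field in combined format
        domain = urlparse(referrer).netloc
        if domain and domain not in known:
            known.add(domain)
            print("possible new linking site:", domain)
```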
-
RE: Google said that low-quality pages on your site may affect rankings on other parts
Are you syndicating the content for link building? Or are scrapers just pulling in your content?
On the note of tags and archives, in most cases this is best practice anyway. However, it seems you may have been hit by the content update, and as Shailendra suggests, you may want to get the webmaster support people to look into your site.
Without a full analysis it is difficult to say what else is affecting your site.
-
RE: Index forum sites
To start with, if the forum ISN'T indexed, get the URLs changed, and try to get all the titles customised so they aren't generic across every thread.
I suggest a subdomain for forums; as you point out, it is seen as a separate site. You can always flow strength from the forum into the main TLD by cross-linking, which I think is much better to do, though some may disagree with this.
If I were in your position, I would prefer to release section by section - a massive blast of content would be more of a flag than a slow release. This is my personal opinion.
Re threads - I would noindex/follow all thread pages that aren't the first page of the post; that way, the juice flows and gets crawled. Adding rel canonical to those may also help preserve the strength... (a minimal sketch of the rule follows below)
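A minimal sketch of that pagination rule, with a hypothetical helper, just to show the intent: page 1 stays indexable, deeper pages get noindex, follow so the juice still flows:

```python
# Emit the meta robots tag for a thread page based on its page number.
def robots_meta(page_number):
    if page_number == 1:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'

for page in (1, 2, 3):
    print(page, "->", robots_meta(page))
```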
-
RE: Duplicate exact match domains flagged by google - need help reinclusion
Sorry - I had to giggle when I first read your question
1. Google has made changes to the algo that are finding a number of these sites, especially ones with dupe content.
2. They are in the process of downgrading the value of exact match domains.
My advice? If you are adamant on getting these sites back up, take one - customise the template, make sure it's not linked to or linking to any of your others - rewrite the content and make it unique, and then send in a reinclusion request to Google, outlining what you did and WHY you did it...