Best posts made by RobMay
-
RE: Is a 302 useful here?
Yep, I would go with a 301 also. Keep that juice (or as much of it as you can muster); you'll lose a little value in the transfer (roughly 5-15%), but it's worth keeping. Cheers!
-
RE: My Domain rank is falling but my traffic is improving?
Hi Greg,
This could be affected by any backlinks you have established (been building to the domain) or ones that have freely linked to you.
My advice would be to keep working to improve your site. Performance metrics will increase over time if you keep your visitors and clients in mind. Improve the user experience, take care of the technical steps (proper use of 301s, on- and off-page optimization), and keep producing great content; over time your metrics will show themselves off, which is always nice.
Just remember, none of this work happens overnight. It takes time to improve these rankings. As Moz's index is usually updated once per month (if they are running on time), you will be able to track this metric month over month. If you see steady drops, then you might want to start digging.
Again, my guess is that you are on the right track. Don't overthink the metrics behind the site. Look at the analytics data and ask, "How can I improve my site for a better user experience?"
Cheers, Rob
-
RE: Do videos count as duplicate content?
The only ways this could become duplicate content are if you load the video to multiple sites (it's better to host it in one location on your site and share the EMBED code; that way you also get credit for the inbound links), or if you transcribe the content of the video and republish that transcript across multiple other sites. If you just transcribe the content of the video for your own site and leave it as is, you'll be fine.
-
RE: Can links from an old site raise DA for other site? Or just unethical?
You could use this strategy if you wanted. There is nothing wrong with acquiring an ex-competitor's domain, taking ownership, and 301-redirecting it to a site your company uses (perhaps they had a strong brand following?). You might be able to leverage some of that related traffic and turn those visitors into customers. You will want to look at the DA/PA, as well as the number and quality of the backlinks acquired while they owned it. Make absolutely sure it's a clean URL, with clean, related backlinks that aren't tied to bad neighborhoods, because all of that will funnel down to your site via the 301 redirect if you go this route.
I had this happen to a company/client site. Someone stole an expired domain from the main competitor (who was still in business) and re-appropriated it for their own use in PAY DAY LOANS (which had absolutely nothing to do with the original site it was taken from). Then that spam site/company went out of business for whatever reason, and I was tracking the domain to re-acquire it myself (through auction) and hopefully re-use it for the main site. Its profile will probably need a major backlink analysis to see whether it was corrupted by the company that bought it when it expired and used it for spam-related work. Auugggh.
Sometimes these types of strategies do work, but you have to carefully evaluate and weigh the benefits against the cons and the cost. I'll leave it up to you, but that's my 2 cents.
Cheers!
-
RE: Creating new website with possible Url change (301 involved?)
I'm assuming you are moving from an old static HTML site to something along the lines of a CMS such as Drupal or Joomla, based on your new URL structure example above.
Absolutely. If you don't 301 the old page URLs to the new locations and URL names, you will eventually lose all the back-link development you have been working on. Those old URLs will eventually return a 404 error in Google Webmaster Tools, and the link value and 'juice' will be lost.
Plan out an Excel spreadsheet and work to map every page on your site to its counterpart NEW URL. This way you will make sure all your pages get mapped.
You would probably also want to crawl your old domain (before the new site goes live) with a tool like Screaming Frog or Xenu, both of which are free downloads (the best part) and great tools to have. This will help you find and extract all the pages on your site into Excel, ensuring you don't miss any in the mapping process.
I would schedule some time after launch to double-check each URL individually (using the old URLs from the Excel doc from the crawl) and verify that the proper page-level 301s are in place and working correctly.
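If you're comfortable with a bit of scripting, here's a minimal Python sketch of that post-launch check (assuming the Python requests library is installed and that you've exported the mapping spreadsheet to a CSV named redirect_map.csv with old_url and new_url columns; both names are just placeholders):

# Batch-check a 301 redirect map exported from the planning spreadsheet.
# Assumes redirect_map.csv has two columns: old_url,new_url
import csv
import requests

with open("redirect_map.csv", newline="") as f:
    for row in csv.DictReader(f):
        old_url, expected = row["old_url"], row["new_url"]
        # Don't follow the redirect; we only want to inspect the first hop.
        resp = requests.head(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        if resp.status_code == 301 and location.rstrip("/") == expected.rstrip("/"):
            print(f"OK    {old_url} -> {location}")
        else:
            print(f"CHECK {old_url} returned {resp.status_code}, Location: {location or 'none'}")

Anything flagged CHECK is worth opening in a browser by hand, since some servers answer HEAD requests differently than GET.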
Hope this helps you out. You should be in good shape if you follow these steps pre- and post-launch.
Rob
-
RE: How to redirect www vs. non-www in IIS
Hey KJ,
These are good reads/resources to start with:
http://www.mcanerin.com/en/articles/301-redirect-iis.asp
http://authoritylabs.com/blog/solving-canonical-problems/
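If you want a quick scripted sanity check once the IIS rule is in place, here's a small Python sketch (the example.com hostnames are placeholders, it assumes the requests library, and "www" is treated as the canonical version here):

# Confirm the non-canonical hostname 301s to the canonical (www) version.
# Swap example.com for your own domain before running.
import requests

variants = ["http://example.com/", "http://example.com/some-page.html"]

for url in variants:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "none")
    # Expect a 301 whose Location points at the www version of the same path.
    print(f"{url} -> {resp.status_code} {location}")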
Cheers, Rob
-
RE: New site not ranking for it's name
Hey Mark, exactly. Keep your keyword targeting focused, and work in long-tail options to gain a few positions at that level while you build up the brand, the site, the content strategy, etc.
I would use this targeted style on each and every page of the domain to help with consistency as well. You can explore tougher keywords later as the domain gains some authority.
That should help you out and get the domain ranking for its brand. The hyphenated domain will take longer though, based on all my tests. You need to work in signals that authenticate it with the domain/brand.
Cheers!
-
RE: Is there a utility that can tell me what keywords my site already ranks high for?
Hi Gene,
Once you've put this list together from the data gathering, you'll need a tool to run against that list (or lists, if broken out by channel) so you can show the client where they sit in the rankings.
You can use a tool like Moz's rank tracking utility (in PRO), although I find it tedious because you can't build a single report as easily as you can with something like RavenTools (another tool my team uses for site rank audits) or AWR (Advanced Web Ranking).
You could also go with standalone software like AWR (yearly licenses, which can be costly), which isn't cloud-based; I prefer it over any other. The flexibility of the program is great, and the custom reports are fantastic for moulding as needed to export data in multiple formats. You can take it anywhere and, provided you have an internet connection, run it anytime you like.
Looking over the current keyword list they rank for (also looking at Google Webmaster Tools and Bing Webmaster Tools, if you have those set up) is a good place to start. Nice call, Brian!
You might also want to take the list you extract and build out an Excel file using Moz's keyword analysis and difficulty tool to map out the competitiveness and difficulty score for each keyword, so you can organize them in terms of performance (and attribute that to either short- or long-tail keyword traffic). You'll also extract some nice data through the Google API on exact and broad match phrase search volume.
Hope this helps. Cheers.
Rob
-
RE: Changing to a new Google Analytics Profile - Will I lose my history?
Moz should keep the old data in that existing account, but you won't be able to port it over to the new profile/domain tracking (and I don't think you would want to) if it's a new client. I'm guessing you built a 'campaign' for tracking in the PRO account-level tools?
I'm not 100% sure about Moz's integration and whether it can be ported over. I'd also suggest reaching out to the support team with a quick email.
If they don't want to share the old account, get a new one going and track your efforts against the older metrics. In time, you should be able to run comparative data (manually, of course) in Excel to show trending, jumps, bumps, etc.
Cheers
-
RE: Competitor purchased thousands of hidden links to our website... will it hurt rankings?
Hey Nichank,
This is a tough one to answer, as far as whether Google would look negatively on it. My bet is yes. Because the links are hidden, and they're hiding content links using anchor text, my bet, given the recent algorithm changes (Panda and freshness), is that Google will frown on these.
The one good thing that may save you here is that they pointed 'thousands' of these links, which in itself makes the site (or sites) linking to you look like link farms; Google would also detect those and determine them to be valueless to the user, thus nullifying their efforts.
This might also depend on the 'niche' industry you're in, which could play a part in Google's determination and action on such links.
This would make a really great test from an SEO perspective. I think I'll add this one to my books for 2012! If I find any results, I'll make sure to come back with some concrete answers.
Here's a good link I found that supports Google penalizing the site linking to you (because they are using hidden text and links within their pages):
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66353
-
RE: Youtube and twitter
Hey Hawk,
Better give it some time. Your profiles won't just show up in the SERP results because you created a brand account. You need to build up that account with content, videos, and Twitter activity; become active in your niche; and get involved with the people you follow and who follow you. It's not going to happen overnight, and without some work on your part it will probably take longer than you hope. No one can put an exact number on this type of request because there are too many factors. Create a great account, share some equally great things, and stuff will happen.
-
RE: Hyphens in Domain Name
Hi C nature,
OK, there are many schools of thought on this, so many SEOs will have different views; nothing wrong with that. Sure, search engines and recent research suggest that exact match domain names are no longer the be-all, end-all of SEO positioning.
I have done several tests on this, building test sites with exact vs. non-exact match domains based on market research. The exact match almost always (90% of the time) beat out the other versions.
In all my tests, hyphenated domains have taken longer (on average 44% longer) to start ranking and gathering positions for the researched keywords than their non-hyphenated counterparts.
One other thing to consider is how search engines perceive hyphenated domains. They have been known to be used/abused by spammers, which ultimately made them less credible from a trust perspective. That correlates directly with the extra time I mentioned (on average 44% longer) to get ranking; search engines have a tendency to take longer authenticating these domain types than their non-hyphenated cousins.
If you're looking for quicker results, perhaps focused around one or two main keywords, then the exact match domain (non-hyphenated) would be your best bet to build the site/domain around. For that specific keyword or two, it will yield the quickest results over time. Link building, social profiles, etc. will still need to be built out to get signals moving in the domain's favor and establish it within the SERPs. It will be up to you to build a more structured plan around other keywords/terms to target for short- and long-tail search through content development and on-site optimization.
The competitiveness of the target keyword can also play a factor in a non-hyphenated domain's ranking performance. And if the domain includes brand-level keywords, something like coca-cola-softdrinks.com might be difficult (if not nearly impossible) to even begin to rank for, due to the brand authority of the actual Coca-Cola company site. It really depends on the scenario.
If you're looking to build something long-term that could eventually be recognized, your best bet would be to build the site/domain around a 'brand' style domain using your company brand name and optimize around that. It won't be exact match and may take longer, but over time the trust it builds through brand authority will yield better results.
If you think about it, typing in something like your example above (business-broker-alabama.com) would be a real pain in the a#$, LOL. Either the non-hyphenated version or a brand-level domain optimized around that focus would yield a better user experience from the get-go.
I'm thinking about re-doing my test/thesis to determine whether the value of exact match domains has changed in light of recent statements. I think I feel inspired to do it again! Time to start digging.
My recommendation would be either non-hyphenated or brand-level domain building. Stay away from hyphenated and spammy-looking domains.
Hope this helps. Cheers!
-
RE: I am Posting an article on my site and another site has asked to use the same article - Is this a duplicate content issue with google if i am the creator of the content and will it penalize our sites - or one more than the other??
Yep, I agree with Casey: don't go this route at all. Feature the article on one site only and use it to promote your 'awesome' piece of content to the world. Cheers!
-
RE: Competitor purchased thousands of hidden links to our website... will it hurt rankings?
I like this answer! LOL, 'go to war'... if and when your site is affected. I would also log the date you caught these, so if traffic drops happen later you can go back historically and associate them with the inbound link buy someone obviously made.
-
RE: Ranking Fluctuations
What's the domain you are working on, and what keywords are you targeting? Graphs work, but without actionable information it's difficult to look things over and help make a few suggestions.
-
RE: Watermarking Keywords
Nice answer, Simon. I would have said the exact same thing! Avoid at all costs would be my advice.
-
RE: 1 site on 2 domains (interesting situation, expert advice needed)
Sounds like a complex situation, but it really isn't all that hard to discern. This is the approach I would take on the matter. Looking over the initial MOZ Explorer crawl data, it's a close call for sure.
1. First, the old .org domain, which you just recently re-acquired, still has old links pointing to it and good value.
2. The .info domains still don't have or generate as much 'trust and authority' as other TLDs like .com, .ca, and .org. I would seriously consider moving back to the .org TLD.
3. For any links you haven't been able to switch over, or have little to no control over (and that would take a ton of time and resources to get switched to the .info domain): redirect all the pages, and the link value being passed will still count when pooled to the .org domain. That value will still stand, even after losing a little in the 301.
4. If you do decide to use the .org domain, make sure to plan out a seriously detailed 301 redirect map (TLD, subdomains/folders, and all pages) when moving and migrating everything over to the older .org domain. Not taking the time to plan this out would cause very negative ripples in your current and future SEO efforts; this is an area that needs to be watched carefully.
5. Avoid running both sites side by side. That will surely cause duplicate content issues. Choose one domain, redirect all the other value, content, etc. through 301s, canonicals, and migration procedures, and have all the value sitting within one site. Build your marketing, social, and search platform around one site/brand and work from there.
On a side note, looking at your linking data from OSE:
Your anchor text revolves around your brand name almost 80-90% of the time, which isn't all bad, but you might want to start looking at alternative ways to generate links to your site, using some of your product descriptions and through content generation. Try to vary the number of links and the types of link text being used to link to you from other sites. Don't sculpt your links, but rather find ways to evolve your current linking practices.
Hope some of this input helps!
-
RE: What are some strategies to outrank your retailers who use the same page content as you?
Cheers, buddy! We all work together! Gotta love it.
-
RE: "Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
Hey there!
You should try to stay away from sub-domains unless they really serve a purpose for the domain; then different strategies can be put into place. As I don't know whether that's the route you need to take, I'm going to give you an alternate option :).
1. You could always use sub-folders, which in a nutshell would allow you to build links to the domain on many fronts and have them all count.
** NOTE: any links built to sub-domains don't flow link 'juice' into the rest of the site. Those links, built for whatever reason, will only pass value within that specific sub-domain.
2. What I would do is replicate and migrate the structure of the sub-domains into the root domain of the site (www.site.com/subfolder1/) and 301 and rel-canonical all the sub-domain pages and structure to the new locations. That way, all the link juice, value, etc. already established is kept intact, and you simply redirect all that value, trust, and back-links to pages within the domain.
This to me is the best option to relocate the content, improve the domain structure using sub-folders instead of sub-domains, and maintain the back-link profile already built (or existing) on the site/domain URL.
Other factors might give you reasons not to pursue this option, but I have always had success with it on large enterprise sites when restructuring the way a domain handles sub-domains.
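As a rough illustration of the mapping step only (the hostnames and file names are made up, and every real migration needs hand-checking), here's a Python sketch that takes a crawl export of sub-domain URLs and writes out the proposed sub-folder equivalents so you can build the 301/rel-canonical map from it:

# Turn crawled sub-domain URLs into proposed sub-folder URLs,
# e.g. http://blog.site.com/post-1 -> http://www.site.com/blog/post-1
# Assumes urls.txt holds one sub-domain URL per line.
import csv
from urllib.parse import urlparse

ROOT = "www.site.com"  # hypothetical canonical host

with open("urls.txt") as src, open("redirect_map.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["old_url", "new_url"])
    for line in src:
        old = line.strip()
        if not old:
            continue
        parts = urlparse(old)
        sub = parts.hostname.split(".")[0]   # "blog" from blog.site.com
        new = f"{parts.scheme}://{ROOT}/{sub}{parts.path}"
        writer.writerow([old, new])
        print(old, "->", new)

The resulting CSV doubles as the checklist for verifying each redirect after the move.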
Cheers!
-
RE: 3rd Party hosted whitepapers — bad idea? Duplicate content?
When looking to promote your material through 3rd-party sites, make sure to discuss the option of rel=canonical. Make sure they are OK with crediting the original file's location and URL as the source. Usually they are OK with doing this, as it's in their best interest too.
On a sidebar: also make sure any links embedded in the PDF, whitepaper, infographic, article, etc. are absolute links (if pointing to information on your site); use the entire URL and not just a relative path from the file on your site/server. Scrapers like to 'scrape' content and publish it automatically to spam sites without your or the 3rd party's knowledge, so with absolute links, whatever they scrape will still point back to the internal pages or information referred to in your whitepaper, crediting you with the link, the value, and the original source content. I like to make sure this is always done so I get any credit I can when these black-hat scraper sites do their work and try to re-publish.
-
RE: Converting From Joomla to Wordpress - Worried About Falling Out Of 7 Pack
You'll need a very technical list of stages and tasks, closely monitored, to ensure the best possible chance of success. In fact, in some ways this could be a blessing if you have taken the time to analyze the marketplace, the customer personas, your client's sales funnel, etc. It's obviously a tough call, and you have some difficult recommendations to make to the client. I've been there!
The best thing is to be fully transparent and make sure they understand the implications of making a major move like this and the kind of time and work it will take to ensure a clean transition (well, the cleanest it can be!).
You know they need it, but are worried about all the background work and critical technical steps to ensure a smooth transition to the new site.
Glenn Gabe wrote a great piece detailing some really important steps to take when going down this road! Check it out here.
Hope it helps! Rob
-
RE: "Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
Stay away as much as possible from 4th, 5th, and 6th-level sub-domains, although I have never seen it go beyond 5. I would really try to emphasize the value of re-tooling the domain structure for long-term benefits and linking. Keeping sub-domains running isolates link value and doesn't benefit the entire domain, which makes link building a much harder challenge. You are losing link 'juice' for every level of sub-domain used, as the value drops for each section the domain extends; hence the reason sub-folders are the way to go (as you already know)...
Good luck with the client and site. Sounds like a tough call. All the best, and I hope it works out.
-
RE: 8500+ seomoz errors and still rank one for high traffic keywords
I would check to see if the reporting is accurate on the competitor site/URL. Run checks to see if their duplicate content is actually 'duplicate' or if the reporting tool is just fooling you or mis-reading the data. It's always good to do manual checks.
Do that by grabbing text from the site and placing it in quotes (" ") for an exact-match phrase search. Do you see the actual URL, and a duplicate (say, a Drupal path like www.mysite.com/node/12345) below it? If so, then it's pretty much a slam dunk that duplicate content is affecting that site.
That's just one thing you can do to help research it.
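If you're doing a lot of these spot checks, a tiny Python helper can build the quoted search URL for you (the snippet text below is just a placeholder; paste in a distinctive sentence from the page you're checking):

# Build an exact-match (quoted) Google search URL from a block of page text,
# so you can paste it into a browser and eyeball the results for duplicates.
from urllib.parse import quote_plus

snippet = "paste a distinctive sentence from the page here"  # placeholder
print("https://www.google.com/search?q=" + quote_plus(f'"{snippet}"'))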
-
RE: How do I know what pages of my site is not inedexed by google ?
Hi Sina,
For your first question, make sure you have Google Webmaster Tools set up (which I gather you do, since you have received a 'low quality/spam links' message from them). I should add that dealing with an unnatural link profile flagged by Google is a whole other project, and a super important one to boot, so get on top of that also! Open Site Explorer is a perfect place to start: crawl the links and profile your entire linking domain. From here you can begin to examine the domain's link profile, filtering through the options to identify the links that may be causing that warning from Google. This will need to be rectified in order to ensure solid indexing of your site's pages. You will need to clean these up in order for the rest to work and be effective.
Now, to the indexing issue you asked about. Once you log in to Webmaster Tools and click into the domain, look to the right on the dashboard and you will see a section called SITEMAPS (third on the right in the main panel). Click on the title of this section from the dashboard and you will land on the Sitemaps report. There is a wealth of information here from Google about the indexing health of your site.
There are three steps Google needs to have completed in order for you to find the information you are looking for:
- Crawling
- Indexing
- Ranking (what you see in the SERP results pages, using search terms or Google operators for a site review)
In order to see any results at all, you need to ensure you have a sitemap.xml file built, loaded, and submitted to Google. It also needs to be configured properly and have no errors for proper processing. This is the only way you will get a clear snapshot of what Google has indexed from your XML file. It will tell you how many pages are in their index, but not identify which ones. If you don't have any indexed at all, it will state that.
It's also time to look at your robots.txt and .htaccess files to ensure those are configured and installed properly. This would be another troubleshooting step, but seeing as you have an unnatural link profile, you may want to take those steps first. Also ensure you don't have a meta robots 'noindex' tag applied site-wide.
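If it helps, here's a small Python sketch of that check (the URLs are placeholders and it assumes the requests library is installed); it looks for a meta robots noindex on each page and asks robots.txt whether Googlebot is allowed to fetch it:

# Spot-check a few URLs for a meta robots "noindex" and for robots.txt blocking.
# The URLs below are examples; swap in your own pages.
import urllib.robotparser
import requests

urls = ["http://www.example.com/", "http://www.example.com/some-page/"]

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

for url in urls:
    allowed = rp.can_fetch("Googlebot", url)
    html = requests.get(url, timeout=10).text.lower()
    # Crude string check; an HTML parser would be more reliable.
    noindex = 'name="robots"' in html and "noindex" in html
    print(f"{url}  robots.txt allows Googlebot: {allowed}  meta noindex found: {noindex}")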
So, from here, once you log in to Webmaster Tools (the dashboard for the site you are referring to), under SITEMAPS you will see a section saying XXX pages submitted and XXX pages indexed, along with any errors and warnings you are currently getting in that box (link warnings will be here too!). This will give you some important information which you can log in an Excel file later. This is also where you will most likely see that linking alert from Google.
Now you have Google's 'indexed pages' view. Now you have to dig a little.
----- GOOGLE OPERATORS ----- Once you have some data from Google Webmaster Tools as mentioned above, you can go to Google.com (or whichever Google index you want to check, like .ca or others) and use Google search operators to see specifically which URLs and pages have been indexed by the engine. There are a few different ones you can use, listed below. I found a great resource and copied the relevant parts in.
Domain search with the site: operator
(site:google.com)
This should return results only from the specified domain. You will need to be careful if your site sits on a subdomain (or multiple subdomains); "www" is a subdomain.
Domain search with the inurl: operator
(inurl:google.com)
This should return results that contain the specified domain. They may not be only from the site in question, though! It is possible for other sites to contain your domain name in their URLs (whois.domaintools.com may have such URLs, etc.).
Domain search with the site: and inurl: operators
(site:google.com inurl:google.com)
This way you limit the results to your domain only, and it seems to generate more "reliable" results than the site: operator alone.
Domain and path/query search with the site: and inurl: operators
(site:google.com inurl:/somepath/somedirectory/)
(site:google.com inurl:?this=that&rabbits=lunch)
This way you limit the results to your domain only and focus on a specific directory/folder or set of parameters.
Domain and file type search with the site: and filetype: operators
(site:google.com filetype:html)
This limits the results to those from your domain, and to a specific type of file. Please note: the filetype: operator may not show all files of that type; it may only work for URLs that end in that extension. So if you serve content as HTML but without .html in the filename, those pages will not show in the results.
Domain and path/query search with the site:, inurl: and inurl: operators
(site:google.com inurl:google.com inurl:/somepath/somedirectory/)
(site:google.com inurl:google.com inurl:?this=that&rabbits=lunch)
This permits you to start limiting the results to specific parts of your site if you need to.
Also make sure that your site's pages don't include meta "noindex" or "nofollow" tags in the head section. These would tell Google not to index or follow the pages on your site.
Ensure that you have the proper redirects in your .htaccess file if you find you have duplicate content. Make sure you are 301-redirecting the non-www to the www version of your site and pages (or vice versa), whichever you prefer to have indexed by Google, to ensure clean indexing of the site. This will make sure you don't have site-wide indexing problems.
TO NOTE
---- SERVER LOG FILES ---- Please make sure that you request log files from your hosting company too. If you don't have access to server log files for your hosting traffic, switch hosts! Log and keep an eye on these as well for the information you need. This process is not a fast or easy one and does require some work. Don't get lazy; this is a crucial step to keep an eye on.
What I recommend next is to start keeping log files, if you aren't already, and to track them on a weekly or monthly basis (whichever is easier). The reason is that once you get indexed by Google, you always want to keep an idea of what is indexed and what isn't (dropped or de-indexed pages). This can also help identify early problems (or penalties) from Google if you see trends happening day over day or week over week.
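As a starting point for that log review, here's a rough Python sketch (it assumes a standard combined-format access.log sitting locally and simply string-matches on the Googlebot user-agent, so treat the output as an approximation):

# Count Googlebot requests per URL from a combined-format access log.
# Assumes a local file named access.log; adjust the path and parsing as needed.
from collections import Counter

hits = Counter()
with open("access.log", errors="ignore") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        parts = line.split('"')
        if len(parts) < 2:
            continue
        request = parts[1].split()   # e.g. ['GET', '/about/', 'HTTP/1.1']
        if len(request) >= 2:
            hits[request[1]] += 1

for path, count in hits.most_common(20):
    print(f"{count:6d}  {path}")

Run it weekly and compare the top URLs over time; pages Googlebot suddenly stops requesting are the ones to investigate first.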
Hope this helps point you in the right direction. Remember, don't be lazy here; exhaust all options to identify your problems! Cheers,
Rob
-
RE: How is this possible? A 200 response and 'nothing' to be seen? Need help!
I ran a speed test on the domain, as Chris mentioned it was running slow. I did get the domain to load, but it took a lot of time to get a visual of the site's design. Try using the following tool to run some speed tests and determine where things might be slowing down (host, server, number of files loading, number of image files loading, quality of images, resolutions, remote files and CSS scripts, etc.). It could be any number of things, but this is a good place to start investigating. Just enter your domain here and run the test.
It will also help you identify areas that might need looking at to speed things up. Hope this is a good jumping-off point.
Cheers
-
RE: Not ranking in Google - why???
Just optimize your pages to focus on INTENT now, rather than specific keywords, thus reducing the chance of keyword cannibalization. Have an idea where you are going with each page, but really narrow down and focus the effort. It's not all that long, but Rand did an excellent write-up on this (it's from 2007) that's worth exploring to get the basics down. There is also a Whiteboard Friday from March this year where he touches on it. Anyway, hope it helps!
-
RE: GWT and html improvements
Had a few minutes and wanted to help out...
Google doesn't always crawl/index the same number of pages week over week, so this could be the cause of the differences you are seeing in the report. As well, if you are working on the site and making changes, you should see these numbers improve (depending on site size, of course; enterprise sites might take more time to go through and fix up, so if your site is huge the numbers might look like they're holding steady).
To help with your 301 issue, I would definitely look up and download Screaming Frog's SEO Spider. It's a great tool for identifying potential problems on the site, and it's very easy to download and use. It might take some getting used to, but the learning curve isn't steep. Use it a few times to help diagnose problems, or to watch the things you are working on improve across multiple crawls. It will also let you see other things that might not be working, so you can start planning fixes there too.
As well, make sure to review your .htaccess file and how you have written your 301s. If you are using Apache, this is a great resource to help you along. Read that 301-related article here.
Make sure to manually check all 301 redirects using the data/URLs from the Screaming Frog crawl. Type them in and visually confirm that you get redirected to the new page/URL. If you do, it's working correctly, and I'm sure it will only be a matter of time before Google updates its index and displays the right URL. You can also use a redirect checker to verify your 301 redirects with the old URLs and see how they perform (here).
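If the manual spot checks get tedious, here's a quick Python sketch (requests library assumed; the old URL is a made-up example) that prints the whole redirect chain for an old URL, which makes chained 301s or accidental 302s easy to spot:

# Print the full redirect chain and final status for an old URL.
import requests

old_url = "http://www.example.com/old-page.html"  # hypothetical old URL

resp = requests.get(old_url, allow_redirects=True, timeout=10)
for hop in resp.history:   # each intermediate redirect response
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print(resp.status_code, resp.url, "(final)")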
Hope some of this helps get you working, testing, and fixing! Keep me posted if you are having trouble or need someone to run a few tests from another location.
Cheers!