I think you should share your sites and blogs (with links) to help us get a better picture of what might actually be causing the issue.
Posts made by RobMay
-
RE: Rankings Drop Penguin 2.0
-
RE: 3rd Party hosted whitepapers — bad idea? Duplicate content?
When looking to promote your material through 3rd party sites, make sure to discuss the option of rel=canonical. Make sure they are OK with sourcing the original file's location and URL. Usually they are, as it's in their best interest to do so.
On a sidebar: also make sure to embed any links in the PDF, white paper, infographic, article, etc. with absolute link practices. If pointing to information on your site, use the entire URL and not just a relative path from the file's location on your site/server. Scrapers like to just 'scrape' content and publish it automatically to spam sites, without either your or the 3rd party's knowledge. If the links are absolute, the scraped copies will automatically point back to the internal pages and information referred to in your white paper on your site, crediting you with the link, the value, and the original source content. I like to make sure this is always done, to get any credit I can when these black hat scraper sites get to work and try to re-publish.
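To illustrate the absolute vs. relative point (the domain and path here are made up for the example):

```html
<!-- Relative link: if a scraper re-publishes the document's content elsewhere,
     this path no longer points at your site -->
<a href="/whitepapers/annual-report.html">Read the full report</a>

<!-- Absolute link: points back to your site no matter where the content ends up -->
<a href="http://www.example.com/whitepapers/annual-report.html">Read the full report</a>
```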
-
RE: What are some strategies to outrank your retailers who use the same page content as you?
Cheers buddy! We all work together !! Gotta love it.
-
RE: What are some strategies to outrank your retailers who use the same page content as you?
If he is the one producing the content (from the FDA), it's originally written by FDA-approved staff, and other sites are scraping and using his content, then (provided you are the first to be crawled and indexed with this content) you will be credited with the content's originality. The technical structure of your site will have an impact, as will the social landscape and the way people perceive your brand online.
That's all I wanted to add to that. Don't worry so much about the content if it's your original stuff. If they are ranking higher, take some of my other suggestions: build a company blog, share news and source info about the products, highlight products, specials and promos. Build a community people will want to come back to buy from, share, and support. Build links to the content in the blog on the company site, share the info, get your product out into the right niches and circles, and I think it will all come together with time.
SEO isn't an overnight success, and I can't stress that enough to clients I work with. It takes time and patience to succeed with results that stick.
Cheers!
-
RE: What are some strategies to outrank your retailers who use the same page content as you?
That's pretty straightforward and almost answers itself.
What you need to do is rework all the descriptions of the products you sell (that are/might be duplicated on 1, 25, 50 or 100 other sites). Then work on optimizing the framework of your site (technical work), write great compelling titles and description tags, use H1-H2 well throughout, place copy well, integrate UGC (user-generated content) so customers can write product reviews and notes on current inventory or sales experiences, and of course have some sort of plan sketched out for a social media and marketing push to build awareness and traffic to the site. It'll be difficult. All the while you need to consider some sort of content development strategy (a blog?) which could highlight certain products daily, weekly or monthly. Integrate that into an email campaign to gain new potential sales leads from site visitors, offer promotions, etc. Opportunities are endless here, and sometimes frustrating as you work through and find the right formula that works.
I always find these problems fun to work out. Sometimes I get carried away!
It's not an easy job, I won't lie. I recently worked on a site like that for a client, with over 1000 individual products (and that's small compared to some enterprise-level sites now), and it's a long process. But as you work through it, optimize it, test it, re-test it, and re-work it again and again, you'll find the right formula to get those pages indexed and ranking. Adding a personal touch to descriptions by re-writing them also gives you the opportunity to leverage other potential KWs as points of entry from organic search, while keeping in line with the primary KWs you are trying to target.
Create a great user experience as well. Look to see what the other sites are missing. Look at the site from a 'buyer' perspective: what can you improve on? Image galleries? Sign-ups? A search function to improve product location? I don't think the other sites ranking above you have all that nailed down, unless they are a global player like Target, Walmart, etc. There is always room to improve!!
Hope some of that helps. It should get you started for sure!
-
RE: Punctuation at the Start of Page Titles
1. Remove the punctuation. Although it doesn't really damage search listings or impact how the engines look at your site for rankings, as Chris said, you only have so many characters to work with in the title field, and it's best to really optimize the title to improve end-user experience :)
2. Craft custom titles for each and every page, and consider where you place the KW in the field. Importance will be taken into account, as well as the position and meaning of the KW in relation to the rest of the title. Try mixing things up to see where you impact ranking positions. I would still remove all punctuation (but perhaps keep a few pages that rank now with punctuation, to see if you impact the rankings). See #3 below.
3. Look at choosing a few test pages in the domain to work with, to monitor rankings for this very test, plus analytics data like bounce, exit, click-through, etc.
4. Doing this will also help reveal how the customer reacts to the page once they click in, after they find it in the organic SERP listings. Did the punctuation impact your rankings, and if so, was the click-through higher, while also decreasing the bounce and/or exit rates from said pages? A great experiment and test platform :)
It's not an exact science, but more an art and science mixed together ;). I wish you all the best with this, as it sounds very interesting. Keep us all posted on your findings!!
Cheers.
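As a quick sketch of point #1 (the page and keywords here are hypothetical):

```html
<!-- Before: punctuation wastes some of the limited visible title characters -->
<title>*** Blue Widgets!!! :: Widgets &amp; More ***</title>

<!-- After: clean, keyword-first, brand at the end -->
<title>Blue Widgets for Small Business | Example.com</title>
```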
-
RE: How is this possible? A 200 response and 'nothing' to be seen? Need help!
I ran a speed test on the domain, as Chris mentioned it was running slow. I did get the domain to load, but it took a long time to get a visual of the site's design. Try using the following tool to run some speed tests and determine where things might be slowing down (host, server, # of files loading, # of image files loading, quality of images, resolutions, remote files and CSS scripts, etc.). It could be a number of things, but this is a good place to start investigating. Just enter your domain and run the test.
It will also help you identify areas that might need looking at to help speed things up. Hope this is a good jumping-off point.
Cheers
-
RE: Social Media Tracking
A great place to start. Just remember that tools like this usually come with a yearly or monthly fee, so budget for it. Enjoy! I like using these tools.
-
RE: Question Mark In URL??
Jesse's right. As much as your client doesn't want to upgrade and rework the entire site (build an equivalent in HTML or PHP), there isn't much you can do. This is a full Flash site (locked SWF), and Google isn't going to crawl or index any of the content or information.
You could do a few other things to help the business on a local level:
1. Build up all the social profiles and media needed to support Google local search. Social media, Google+, FB and Twitter should be a good start. Even a LinkedIn profile to support the company and business.
2. Add a WordPress install to the site and build up a blog for content marketing and development. Work to create content around each of these categories and direct users back to the company site. You don't have specific landing page URLs to use and optimize, but it's a cost-effective start if they are unwilling to bend on going the route that would benefit them the most.
I've had clients like this, and it's the hardest thing to tell them that everything they have or are doing is wrong on many levels. It's probably the most sensitive area when dealing with a client you don't want to upset. Tough road ahead for sure.
Cheers!
-
RE: The impact of using directories without target keyword on our Rankings
It all depends on what you want (or are going) to do:
1. Short URLs usually work best with regards to indexing and product correlation (too long means characters get cut off by Google when indexing). Keeping things within a short URL length also helps Google index the full length and get the full value of the URL, using your keywords to reinforce the URL's relevance.
-
Also: having these URLs linked to from the main page will help flow 'link juice' through the site, provided you keep the number of links on the homepage to a minimum and mix in other links that are nofollowed. Links beyond roughly 100 usually won't be crawled by Googlebot.
-
Also: if your URLs are query strings, make sure 301s are set up for any URL that includes a string (?=question123456 or something to that effect), rewriting it to something like www.domains.com/keyword-rich-content. This might be nothing for the site/domain you are working on, or it might be a step that needs to be included in the site's overhaul project work.
2. Longer URLs (like adding directories or sub-folders) can be good too, depending on the product breakdown in your site architecture. It might not be needed, though. If you have hundreds of thousands of products, directories will most likely be needed to sort the data and organize the database being used alongside the CMS. Then you would want to go this route, rather than having an unorganized root directory with thousands of pages in it (even if dynamically generated).
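For the query-string 301s mentioned above, here's a minimal sketch, assuming Apache with mod_rewrite enabled (the file name and ID are made up for the example):

```apache
# Hypothetical .htaccess example: 301-redirect /product.php?id=123456
# to a keyword-rich URL on the same domain.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=123456$
RewriteRule ^product\.php$ /keyword-rich-content? [R=301,L]
# The trailing "?" drops the old query string from the target URL.
```

In practice you'd generalize the condition per product, or have the CMS generate the mapping, but the shape of the rule is the same.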
Each option works in its own way, each with supporting documentation and methods. Just something to consider in helping you steer the SEO sea.
Cheers!
-
RE: Emergency duplicate of website due to DNS failure - how to minimise loss of search engine traffic?
Sorry to hear it. Been there, and had to deal with it also! Read over SEO Redirection practices and the SEO Guide to HTTP Status Codes by Dr. Pete. Both are great learning pieces and articles that will help going forward.
It sounds like a 302 temporary redirect is the best practice if you are only redirecting the site for 4-5 days. Then you can remove the redirect once the site is back up and the DNS failure is resolved.
Cheers!
-
RE: Subdomains or subfolders for language specific sites?
Just something to add, if you want to consider this in your planning scope. My professional opinion is surely sub-folders, staying away from sub-domains (unless needed for a very specific reason).
1. If you are building a multilingual site, you should also consider how you will be building back-links through various methods (content development, company blog, etc) and will those be done in each language to support the authority needed to build rankings in each language you are working to expand on?
2. If you are going to be building different divisions of the domain (Chinese now, and perhaps later French, German, etc.), consider building out TLDs for each language needed and focus on building those separate sites. That way you can focus content development, social media and marketing efforts on each specific language, thus improving your options for search in each country.
Just some thoughts to consider, depending on the scope of your project and sites.
Cheers!
-
RE: If i get link pointing to a specific page , will that link add up to the count of linking root domains too ?
Sub-domains isolate link value and don't allow it to flow into the main domain. It's best practice to use sub-folders on the main domain instead, which allow link value or 'juice' to spread through the site, thus building trust.
Unless a sub-domain could hold its own content and work apart from the primary root, gaining its own trust and value, I would go the route above. Every situation needs to be clearly identified and reviewed.
-
RE: On-Site Directory - Delete or Keep?
If you want to go through the thousands of links you have and identify the ones that are bad, or from bad neighborhoods, it's worth spending the $50 on a report from Link Detox (http://www.linkdetox.com/). This will help you identify things quickly and allow you to make some quick decisions on link profile management. If it's a big project, take a 1-month subscription to the software and make sure to use it fully to identify the entire problem. Don't forget to use the link disavow tool if needed, and if you are looking to keep any valuable links, try to isolate them to contact later and help fix the issue.
If you plan on getting rid of them all - just dump the entire thing and start from scratch, but it'll be a long road back with best practice work.
Hope it helps! It's worth the money and the investment when you run into problems like Penguin or Panda.
-
RE: Cross-Site Links with different Country Code Domains
Thanks! Great point I neglected to mention in detail.
-
RE: Cross-Site Links with different Country Code Domains
Absolutely. Probably your best bet.
On a sidebar note, try to keep the link count on any page to a maximum of 100. Anything above and beyond that usually isn't counted, and the bot won't follow it anyway. The more links on a page, the less value can be attributed to each of them.
If you want to help the link profile of the site, decide which links could be nofollowed and place those tags to help improve the link quality on various pages. This also helps with Google's trust, as you are not trying to 'game' the linking profile or structure of the site to improve various pages' SERP rankings.
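The tag itself is just an attribute on the anchor (URLs here are placeholders):

```html
<!-- Normal link: crawled, and passes value -->
<a href="http://www.example.com/products/">Products</a>

<!-- Nofollowed link: tells the engines not to pass value through it -->
<a href="http://www.example.com/login/" rel="nofollow">Log in</a>
```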
-
RE: Should I put rel=publisher on UGC?
I would surely (and only) use rel=publisher on the professional writer content and tie that into the Google+ profiles to have your listings and content stand out in SERP results, but I would stay away from trying to tie that together in the UGC section. Just my 2 cents, but without regular postings by UGC writers tied into their Google+ profiles, I don't think you would get the value back.
If these writers in the UGC section are frequent, invite them to contribute to the PRO section of the site as 'guest' bloggers or writers. Just some suggestions I would follow myself. Hope it helps!
Cheers!
-
RE: Google showing wrong title
Mike and Jesse are both correct!
Here's an old video from Matt Cutts and Google which might help explain a little further what is happening and why. https://support.google.com/webmasters/answer/35624?hl=en . This talks about the site TITLE and DESCRIPTION elements for Google indexing. Always a good reference point
Another good article I have referenced before is: http://www.blindfiveyearold.com/google-changed-my-title (from 2011)
Thanks!
-
RE: "Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
Haha, great. Thanks for the props. Going 4th and 5th level deep with sub-domains can also impede the user experience when someone wants to reach the site directly (typing it manually is a pain!!).
Thanks anyways, glad I could be of some help.
-
RE: Cross-Site Links with different Country Code Domains
Hey there,
Are you linking these domains just from the main level TLD (homepage), or all the product pages from within each company/country site?
If yes, absolutely! Use nofollow on any of the domain/brand or country-level TLD links to avoid any kind of penalty. Penguin being as picky as it is, you would want to stay away from any of that to stay in a 'safe' zone. I would steer clear and use these best practices for interlinking cross-country domain sites and products.
Work on building your outside links to the site through content development and social media/mentions, while using nofollowed links within the site at the brand and product level.
Thanks, Rob
-
RE: "Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
Stay away as much as possible from 4th, 5th and 6th level sub-domains, although I have never seen it go beyond 5. I would really try to emphasize the value of re-tooling the domain structure for long-term benefits and linking. Keeping sub-domains running isolates link value and doesn't benefit the entire domain, making link building a much harder challenge. You are losing link 'juice' for every level of sub-domain used, as the value drops with each section the domain extends; hence sub-folders are the way to go (as you already know)...
Good luck with the client and site. Sounds like a tough call. All the best, and I hope it works out.
-
RE: "Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
Hey there!
You should try to stay away from sub-domains unless they really serve a purpose for the domain; then different strategies can be put into place. As I don't know if that's the route you need to take, I'll give you an alternate option :).
1. You could always use sub-folders which in a nutshell would allow you to build links to the domain on many fronts and have them all count.
** NOTE: any links built to sub-domains don't flow link 'juice' into the main site. Those links, built for whatever reason, will only pass value within that specific sub-domain.
2. What I would do is replicate and migrate the structure of the sub-domains into the root domain of the site (www.site.com/subfolder1/), then 301 and rel=canonical all the sub-domain pages and structure to the new locations. That way, all the link juice, value, trust and back-links already established stay intact and are simply redirected to pages within the domain.
This to me is the best option to relocate the content, improve the domain structure using sub-folders instead of sub-domains, and maintain the back-link profile already built (or existing) on the site/domain URL.
Other factors might be reasons not to pursue this option, but I have always had success with this on large enterprise sites when restructuring the way domains handle sub-domains.
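A minimal sketch of the sub-domain-to-sub-folder redirect in #2, assuming Apache with mod_rewrite (host and folder names are placeholders):

```apache
# In the sub-domain's .htaccess (or vhost): 301 every page of
# sub1.site.com to the matching path under www.site.com/subfolder1/
RewriteEngine On
RewriteCond %{HTTP_HOST} ^sub1\.site\.com$ [NC]
RewriteRule ^(.*)$ http://www.site.com/subfolder1/$1 [R=301,L]
```

Paths map one-to-one, so sub1.site.com/page.html lands on www.site.com/subfolder1/page.html.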
Cheers!
-
RE: Turning off a subdomain
Sounds like the indexing issues are causing some drops in ranking, even though good base content and domain authority are present.
Also, the .v1 site looks to be a testing platform? Could that be possible? I recently had an issue with an enterprise client site with very similar symptoms: multiple testing versions of the domain up and indexable, causing massive amounts of duplicate content and indexing issues.
I would plan to assess any content that could be migrated over to the main site from the .v1, and 301 redirect (and rel=canonical) the old .v1 site pages. Keep those in place for a few months to ensure that all the value of the 301s takes effect.
When migrating some (or all) of this valuable content over, just make sure you use properly executed 301 redirects, and to take it a step further, apply the canonical tag on the .v1 pages pointing to the existing and correct pages on the main domain. This way, we know for sure all value is being passed.
SIDE NOTE: having that many pages and that much indexed content doesn't mean the site will do well. In fact, with this poor setup, the site's massive number of page URLs might be causing more damage. Too many pages with bad page-quality scores can and will bring a site down. Plan to migrate the pages or sections of the site that hold the most value to the main domain, 301 and rel=canonical the others, and remove the bad pages with little to no value that may be causing site-wide damage in search indexing.
When dumping lots of content from the site, redirect those URLs (being dumped) to a helpful 404 page, which will try to salvage any user hitting the page and direct them back into sections of the site. Also, make sure that page has a clear 'search' option that allows them to search for whatever they might have tried to land on through organic indexed content.
Finally, once you see indexing improve and those pages replaced in the SERPs through reporting in the weeks or months to come, you can shut down the old .v1 pages without fear of losing any value you had.
It's a lengthy process and a big project, but the client (and site) should see huge value in the time you are taking to manage it. It will maintain value for the site in the long run and help build a better platform going forward.
Cheers!
-
RE: 1 site on 2 domains (interesting situation, expert advice needed)
Sounds like a complex situation, but it really isn't all that hard to untangle. This is the approach I would take on the matter. Looking over the initial Moz Explorer crawl data, it's a close call for sure.
1. First, the old .org domain, which you just recently re-acquired, still has old links pointing to it and good value.
2. The .info domains still don't generate as much 'trust and authority' as other TLDs like .com, .ca, .org. I would seriously consider moving back to the .org TLD.
3. For any links you haven't been able to switch over or have little to no control over (and which would take a ton of time and resources to get switched over), redirect the pages, and the link value being passed will still count when pooled to the .org domain. The value will still stand, even if some is lost in the 301.
4. If you do decide to use the .org domain, make sure to plan out a seriously detailed 301 redirect plan (TLD, sub-domains/folders and all pages) when looking to migrate data over to the older .org domain. Not taking the time to plan this out would cause very negative ripples in your current and future SEO endeavors. This is a sensitive area and needs to be watched carefully.
5. Avoid running both sites side by side. This will surely cause duplicate content issues. Choose 1 domain, redirect all the other value, content, etc. through 301s, canonicals and migration procedures, and have all the value sitting within one site. Build your marketing, social and search platform around one site/brand and work from there.
On a side note, looking at your linking data from OSE:
Your anchor text revolves around your brand name almost 80-90% of the time, which isn't all bad, but you might want to start looking at alternative ways to generate links to your site, using some of your product descriptions and through content generation. Try to vary the amount of links and the types of link text being used to link to you from other sites. Don't sculpt your links, but rather find ways to evolve your current linking practices.
Hope some of this input helps!
-
RE: Keep the blog separate or incorporate into main domain?
Absolutely! This is probably a priority project in my eyes if you are serious about building this up and going in the right direction now.
Moving the blog to your domain.com will not only improve its relation to your site and content, but brand it as well. Having the blog on another domain (as it is now) might help you from a link-building perspective, but more than likely it isn't helping very much, since Google knows you own that domain and perhaps places less emphasis on the inbound links to the main site.
First, set up your site/domain to host the blog as you mentioned: www.domain.com/blog. There are a few reasons for this.
1. It will allow you to reap the rewards of the content you build (supporting your site/company/mission) and support the main domain.
2. The content you build, share and move into the social sphere will allow inbound links to be shared across the entire site/domain. (Don't build it into a sub-domain like blog.domain.com, as that is considered a completely separate domain and won't pass any link value and juice across the domain. Sub-domains are considered domains in themselves in the eyes of a search engine like Google.)
3. The content you build (and load daily to the blog) will keep search spiders coming back for more.
4. Keeping the blog content on the main domain allows you to share it with web visitors who might be searching via brand or specific terms, helping associate the brand with the content marketing resource you are building. This in itself is a gold-mine, as it will also act as another source of long-tail traffic opportunity, but you'll have to do your due diligence on the research side to capitalize on the traffic.
5. Content marketing is going to be BIG (it's already on the rise and exploding now). This will all fall under your efforts for the blog and should be focused on. The value here is that over time, Google will begin to apply TRUST and AUTHORITY factors to the content you write and submit, helping to support your brand as a quality resource of shared information.
Make sure to use rel=canonical on the old location URLs, pointing them over to the new URLs under domain.com/blog. Also make sure to use rel=author in the meta markup for each of the articles on the new blog.
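In markup terms, that's just two link elements in the <head> (the URLs and profile ID here are placeholders):

```html
<!-- On the OLD blog post URL: point engines at the post's new home -->
<link rel="canonical" href="http://www.domain.com/blog/sample-post/" />

<!-- On the NEW post: tie authorship to the writer's Google+ profile -->
<link rel="author" href="https://plus.google.com/112233445566778899000/" />
```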
NOTE: Handling a migration such as this is very complex (especially if the BLOG is extremely large with thousands, or 10's of thousands of posts and articles). Even a few hundred can be TRICKY!
Not only do you have to set up the files (blog) on the new domain, but you will have to write and execute 301s for every single article on the old location/domain and point them over to the new one. Some CMSs like Drupal can assist with this if your programmers can handle writing the scripts. This is a very technical undertaking, so make sure to do your research. If it's a small resource, you can still follow the same protocols, but it will take much less time to complete.
I recently handled and oversaw a technical project like this a few months back. It took quite a long time to do and troubleshoot, as it had over 30K articles and 25K image files built up over the past 6 years! It was a huge undertaking that went extremely well, but you have to be patient.
Hope this helps some! There's probably a little more I didn't mention, as I'm late for a meeting with a client, so if you have questions, let me know!
Rob
-
RE: Why do most Local Directories turn around and lie and try to steal your clients?
I personally love hearing from clients on things like this. They trust you enough to contact you and say, "Hey, we got this email, but it isn't making sense... looking over our ranking reports and based on our site's performance, we are doing well in Google (or any other engine)..."
If your clients have been working with you for some time, you have built up trust with them based on work, results, rapport, etc. I wouldn't worry too much about these trolling directories.
If your clients do in fact believe what they read in random spam emails, then it's probably better to let them go. It'll be less of a headache in the end.
Nice rant tho! Makes total sense to get annoyed.
-
RE: Why do most Local Directories turn around and lie and try to steal your clients?
Too funny, but also totally true!
-
RE: Recommended Custom Reporting Tool?
Harmony, I'm not exactly sure what you mean. Are you looking for a 'ranking tool' which you can use to report search positions to your clients? It wasn't very clear, so I thought I would reach out.
-
RE: Hyphens in Domain Name
Hi C nature,
OK, there are many schools of thought on this, so many SEOs will have different views. Nothing wrong with that. Sure, search engines and research now suggest that exact-match domain names are no longer the be-all, end-all of SEO positioning.
I have done several tests on this, building test sites that are exact vs. non-exact match domains based on market research. The exact match beat out the other versions 90% of the time.
All my hyphenated-domain tests have taken longer (on average 44% longer) to start ranking for the researched KWs than their non-hyphenated counterparts.
One other thing to consider is how search engines perceive hyphenated domains. They have been known to be used/abused by spammers, which ultimately made them less credible from a trust perspective. This correlates directly with the extra time I mentioned (avg. 44% longer) to get ranking: search engines have a tendency to take longer authenticating these domain types vs. their non-hyphenated cousins.
If you're looking for quicker results, perhaps focused around 1-2 main keywords, then the exact-match (non-hyphenated) domain would be your best bet to build the site around. For that specific keyword or two, it will yield the quickest results over time. Link building, social profiles, etc. will still need to be built out to get signals moving in the domain's favor and establish it within the SERPs. It will be up to you to build a more structured plan around other keywords/terms to target for short- and long-tail search through content development and on-site optimization.
The competitiveness of the 'target' keyword can also play a factor in a non-hyphenated domain's ranking performance. And if the domain contains brand-level keywords, something like coca-cola-softdrinks.com might be difficult (if not near impossible) to even begin to rank for, due to the brand authority recognized online of the actual Coca-Cola company site. It really depends on the scenario.
If you're looking to build something long term that could eventually be recognized, your best bet would be to build a site/domain out with a 'brand' style domain using a company brand name, and optimize it around that. It won't be exact match, and may take longer, but over time the trust built through brand authority will yield better results.
If you think about it, typing in something like your example above (business-broker-alabama.com) would be a real pain in the a#$, LOL. Either the non-hyphenated version or a brand-level domain optimized around the same focus would yield a better user experience from the get-go.
I'm thinking about re-doing my tests to determine whether recent statements on exact-match domains have changed things. I think I feel inspired to do it again!! Time to start digging.
My advice would be either a non-hyphenated or brand-level domain. Stay away from hyphenated and spammy-looking domains.
Hope this helps. Cheers!
-
RE: Domain Authority and nofollow links?
Yep, having a natural link profile is what it's all about. Nofollow and follow links, social media links, text and website links, forum and blog links: it all comes together in the overall link graph, as a whole, and gives the search engines something to calculate from.
Nofollows don't actually pass any 'juice' on a link level, but having these signals lets search engines factor the nofollow into the equation, and that carries weight in itself.
It's a wonderful thing
-
RE: Fading Text Links Look Like Spammy Hidden Links to a g-bot?
Not sure why you would want to use this technique on navigation? It's not great from a user experience perspective. I would use something like this for 'text rotation' on things like testimonials or reviews, to keep fresh content loaded on the domain (switching it up every few days). I wouldn't use it for navigation on a site.
Just my observation. Cheers!
-
RE: Analytics to Excel
I miss this tool from my PC! I would love for them to develop a Mac version, but that just isn't a reality yet. Nice suggestion. I would have said the same thing!
-
RE: Robots.txt question
Oh, and it won't affect the domain negatively when you clean up your site directories via robots.txt. It's actually better, as I explained below.
-
RE: Robots.txt question
Hey Mark,
It's good practice to disallow access to any folder/content you don't want indexed as well as anything that has any security involved (login's, databases etc).
It will also keep the most important pages of the domain in front of the search spiders' eyes, while keeping poor content out of the index. This helps the domain, at a site-authority level, provide valuable content and information to users.
Low-ranking pages can cause the whole domain to be pulled down by search engines (Google and Bing have attested to this already), as they want businesses to focus on high-value content, which leads to a better user experience.
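As a quick sketch, a robots.txt along these lines keeps logins and other private areas away from crawlers (the directory names and domain here are just hypothetical examples):

```
# Keep private/low-value directories out of the crawl
User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /tmp/

# Point crawlers at the sitemap of pages you DO want indexed
Sitemap: http://www.example.com/sitemap.xml
```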
Cheers!
-
RE: Robots.txt versus sitemap
I would also take the time to clean up your XML sitemap file for crawling, just in case. It will make it easier to keep track of any files/URLs you don't want indexed by the search bots.
Just good practice
-
RE: Watermarking Keywords
Nice answer, Simon. I would have said the exact same thing! Avoid at all costs would be my advice.
-
RE: New site not ranking for it's name
Hey Mark, exactly. Keep your keyword target focused, and work in long-tail options to gain a few positions at that level while building up the brand, site, content strategy, etc.
I would use this targeted style on each and every page of the domain, to help with consistency as well. You can explore tougher keywords later, as the domain gains some authority.
That should help you out and get the domain ranking for its brand. The hyphenated domain will take longer though, based on all my tests. You need to work in signals that authenticate the brand with the domain.
Cheers!
-
RE: New site not ranking for it's name
Hi Mark,
Give it some more time. A few weeks isn't that long, even with the rate at which crawlers move across the web. The domain is also hyphenated, which search engines take longer to index. It's a trust issue (spammers often use hyphenated, or multi-hyphenated, domains), which makes the engines more cautious.
I would suggest building a few links online to the domain (social, news, PR, articles, forums, and blogs) that link to it using the company brand name. This will help raise a few signals to the engines about the domain and its authenticity as a brand. More than likely, within a few days to a week after some work has been done, it will begin to gain some traction.
Also note: on the homepage of the domain, use a structured TITLE tag and include the brand name at the tail end, within the 70-character limit. Following this format for the whole domain will also help it gain trust with the engines, since the domain is hyphenated. My research and tests have always proven successful when working with these types of domains; it just takes longer to earn that trust.
Keyword - 2nd keyword - brand name.
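As a concrete sketch of that format (the keywords and brand name below are just placeholders):

```html
<!-- Primary keyword, secondary keyword, then brand, kept within ~70 characters -->
<title>Business Brokers - Company Valuations - ExampleBrand</title>
```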
Doing a mix of these will help the domain start to gain traction for the brand name and populate that through the engines' rankings. It shouldn't take long after that.
Because this domain is hyphenated, I would also register the domain name for 10 years in advance. This helps show the engines that it's planned to stick around for the long term.
Hope all this helps. Cheers!
-
RE: Is there a tool that can take all your backlinks and categorise them into categories?
As Russ said, there really isn't a tool for this. It takes time, patience, and good data sorting after exporting the needed info. I currently have one of my SEOs working on a project very similar to this (researching, analyzing, and exporting backlinks), breaking the 5K down into categories so we can see the various information. Most of the backlink analysis was done with Moz and Raven, and he did a great job presenting the data to me on Tuesday this week.
It's a long process, but worth the time and effort. You'll really get a good look at the data, and with the various outputs from Moz's OSE tool you'll see domain and page authority as well, to help prioritize as you break everything down into spreadsheets.
I think if you go this route, your data will be much cleaner and prepped for use.
-
RE: Domain Suggestion Tool?
Agreed. Build a potential list in Excel, and use a free tool to plug various ones in to hopefully find something that works.
I would do the market and keyword research first for whatever you are trying to buy/build, to ensure you get the best possible mix of domain-name-to-keyword relevancy.
Other than that, any domain tool with most companies will work in order to spit back variations, or substitutes if certain ones are not available.
Cheers!
-
RE: Setting up Google Analytics default URL
Hi Cindy -
Why would you set the default URL to non-www, and then set GA to track the WWW address? I think you should send all your traffic and tracking to one or the other. That's the first step in cleaning this up.
I would suggest mapping everything to the www version as you have this set in for your 301. Sooner than later would be best to get accurate tracking started.
Oh, and fix the tracking code to use the WWW version, not the non-www. That needs fixing too.
Cheers, Rob
-
RE: Unexplained spikes in Google Analytics
Many people experience the 'Google Tease', where for days at a time, in blocks, you will sometimes see these bumps in traffic. I've put a recent article talking about this below.
I bet, more than likely, this is the case, but you could go back over trending data and look to see if it has happened before and whether you see patterns in these traffic increases.
http://www.seroundtable.com/google-tease-14587.html
Cheers. Hope this helps a little.
Oh, I should also add that you should filter out any and all IPs in GA tracking that might skew the data (your own IP, friends' or office IPs, network IPs from people you know, etc.), so clean the data up if you haven't already.
-
RE: Youtube and twitter
Hey Hawk,
Better give it some time. Your profiles won't just show up in the SERP results because you create a brand account. You need to build up that account with content, videos, and Twitter feeds; become active in your niche; and get involved with others you follow and who follow you. It's not going to happen overnight, and without some work on your part it will probably take longer than you hope. No one could put an exact number to this type of request because there are too many factors. Create a great account, share some equally great things, and stuff will happen.
-
RE: I am Posting an article on my site and another site has asked to use the same article - Is this a duplicate content issue with google if i am the creator of the content and will it penalize our sites - or one more than the other??
Yep, I agree with Casey: don't go this route at all. Feature the article on one site only and use it to promote your 'awesome' piece of content to the world. Cheers!
-
RE: Competitor purchased thousands of hidden links to our website... will it hurt rankings?
I like this answer! LOL, 'go to war'... if and when your site is affected. I would also log the date you caught these, so if traffic drops happen you can go back historically and associate them with the inbound link buy someone obviously made.
-
RE: Changing to a new Google Analytics Profile - Will I lose my history?
Thanks for clarifying Megan! I'll make sure to keep that in the back of my head for next time when helping out. I really wasn't sure, but had a feeling the data would be lost when a new profile was created.
Cheers!
-
RE: Competitor purchased thousands of hidden links to our website... will it hurt rankings?
Hey Nichank,
This is a tough one to answer, as to whether Google would look negatively on this. My bet is yes. Because the links are hidden and use keyword anchor text, my bet, with recent algorithm changes (Panda, freshness), is that Google will take a dim view of them.
The only good thing that may save you here is that they pointed 'thousands' of these links, which in itself makes the site (or sites) linking to you look like link farms, something Google would also detect and determine to be valueless to the user, thus nullifying their efforts.
This might also depend on the niche industry you're in, which could play a part in Google's determination and action on such links.
This would make a really great test from an SEO perspective. I think I'll add this one to my books for 2012! If I find any results, I'll make sure to come back with some concrete answers.
Here's a good link I found supporting the idea that Google penalizes the site linking to you more heavily (because they are the ones using hidden text and links within their pages):
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66353
-
RE: Opportunity for Redirect?
Hi Matt,
This can be a great thing for you if the old dormant domain is still being crawled, still indexed, and still has inbound links built up over time.
Adding a page to his site with unique, relevant content might pass back some value, but if the domain has been dormant for some time, the 'juice' has probably fizzled a little (along with the site's rankings) for the market keywords he was targeting.
Is he planning to give you the domain, or perhaps sell it to you? If so, and he wants out, I would do a domain-level and page-by-page 301 redirect from his old, dated site to your domain. This would help pass through all the page-level link value as well as the domain's, correcting any 404 errors that might occur and helping search indexes map out the new 'address' of the pages on your site, pushing those links and value through to your domain.
Run a 'crawler' on his domain to map out the whole site, then create an Excel doc to map any relevant pages, info, etc. to pages on your domain and prep a 301 file to handle the redirects.
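If he goes that route and the old site runs on Apache, the 301 file might look something like this (a rough .htaccess sketch placed on the old domain; both domains and all paths below are hypothetical placeholders):

```apache
# Map old pages to their new equivalents with permanent (301) redirects
RewriteEngine On
RewriteRule ^services\.html$ http://www.your-new-site.com/services/ [R=301,L]
RewriteRule ^about-us\.html$ http://www.your-new-site.com/about/ [R=301,L]

# Catch-all: send anything unmapped to the new homepage so nothing 404s
RewriteRule ^(.*)$ http://www.your-new-site.com/ [R=301,L]
```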
If that's a route he's willing to go, that's more or less what I would do to salvage any value from his domain/site and pass it along to yours.
If not - a page on his site, with a link back, will do just that - provide a link back from an older/dated and not maintained site which won't do all that much for you in the grand scheme of things.
Hope that helps