Try this (it blocks every crawler except Moz's rogerbot):
User-agent: *
Disallow: /
User-agent: rogerbot
Allow: /
The main thing I consider is that the maximum size of the title is defined by pixel width, not by character count. For this reason, I prefer "|", as you can fit more in. But for individual services on a brand with a short name, "-" can be preferable.
Thanks for the reply. The problem is that it is a local business looking to rank well for local results.
1. Too late, the domains are pretty close to each other.
2. The content is unique on both websites, but some of the categories and product names are similar.
3. Both sites need to rank well for local searches. Following this advice means that the 2nd website will not be able to perform well locally. Am I right in saying that the websites cannot effectively co-exist with the same details?
As I say, both websites are on page 1 for their primary keywords but are struggling to get near the top of the page. Is there anything I can do or will Google always treat them as duplicates and hold them back?
I like to use Screaming Frog or Xenu. They basically crawl the links on your site and give you a report on the results. To put it simply, if pages do not show up in the results, then Google probably won't see them either. Be careful though: the free version of Screaming Frog only crawls up to 500 pages.
You can also look at individual pages by typing "cache:[url]" into the address bar in Chrome. This displays Google's latest cached copy of the page. If it shows a 404 error, the page has not been indexed, although this will often be the case with new products simply because Google hasn't crawled them yet.
One of our clients decided to launch a 2nd website to market specific products and services that they provide. The trouble is, they have the same address, phone number and have a similar name. Whilst we have had some success and both websites are on page 1 for their primary keywords, I have a bad feeling that they may have hit a glass ceiling.
Does anyone have any suggestions on how to approach local SEO in this situation?
I just typed cache:www.hotvsnot.com and got a blank screen. When I looked at the code, I got information about www.hotvsnot.us and a small amount of content about adult themes. Definitely something fishy going on there.
From what I've seen, a lot more focus is being put on onsite content these days. However, I have noticed that websites with keyword anchor text links still perform very well if done properly. General rules I would follow include:
Keep keyword anchor text links below 10% of your overall link profile (there's a quick sketch of this check after the list)
Try to have unique anchor text for each link of this type. Avoid using just the keyword on its own; use a short phrase that includes the keyword or a variation of it.
Avoid sitewide links. I've seen examples of single websites using the same anchor text over 500 times, which destroyed search rankings.
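To make the 10% rule concrete, here is a minimal Python sketch, assuming you have exported a plain list of anchor texts from your backlink tool of choice; the target keywords and numbers are made up for illustration:

from collections import Counter

KEYWORD_ANCHORS = {"blue widgets", "buy blue widgets"}  # hypothetical target terms

def keyword_anchor_share(anchors):
    # anchors: iterable of anchor-text strings from a backlink export
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    hits = sum(n for text, n in counts.items() if text in KEYWORD_ANCHORS)
    return hits / total if total else 0.0

share = keyword_anchor_share(["Blue Widgets", "example.com", "click here", "buy blue widgets"])
print(f"keyword anchors make up {share:.0%} of the profile")
if share > 0.10:
    print("Warning: keyword anchor text above the 10% rule of thumb")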
%7E is the percent-encoded (URL-encoded) form of the ~ character. I'm not certain what's happening, but sometimes when copying and pasting rich text (like from Word), the browser ends up with the encoded form rather than the actual ~ character. This makes internal linking a bit awkward and can result in duplicate content, although I'm not sure if search engines would see this example as a problem.
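As a quick demonstration of why this matters, the following uses only Python's standard library (the URLs are made up). The two strings look like different pages to a naive crawler even though they decode to the same path:

from urllib.parse import unquote

raw = "http://www.example.co.uk/~sunflower-hearts"
encoded = "http://www.example.co.uk/%7Esunflower-hearts"
print(raw == encoded)           # False: two distinct URL strings
print(unquote(encoded) == raw)  # True: identical once percent-decoded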
I've noticed that you do not have rel="canonical" tags set up (a <link rel="canonical" href="..."> element in the page head); it's worth putting them in just to be on the safe side.
I would also have a word with your developers, as the URLs are very long and messy. Try to avoid upper case and non-alphanumeric characters (such as ~) except for hyphens. Something like http://www.arkwildlife.co.uk/straight-foods/sunflower-seeds/premium-sunflower-hearts would look a lot better and avoid problems like this occurring in the future. The only caveat is that changing URLs can temporarily affect search rankings, and you must be sure to set up 301 redirects properly.
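For what it's worth, here is a minimal sketch of one way the 301s could be served, assuming the site ran on a small Flask app (an Apache site would use Redirect 301 or RewriteRule in .htaccess instead); the old path below is hypothetical:

from flask import Flask, abort, redirect

app = Flask(__name__)

# Hypothetical old-to-new map; extend it with every renamed URL.
REDIRECTS = {
    "/Products/Premium-Sunflower-Hearts":
        "/straight-foods/sunflower-seeds/premium-sunflower-hearts",
}

@app.route("/<path:old_path>")
def legacy(old_path):
    target = REDIRECTS.get("/" + old_path)
    if target is None:
        abort(404)
    return redirect(target, code=301)  # permanent, so link equity follows

The key point is the 301 status code: a permanent redirect tells search engines the page has moved for good.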
It's important to remember that the Moz toolbar uses the Moz crawlers, so it will not have as much information as Google's own crawlers. I would imagine that a lot of the product pages are newer than other pages on the website, so there is a good chance that they will not have any PA simply because the Moz crawlers have not crawled them yet.
The PA of sub-pages and product pages will mostly come down to internal linking. If you are worried about the pages carrying weight for the website, there are a few things you can do. First of all, run the website through a crawler tool like Screaming Frog or Xenu. These will tell you whether the pages can be crawled by search engines (if pages do not show up in the results, they can't be crawled) and how many internal links point to them. Also, have a look through the website yourself and see how easy it is to reach product pages without using search forms. As a general rule, no page on an ecommerce site should be more than 3 clicks from the homepage, but this is open to discussion.
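Here is a rough Python sketch of the kind of check those tools automate, assuming the requests and beautifulsoup4 packages; the start URL is made up. It breadth-first crawls internal links from the homepage and reports pages deeper than 3 clicks:

from collections import deque
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START = "http://www.example.com/"
MAX_PAGES = 500  # mirror the free Screaming Frog limit

def crawl_depths(start):
    host = urlparse(start).netloc
    depths = {start: 0}  # url -> clicks from the homepage
    queue = deque([start])
    while queue:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # unreachable pages simply stay unreported
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if (urlparse(link).netloc == host and link not in depths
                    and len(depths) < MAX_PAGES):
                depths[link] = depths[url] + 1
                queue.append(link)
    return depths

for url, depth in crawl_depths(START).items():
    if depth > 3:
        print(depth, url)  # candidates for better internal linking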
Remember though that it is perfectly normal for a large website to have a tiered internal linking structure. If all the pages on the website had the same authority then it would look unnatural. I'm fairly sure Rand has done some talks about best practice for internal linking.
When I'm doing keyword research, one of the first things I do is go on competitor websites and see if they have the meta keyword tag. If they do, then it instantly gives me access to all the keywords that they're targeting and allows me to work on beating them. Deleting your meta keyword tag prevents your competitors from doing this to you.
If you're a newbie, it might be worth using the on-page grader to make sure that all of your content is relevant. Make sure your content looks natural, though, and do not over-optimize or do any keyword stuffing. Once the content is sorted, you can get started on more advanced stuff.
You mention mistakes that you have made; if you told us what they were, we could make recommendations for you.
However, there is a LOT to SEO and some of it takes a long time to explain. I would recommend reading the Beginner's Guide to SEO: http://moz.com/beginners-guide-to-seo
You could try Majestic SEO and see if they show up there.
However, you need to be very careful when buying used domains with existing backlinks. What are you planning to use it for? If it's simply to 301 it to your main site, then you could be looking at a manual action straight down the barrel. If you are planning on using it as your primary site, be careful that it doesn't already have a penalty.
Rather than who the links were from, have a look at what niche the old website served. If it is relevant, then you will have links from relevant websites, and you might also inherit some relevant and diverse anchor text, which will help. Use archive.org to have a look at the old site.
You can demote sitelinks so that they are less likely to appear, but Google displays the pages that it considers most relevant to the searcher. Use the MozBar to compare the PA of the pages currently displayed with the PA of the ones you want displayed. You may need to look at your internal linking structure to improve the authority of the pages you want to appear.
The general rule of using HTML for everything is one that I would follow. If you're unsure whether something is crawlable, try downloading the Web Developer extension for Chrome: http://chrispederick.com/work/web-developer/. Then disable JavaScript and plugins and refresh the page. Any content that can't be seen then probably won't be seen by search engines either.
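If you want to automate that check, a rough equivalent, assuming the requests package, is to fetch the raw HTML (which, like a basic crawler, never executes JavaScript) and look for a phrase from the rendered page; the URL and phrase are placeholders:

import requests

url = "http://www.example.com/"           # page to test
phrase = "text you expect crawlers to see"

html = requests.get(url, timeout=10).text
if phrase in html:
    print("Phrase is in the static HTML, so crawlable without JavaScript")
else:
    print("Phrase missing from static HTML, likely injected by JS or plugins")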
Due to the management software used for a lot of our websites, the URLs for specific pages appear as:
brand.com/page/pagename/number
I want to add slug URLs so that they appear as:
First of all, do you think this is an effective use of my time, and will it significantly benefit SEO in the long run? Secondly, these pages have developed a decent PA over time and I don't want to lose that. Will adding 301s for all the renamed pages be enough to avoid a negative effect?
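In case it helps to picture it, here is a rough Python sketch of the slug-generation side, with a made-up pagename/number pair (assuming the management software exposes each page's name); the map it produces is the list of 301s that would be needed:

import re

def slugify(name):
    # lowercase, collapse runs of non-alphanumerics into hyphens
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")

pages = [("Our Best Widget", 42)]  # (pagename, number) pairs from the CMS
redirect_map = {f"/page/{name}/{num}": f"/{slugify(name)}" for name, num in pages}
print(redirect_map)  # {'/page/Our Best Widget/42': '/our-best-widget'}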
You need to think about why the website was performing well before the penalty. If it was spammy artificial links that were boosting your site, then removing the penalty means that they no longer help you. Therefore, your website is receiving considerably less link juice than before.
You may need to face the fact that your link building efforts will probably have to start again from scratch.
What about having a subtle call to action in there?
What you have written is fine, but it might be worth having a play around to see what works best. Maybe take two pages with similar performance in search: give one a descriptive meta description like you already have, and the other a call to action along the lines of "visit our website to find out about...". Then simply see which has the better click-through rate.
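As a back-of-envelope way to score the test, assuming you pull impressions and clicks for each page from your webmaster tools (the numbers here are invented):

def ctr(clicks, impressions):
    return clicks / impressions

page_a = ctr(clicks=120, impressions=4000)  # descriptive meta description
page_b = ctr(clicks=150, impressions=4100)  # call-to-action version
print(f"A: {page_a:.2%}  B: {page_b:.2%}")
print("B wins" if page_b > page_a else "A wins")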