Hi Alan,
It seems like you are only targeting the search page; I would suggest adding a * at the end to capture all variations.
Dan
Hi Alan,
Have a look in Google Webmasters to see if the same 404s are occurring there. If so, it will typically give you a list of the pages that generate the error. It would be best to eliminate these issues at the source first, as they will be offering a poor user experience.
As a preventative measure you could run these pages as a disallow, yes, but I fear that this will make it more difficult to detect these issues in future.
Dan
Hi Heather,
There is no correct answer to this, but personally I would work on a manageable set at a time. Try grouping them, then create content and optimise accordingly. The unfortunate thing about working on many keywords at a time is that your effort becomes diluted.
Hope this helps.
Dan
Hi Edison,
Your question isn't very clear, but I will attempt to answer it as best I can.
You can target more than one keyword to a page OR multiple pages to a keyword. Generally I would only target one to three keywords per page, depending on the similarity of the key phrases and their competitiveness. As best practice, one keyword per page is ideal.
On a first attempt I would only work with one page for any one key phrase. I have seen multiple pages from the one site rank on the first page for competitive keywords, but it is challenging.
Hope this helps,
Dan
Hi Bob,
In cases like this, I take a step back and ask: is this deceptive, and is this already occurring on the web? The answer is no, it's not deceptive, and yes, it's already happening in business today. I am sure there is an example where Google themselves are trying to capture the top and bottom ends of a market (insert example here), but I'll give you this one.
In Australia a company called Coles Myer has two supermarket chains, Coles and Bilo, which capture the mid-to-top end and the budget end of the supermarket market. They are presented as two completely separate companies, but I'm sure that on an obscure about-us page they are still listed as part of the Coles Myer group.
The moral of my story: run both, but ensure almost all aspects of each site are unique. Separate themes, separate dev teams, unique content, separate hosting, etc. They should be seen to have been created by two separate teams to ensure success.
Hope this helps,
Dan
Hi Alan,
This is a two-part question. For the search results I would add
Disallow: /Search.aspx?m=
to your robots.txt file. I would probably extend this to the entirety of search (by removing the ?m= from the line above), but as I don't completely understand how your search section works, start with the above. It's very rare that search pages offer new content; generally they dilute other pages through duplication.
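To make the broader option concrete, here is a rough sketch (note that robots.txt rules use paths relative to the site root rather than full URLs, and act as prefix matches):
User-agent: *
Disallow: /Search.aspx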
With your second issue, I imagine there are clues in the aspxerrorpath variable. Although I couldn't get any combination of this string to render, the URL below didn't return the 404 page like the rest did. It's the tilde (~) in the error string that I think offers the biggest clue.
http://www.practicerange.com/Golf-Training-Aids/Golf-Nets/
Hope this helps,
Dan
Hi Micro,
I would suggest practice is your best bet here; use the first free month to practice with the tools. They are easy to use, and there is a group of great people in the community to help with the more specific questions.
Dan
Hi Andy,
I would simply change the image file name and any alt text on the smaller image, as it appears to be a duplicate image from a filename perspective (even though the two are in different folders and are different sizes). I would imagine Google sees them as duplicates and favours one, in this case the small image.
Hope this helps,
Dan
Hi Jayneel,
I would ensure the tracking code is on all pages during the eCommerce process. If you are using specific eCommerce tracking, ensure it also exists on the relevant pages.
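For reference, here is a minimal sketch of what the classic (ga.js) eCommerce calls look like on the order confirmation page, assuming you are on the older asynchronous tracker; all values are placeholders, and this sits alongside the standard tracking snippet:
_gaq.push(['_addTrans',
  '1234',            // transaction ID (placeholder)
  'Your Store',      // affiliation / store name
  '99.95',           // order total
  '9.95',            // tax
  '5.00',            // shipping
  'Sydney',          // city
  'NSW',             // state
  'Australia'        // country
]);
_gaq.push(['_addItem',
  '1234',            // same transaction ID
  'SKU-001',         // SKU (placeholder)
  'Example Product', // product name
  'Widgets',         // category
  '85.00',           // unit price
  '1'                // quantity
]);
_gaq.push(['_trackTrans']); // submits the transaction to Google Analytics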
Hope this helps,
Dan
Hi Matt,
I am going to guess that the only reason this is occurring is that when the sites were separate the SEOMoz bots crawled your sites, and that data is still in their index. Potentially you could use a secondary tool to check. Possibly http://ahrefs.com/
If this isn't the case I would question the way your 301s are set up, but without eyeballing the site/s, I think this question is far too difficult to resolve with the limited information provided.
Dan
Hi Chandu,
Webmasters will be your friend on this one. I would recommend heading to the landing page of your site in Webmasters, then selecting Configuration -> URL Parameters.
The errors showing in your image have a number of parameters in them; try to determine which parameter is forcing the 404 and tell Google what to do with it within the URL Parameters section.
Once complete, test your change, download and store your 404 issues, clear the errors in Webmasters, and then wait until the site is recrawled.
Hope this helps.
Dan
Hi David,
Forget Exact Match Domains. Firstly, they no longer have the appeal they once did, as they are not as easy to rank nowadays. Secondly, more than one domain means that you need to work twice (or many times) as hard to get them all to rank.
You are best investing the time in your site. Generate more content, run a news section or a blog. Guest post on similar sites with engaging topics in your industry. Engage in forums; be known as the leader in your industry. One strong site is better than many small sites. You don't see Apple run a separate site for tablets OR phones OR laptops.
Hope this helps,
Dan
Have you submitted a sitemap, and do you have a robots.txt uploaded to the root folder of your site?
Hi Courtney,
Are you suggesting that the sitemap itself is 404ing, OR that Webmasters is indicating your site has 404s for pages that exist on your sitemap?
If it's the sitemap itself, can you navigate to it directly? Does it render in a browser?
If it's an error from a page on the sitemap, and the page currently renders, there is a good chance it didn't render at some stage. If that's the case you can ask Google to recrawl it as an individual page; see:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1352276
Hope this helps.
Dan
My first reaction was to disagree with you EGOL, as a company has many different personalities. BUT, you have a very valid point. Unless you can get away from a level working relationship (all members being equal partners), issues will arise over time.
Good insight!
Hey Tiff,
I don't have a complete answer for you cos I am doing some insomnia Q&A (it's 4.30am in Oz), but I really hope you are using robots.txt rather than robot.txt? That may be the reason it's not working.
Can you tell us what the contents of the robot/s.txt is?
Dan
Hi Victoria,
I agree with Andy but would put it this way.
A well-constructed site with a lot of great content and some well-situated external links (on a variety of dominant and relevant sites in related industries) will outperform many sites with little content all interlinking, any day of the week.
The unknown here is: what is the group going to blog about or sell? If you can conceive of the site as one entity, and all the women are discussing/selling around a similar topic, fantastic. But if the topics are a little disparate (say baby clothes, pets and mountain biking), I would not try running them as one, unless you can build the site to discuss all these topics in a very fluid way (with the example I gave I very much doubt it would be possible, although I could be surprised).
The main takeaway is that I wouldn't simply run a directory of separate sites on the one domain without a strong theme to seamlessly offer them as one site with a variety of categories (for want of a better word). For usability I would also ensure they ran on the same theme.
If you can pull off the one site, I commend you and think as a group you'll go far. Don't forget about Andy and me when you make your first million.
Hope this helps.
Dan
Hi Matt,
It is fair and reasonable for a site to be decommissioned from time to time (or a page or two, or say a category of pages). As the page no longer exists, you can't ask the admin to remove your link (which is what Google expects you to do before disavowing a link), so I would simply ignore these links in your exported list. They will disappear from GWT and OSE over time, as they no longer exist.
Disavowing links, amongst other things, helps Google determine a site's intent on the web, so they only want you to use the service if you no longer trust the relevance of the link and have had no joy requesting its removal.
Hope this helps,
Dan
Hi Guys,
I edited my response above to clarify the confusion. Can you confirm Matt...
You are discussing a page on an external site that currently 404's. This page used to have a link to your site and subsequently was added to Webmasters and OSE in the past. Now this page no longer exists, but the reference still remains in Webmasters and OSE. Correct?
Dan
Hi Matt,
A link to your site no longer counts once the external page it sat on returns a 404. Why does it still cause you concern? Is it still listed in Webmasters or OSE? If so, I would suggest this is because the page has not yet been recrawled.
If you're concerned that the page may reappear, I would recommend writing to the admin of the site to request it be removed. Disavow should always be your very last option.
Hope this helps.
Dan
Sadly, although this site is big (over 1,000 pages), Wolfram Alpha doesn't offer the tab that suggests subdomains...
I checked it on a number of sites (seomoz, harvard, etc.) and it worked well.
Any other ideas?
Dan
Thanks again,
Running this on my Mac, I was able to run nslookup, but when it came to listing the subdomains it advised me that 'ls' was not a valid command.
Dan
Hi Steven,
Thanks for the quick response. What I should have pointed out is that I am currently proposing a strategy for this site and do not have access to it.
Any further thoughts?
Dan
Hi Mozers,
I am trying to find what subdomains are currently active on a particular domain. Is there a way to get a list of this information?
The only way I could think of doing it is to run a google search on;
site:example.com -site:www.example.com
The only issue with this approach is that the majority of the indexed pages exist on the non-www domain, and I still have thousands of pages in the results (mainly from the non-www).
Is there another way to do it in Google? OR is there a server admin online tool that will tell me this information?
Cheers,
Dan
Hi Stew,
Firstly, dynamic URLs are often used to assist with searching OR filtering on a site. The practice is an inevitable part of offering flexibility to the user.
The issue with overly dynamic URLs is this, for example:
If you have three elements in your URL, e.g. http://test.com/search?element1=a&element2=b&element3=c, and each element has 10 options, Google will eventually crawl 10x10x10 = 1,000 pages. Overly dynamic URLs can create thousands of combinations of a URL very quickly, and each URL will be seen as a unique page by Google.
Most of these pages will have duplicated content (although with different products in different orders). Depending on the way this section works, you may want to block the crawling of this search section using robots.txt.
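As a rough sketch using the hypothetical URL structure above, blocking the whole search path in robots.txt would look like this:
User-agent: *
Disallow: /search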
I would also go to Webmasters -> YOUR SITE -> Configuration -> URL Parameters. From there you can advise Google what to do with each element.
Hope this helps!
Dan
Hi Dana,
Without knowing anything about the site or the keywords, I would deduce that the url_1 page has the best link profile from external sources for KW1...
You will be able to see this by running the three URLs through Open Site Explorer. This will help you understand your offsite optimisation.
I would imagine url_1 has more links than the other two, AND/OR the sources of these links are more relevant (or are sites with higher Page/Domain Authority), AND the ratio of keyword anchor text to generic anchor text is better (I would aim for 1 keyword backlink to every 3-4 generics).
At the end of the day, the reason SEO is such a busy and exciting industry is that there are many signals that help a page rank. If there were an exact answer to your question, the industry wouldn't exist.
Hope this helps,
Dan
Hi Ken,
The quick answer: no...
Both SEOMoz and AdWords do all the work at their end, and you only make the one request to these sites to obtain the data.
The CAPTCHA prompt appears when multiple requests are made from the one IP address to Google search. This can happen naturally if many people are googling at the same time and your company only runs the one IP. It could also occur if you are using downloaded software to scrape SERP data directly from Google.
Hope this helps,
Dan
Hi,
Scrap Link Building... Treat it as Link Earning, period!
http://lmgtfy.com/?q=site%3Aseomoz.org+link+earning
Best Software? No software is best software!
Hope this helps,
Dan
Hi Semantique,
Hmm, from a quick look I would check the following.
Page speed - check out the home page on Pingdom: http://tools.pingdom.com/fpt/#!/recfTiYWP/http://www.catwalkqueen.tv/
I would look to crunch the images for a start; they were very slow to load, and the background image alone is almost 300 KB.
Caching images would be recommended, and ask your devs to combine JavaScript files OR stylesheets where possible...
Dan
Hi Steve,
I think you should rephrase the question...
How can a responsive design harm SEO?
Typically a responsive design is driven by either the USER AGENT or the screen size... Both of these, if implemented correctly, will not affect the way Googlebot crawls the site. I would doubt there will be issues with 404s, as the URL will be the same regardless of the device.
I would suggest it is a poor implementation of the design.
Hope this helps!
Dan
Hey Andrew,
I would suggest having a category regarding small business and news within your blog, yes... I would also suggest having a category for your product, discussing a feature per post and loosely marketing it to your visitors. Have a section to discuss new features you are working on (if you are happy to do so), and also request feedback about which other features are of interest. Ideally you want a few different sections; this will give you a number of ideas to write posts on, and it will also diversify your content.
As for "taking great content from elsewhere on the internet and adding it to my blog", I highly recommend against it. However, you can re-write the content (perhaps combining the details of more than one post).
Duplicate content is a world of hurt you don't want...
Cheers,
Dan
PS> If your question is answered by any of the great responses above, I urge you to mark it as 'Answered'. You can mark more than one response.
Hey again,
There are five main SEO benefits to running a blog, but by writing it purely for SEO purposes you will only take advantage of three, and in fact one will actually work against you.
They are (in no particular order);
I understand this is not an easy thing to hear (I pitch clients this all the time), but a quality blog is worth the time.
Your subject (online invoicing) may not be the most exciting topic, but there are still ways to make it exciting; think outside the box. Think of your audience: you only have to make it interesting for the people who are already looking for online invoicing. Make it interesting for them.
Hope this helps.
Dan
Hi,
It seems like you are treating this more as a chore than as a beneficial blog.
I would suggest posting when a post is completed (not necessarily all at the one time). I would also create content without being too concerned about length.
A blog should be a naturally occurring, organic site OR component of your site. Post as often as you like; there are no rules about how often, as long as you post occasionally to ensure fresh content.
Remember, SEO is not simply a set of rules; focus on generating content that excites and/or educates your visitors and the rest will follow.
Hope this helps!
Dan
Hi,
You have listed this as a question, but haven't asked one...
So I have a question for you: has the website that you are talking about ever posted a job on any job-seeker sites?
Dan
Hi Mark,
I personally would run a clean 301 to the new site, then use something like $_SERVER['HTTP_REFERER'] (in PHP) to determine where the user came from. If it's from the old site, show a small banner in the header (like Hello Bar) to advise your users of the change. This would be a lot cleaner for search engines and very user-friendly.
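A minimal sketch of that idea in PHP; the old domain name and the banner markup are placeholders:
<?php
// Show a "we've moved" banner only to visitors arriving from the old site.
$referrer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (stripos($referrer, 'olddomain.com') !== false) { // olddomain.com is a placeholder
    echo '<div class="moved-banner">Welcome! Our site has a new home.</div>';
}
?>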
Don't forget to transfer the site's worth using Webmasters (the Change of Address tool)...
Hope this helps.
Dan
Hi All,
I am looking to start an eCommerce business and would like to centre the user engagement of the site around a forum.
Can anyone suggest a forum platform that adopts good SEO practice?
So far my considerations are;
Anyone used these with great success? Do you have another suggestion?
I am simply at the preliminary stage of sourcing something and am eager to hear your thoughts...
Thanks in advance...
Dan
Hi Andy,
I would suggest running rel="next" and rel="prev" link tags with these.
Read more about it here http://yoast.com/rel-next-prev-paginated-archives/
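As a quick illustration, on page 2 of a paginated series the tags in the <head> would look something like this (the URLs are placeholders):
<link rel="prev" href="http://www.example.com/category/page/1/" />
<link rel="next" href="http://www.example.com/category/page/3/" />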
Also, although All in One SEO is a good plugin, I would suggest sticking with Yoast. I would also point out: do not attempt to run both at the same time.
Hope this helps.
Dan
Hey Mase,
Yeh I'm going to fence sit on this one, but will offer this design tweak.
I think these look spammy, but I don't necessarily think you will be penalised for it. My suggestion would be to design this area to look less spammy. Consider a clickable drop-down for each major city in the three sections of links, or a show/hide section for each section, etc. A comma-separated linkfest will attract a manual spam action; improve usability to prevent this.
I would also consider adding more unique content per page, because if you have multiple pages where each listed item is shown more than once, the items' content will offer little benefit.
Dan
Hi Dana,
I agree with Streamline; there will be a hidden issue in your site that is attempting to link to a malformed URL (a URL missing 'http://'). Given there are a number of them in one day, I would guess this is happening in a templated page.
Have a look at one of the reported URLs; it renders as a page.
The best course of action would be to resolve it at the source. If you can pinpoint when this issue is due to occur next, have your developer get each page to append its URL to a log at the start of the page. Then you should be able to determine where the issue is occurring. I am hoping you will see a discernible pattern.
Worst case scenario, a canonical tag may possibly work, OR you could create a REGEX redirect in .htaccess to handle this URL pattern...
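Purely as an illustration of the .htaccess approach, with a hypothetical pattern that you would need to adjust once you know what the malformed URLs actually look like:
RewriteEngine On
# 301 any request whose path contains a stray "www.example.com" back to the equivalent clean URL
RewriteRule ^(.*)www\.example\.com/?(.*)$ http://www.example.com/$1$2 [L,R=301]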
Hope this helps,
Dan
Another thing to consider is that requesting images from multiple sites will create a lag in load times. Most modern browsers will download multiple files in parallel from the one host; pulling images from several external hosts adds extra DNS lookups and connection setup, and this will create a slower load time.
Hope this helps!
Dan
Hi,
Unless he is intending to sell off sections of the business in the future, run everything on the one site.
Dan
Hi,
Yes, I still would. What I have suggested will only take 2 minutes and is considered best practice.
For non-www to www use;
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^yourdomain\.com [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [L,R=301]
Dan
Hi Bhadresh,
Potentially yes... Subdomains are considered separate sites, and although most sites will render on both www and non-www versions by default, loosely speaking the www version is still a subdomain.
On top of this dilemma, when you are link building (or better still, link earning), people will link to a variety of www and non-www URLs, and this will affect how your site performs. By redirecting one to the other for any given page you will ensure only one version exists.
Here is the example given by SitePoint for redirecting www to non-www, feel free to review the original post for other .htaccess rules...
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^www\.yourdomain\.com [NC]
RewriteRule ^(.*)$ http://yourdomain.com/$1 [L,R=301]
Hope this helps!
Hi Doug,
Give this post a read, http://www.seomoz.org/ugc/the-beginners-guide-to-keyword-research-using-free-tools
It suggests a number of different tools to generate new keywords.
Also have a look for my comment about running a search to get Google to indicate which page best suits the keyword in question (by limiting the search to your site).
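For example, a query along the lines of site:yourdomain.com "target keyword" (both the domain and the keyword are placeholders) will show which of your pages Google considers most relevant for that phrase.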
Hope this helps,
Dan
I like the idea of giving an example of price anchoring... Looked, liked, yet to convert...
Good sly marketing tactic, complementary markets!
Hi Melissa,
I have seen this happen in the past, and it came down to a developer on the site testing prior transactions; the analytics code would fire again when the invoices were re-rendered on the site.
Did you start running the GA eCommerce code after the site was already live? Could it be possible that an older transaction (previously not recorded by GA) is the cause? Can you review the transaction IDs to see if they are consistent, as there may be one or more outside the ID range for the last month...
Hope this helps!
Dan