Welcome to the Q&A Forum
Browse the forum for helpful insights and fresh discussions about all things SEO.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Job Title: Technical SEO Strategist
Company: Stew Art Media
Website Description
Check out Jim's great 10-minute video sessions every Wednesday, a must-see!
Favorite Thing about SEO
Results!
Hi Alan,
It seems like you are only targeting the search page. I would suggest adding a * at the end to capture all variations.
Dan
Hi Alan,
Have a look in Google Webmaster Tools to see if the same 404s are occurring there. If so, it typically gives you a list of the pages that generate the error. It would be best to eliminate these issues at the source first, as they will be offering a poor user experience.
As a preventative measure you could disallow these pages, yes. But I fear that this will make it more difficult to detect these issues in future.
Dan
Hi Heather,
There is no correct answer to this, but personally I would work on a manageable set at a time. Try grouping them, then create content and optimise accordingly. The unfortunate thing about working on many keywords at a time is that your effort becomes diluted.
Hope this helps.
Dan
Hi Edison,
Your question isn't very clear, but I will attempt to answer it the best I can.
You can target more than one keyword to a page OR multiple pages to a keyword. Generally I would only target one to three keywords per page, depending on the similarity of the key phrases and their competitiveness. As a best practice, one keyword per page is ideal.
On a first attempt I would only work with one page for any one key phrase. I have seen multiple pages from the one site rank on the first page for competitive keywords, but it is challenging.
Hope this helps,
Dan
Hi Bob,
In cases like this, I take a step back and ask: is this deceptive, and is it already occurring on the web? The answer is no, it's not deceptive, and yes, it's already happening in business today. I am sure there is an example where Google themselves are trying to capture the top and bottom ends of the market (insert example here), but I'll give you this one.
In Australia a company called Coles Myer has two supermarket chains, Coles and Bilo, which capture the mid-to-top end and the budget end of the market respectively. They are presented as two completely separate companies, but I'm sure that on some obscure about-us page they are still listed as part of the Coles Myer group.
The moral of my story: run both, but ensure almost all aspects of the sites are unique. Separate themes, separate dev teams, unique content, separate hosting, etc. They should appear to have been created by two separate teams to ensure success.
Hope this helps,
Dan
Hi Alan,
This is a two part question. For the search results I would add
Disallow: /Search.aspx?m=*
to your robots.txt file. I would probably increase this to the entirety of search (by removing the ?m= from the line above), but as I don't completely comprehend how your search section works, start with the above. It's very rare that search pages offer new content; generally they dilute other pages through duplication.
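For context, a Disallow line only takes effect inside a User-agent block, so the complete robots.txt fragment would look something like this (the broader variant mentioned above is shown commented out):

```
User-agent: *
Disallow: /Search.aspx?m=*

# Broader option, blocking the entire search section:
# Disallow: /Search.aspx
```

Note that robots.txt rules match against the URL path, not the full URL including the domain.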
With your second issue, I imagine there are clues in the aspxerrorpath variable. Although I couldn't get any combination of this string to render, the URL below didn't redirect to the 404 page like the rest did. It's the tilde (~) in the error string that I think offers the biggest clue.
http://www.practicerange.com/Golf-Training-Aids/Golf-Nets/
Hope this helps,
Dan
Hi Micro,
I would suggest practice is your best bet here; use the first free month to practice with the tools. They offer ease of use, and there is a group of great people in the community to help with the more specific questions.
Dan
Hi Andy,
I would simply change the image file name and any alt text on the smaller image, as it appears to be a duplicate image from a filename perspective (even though the files are in different folders and are different sizes). I would imagine Google sees them as duplicates and favours one, in this case the small image.
Hope this helps,
Dan
Hi Jayneel,
I would ensure the tracking code is on all pages in the eCommerce process; if you are using specific eCommerce tracking, ensure it also exists on the relevant pages.
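As a sketch of what "specific eCommerce tracking" involves: with the classic asynchronous (ga.js) Google Analytics tracker, the order-confirmation page needs the transaction calls in addition to the normal page snippet. The property ID and all transaction values below are placeholders:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-Y']);   // placeholder property ID
  _gaq.push(['_trackPageview']);

  // One transaction, then one line per item, then send it.
  _gaq.push(['_addTrans',
    '1234',            // order ID
    'Example Store',   // affiliation
    '29.99',           // total
    '1.29',            // tax
    '5.00',            // shipping
    'Sydney',          // city
    'NSW',             // state
    'AU'               // country
  ]);
  _gaq.push(['_addItem',
    '1234',            // order ID (must match the transaction)
    'SKU-1',           // SKU
    'Example Widget',  // product name
    'Widgets',         // category
    '23.99',           // unit price
    '1'                // quantity
  ]);
  _gaq.push(['_trackTrans']);
</script>
```

If you are on the newer analytics.js (Universal Analytics) snippet instead, the eCommerce calls differ, so check which tracker your pages are running first.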
Hope this helps,
Dan
Hi Andy,
I would suggest running rel=next and rel=prev link tags with these.
Read more about it here http://yoast.com/rel-next-prev-paginated-archives/
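On the second page of a paginated archive, for example, the head would carry both tags (example.com is a placeholder here; the Yoast plugin can add these for you):

```html
<link rel="prev" href="http://www.example.com/blog/page/1/" />
<link rel="next" href="http://www.example.com/blog/page/3/" />
```

The first page carries only rel=next, and the last page only rel=prev.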
Also, although All in One SEO is a good plugin, I would suggest sticking with Yoast. I would point out that you should not attempt to run both at the same time either.
Hope this helps.
Dan
Hi,
It seems like you are running this more like a chore than a beneficial blog.
I would suggest posting when each post is completed (not necessarily all at the one time). I would also create content without being too concerned about length.
A blog should be a naturally occurring, organic site OR component of your site. Post as often as you like; there are no rules on how often, as long as you post occasionally to ensure fresh content.
Remember SEO is not simply a set of rules, focus on generating content to excite and/or educate your visitors and the rest will follow.
Hope this helps!
Dan
Hi Chris,
Internal linking is an important, but not overwhelmingly beneficial, part of optimising your site.
Typically a good navigation, possibly a meaningful footer (with links), and breadcrumbs can be helpful for a user's navigation. These approaches account for most of the internal linking on a site.
When running with these approaches, I would always recommend text links (avoid images where possible) and ensure above all that they give the user the best experience.
For your example above, writing Pink widgets | Blue widgets | Green widgets in the nav might take up too much real estate and look a little sloppy. Try a drop-down with the main category 'widgets' and run the colours as subcategories. Having a site structure like www.example.com/widgets/blue will help define widgets as an important concept on your site while also highlighting that you have Pink widgets | Blue widgets | Green widgets. It is assumed that the product pages in this example have engaging, unique content.
That said, if you see the need, I would also encourage you to have internal links within your content, WHERE RELEVANT. I have seen too many sites simply go through their content, pick out the popular keywords, and link all over the place. If it helps the user (possibly by defining an unusual term OR referring to a service OR product described on a different page), it's worth doing.
One of my pet hates is finding a keyword on a page that links to itself (the same URL) just because it is a keyword being targeted. As a user it's frustrating, and personally I immediately leave sites running this practice.
You don't have to continually link to your desired page with the same keywords; in fact it's discouraged. Google is becoming increasingly better at understanding intent, so do what is best for your visitors and you will ensure that your site enjoys longevity in search rankings...
Best of success,
Dan
Hi,
I'm assuming you are talking about the keyword tool incorporated in Google AdWords: https://adwords.google.com/o/KeywordTool
This, although an estimate, is the most accurate source of data on keyword search volumes. I have compared these values to the impressions reported for consistently ranking keywords in Webmaster Tools, and they match up.
Make sure when you run these searches in AdWords that you have your desired country selected, and I would suggest selecting "exact match" instead of broad. The other thing to consider when examining this data is that search results can be seasonal. The data provided covers only the last month, so a trending topic (Oscar Pistorius) OR a seasonal one (Christmas shopping) may not give the results you might expect. Google Trends can help you identify this...
Dan
Hi Victoria,
I agree with Andy but would put it this way.
A well-constructed site with a lot of great content and some well-situated external links (on a variety of dominant and relevant sites in related industries) will outperform a group of sites with little content all interlinking with each other, any day of the week.
The unknown here is: what is the group going to blog about or sell? If you can concept the site as one entity and all the women are discussing/selling a similar topic, fantastic. But if the topics are a little abstract (say baby clothes, pets, and mountain biking), I would not try running them as one, unless you can build the site to discuss all these topics in a very fluid way (with the example I gave, I very much doubt it would be possible, although I could be surprised).
The main takeaway is that I wouldn't simply run a directory of separate sites on the one domain without a strong theme to seamlessly offer them as one site with a variety of categories (for want of a better word). Also, for usability, I would ensure they ran on the same theme.
If you can pull off the one site, I commend you and think as a group you'll go far. Don't forget about Andy and me when you make your first million!
Hope this helps.
Dan
Hi David,
Forget exact match domains. Firstly, they no longer have the appeal they once did (they are not as easy to rank nowadays). More than one domain also means that you need to work twice (or many times) as hard to get them all to rank.
You are best investing the time in your site. Generate more content; run a news section or a blog. Guest post on similar sites with engaging topics in your industry. Engage in forums and be known as the leader in your industry. One strong site is better than many small sites. You don't see Apple run a separate site for tablets OR phones OR laptops.
Hope this helps,
Dan
Hi Laura,
I would suggest setting up Google Webmaster Tools and checking the crawl errors. Google will already be monitoring this, so once you have set it up the errors will already be listed. My feeling is that the URLs may have changed in a subtle way, e.g. www now non-www, or .html files now .php... Webmaster Tools will give clues. Also, while you're there, I recommend submitting a sitemap.
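One way to chase that theory is to generate the likely new forms of each broken URL and check which of them resolves. A minimal sketch (the helper name and the example URL are hypothetical; only the www/non-www and .html-to-.php patterns mentioned above are handled):

```python
from urllib.parse import urlparse, urlunparse

def candidate_new_urls(old_url):
    """Guess likely new locations for an old URL by toggling the
    www prefix and swapping a .html extension for .php."""
    parts = urlparse(old_url)
    hosts = {parts.netloc}
    if parts.netloc.startswith("www."):
        hosts.add(parts.netloc[len("www."):])
    else:
        hosts.add("www." + parts.netloc)
    paths = {parts.path}
    if parts.path.endswith(".html"):
        paths.add(parts.path[:-len(".html")] + ".php")
    candidates = {
        urlunparse(parts._replace(netloc=host, path=path))
        for host in hosts for path in paths
    }
    return sorted(candidates - {old_url})

# Each candidate could then be fetched to see which one returns a 200.
print(candidate_new_urls("http://example.com/about.html"))
```

Whichever variant answers with a 200 is a good hint about the redirect (or sitemap fix) you need to put in place.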
Best of luck,
Dan
Hi Stew,
Firstly, dynamic URLs are often used to assist with searching OR filtering on a site. The practice is an inevitable part of offering flexibility to the user.
The issue with overly dynamic URLs is this: if you have three elements in your URL, e.g. http://test.com/search?element1=a&element2=b&element3=c, and each element has 10 options, Google will eventually crawl 10x10x10 = 1000 pages. Overly dynamic URLs can create thousands of combinations of a URL very quickly, and each URL will be seen as a unique page by Google.
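The multiplication above can be sketched quickly; this is a minimal illustration (the parameter names and values are hypothetical):

```python
from itertools import product

# Three hypothetical filter parameters, each with 10 possible values.
elements = {
    "element1": [f"a{i}" for i in range(10)],
    "element2": [f"b{i}" for i in range(10)],
    "element3": [f"c{i}" for i in range(10)],
}

# Every combination of values becomes a distinct URL in a crawler's eyes.
urls = [
    "http://test.com/search?" + "&".join(
        f"{name}={value}" for name, value in zip(elements, combo)
    )
    for combo in product(*elements.values())
]

print(len(urls))  # 10 x 10 x 10 = 1000 distinct crawlable "pages"
```

Adding a fourth 10-option parameter would push this to 10,000 URLs, which is why the crawl budget disappears so quickly on faceted search sections.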
Most of these pages will have duplicated content (although different products in different orders). Depending on the way this section works, you may want to block the crawling of this search section using robots.txt.
I would also go to Webmaster Tools -> YOUR-SITE -> Configuration -> URL Parameters. From here you can advise Google what to do with each element.
Hope this helps!
Dan
Hi,
Using WordPress, I would recommend WordFence. If the DDoS attack is simply an attempt to overload your server with bogus requests, there is not a huge amount that can be done, as it acts in a similar manner to gaining a lot of traffic from, say, a marketing exercise.
But if the DDoS is attempting to hack into your site, there are a number of preventative measures the plugin takes to ensure it is not an easy task.
Firstly ensure all your plugins are up to date along with the WordPress build. Disable any plugins that you are not 100% sure of.
Upon installation of the WordFence plugin, I would highly recommend going to Options -> Login Security Options and changing:
"Lock out after how many login failures" and "Lock out after how many forgot password attempts" to 5 attempts max
"Amount of time a user is locked out" to 2 hours minimum
Also, by adding your email at the top of the options, you will be alerted when anything occurs on your site (including legitimate logins) so that you can make informed decisions.
Oh, and unless you are actually serving the site from your Mac OR are concerned that the attacks you have experienced are coming from your machine (with a DDoS, I would find that unlikely), malware software will not be helpful in this scenario.
Dan
I don't like bios