Hi Keri,
Just wanted to check on this and make sure you got everything worked out.
Thanks!
Anthony
Hi Christine,
It could be an issue with the toolbar that's causing the PA of 1 to show up for the first (UK) URL you mentioned, but the other two are likely correct.
At the moment you have two distinct pages:
www.ldnwicklesscandles.com/scentsy-uk
and
www.ldnwicklesscandles.com/scentsy-uk/
that have the exact same (duplicate) content, and both of which likely have links pointing to them. The same goes for every page on your site, so you're definitely losing some ranking power.
All of your pages should either end with a slash or not end with a slash... it doesn't really matter which one you choose, it just needs to be consistent throughout the site. I checked your HTML and there are no canonical tags, so I'd recommend asking Squarespace to go ahead and add the first rewrite I posted earlier to your .htaccess file. I'm sure they've had many similar requests, so it shouldn't be too much of a hassle. Once that's finished you should be good to go!
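For reference, the canonical-tag alternative is a single link element in the <head> of each page pointing at whichever version you pick; for example, using the trailing-slash URL from above:

```html
<link rel="canonical" href="http://www.ldnwicklesscandles.com/scentsy-uk/" />
```

Either approach works, but a 301 also consolidates links for crawlers that ignore canonical hints, which is why I'd lean on the rewrite rule.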
Let me know if you have any other questions.
-Anthony
SEOmoz updates its Linkscape index approximately every thirty days. The last update was finalized on May 31, so if you made changes since then, you won't see that data in any SEOmoz tools/reports that pull from the index.
You're in luck though, the next index update is scheduled for Wednesday (June 27), so you should have a clearer picture of your site's inbound links then.
FYI:
The SEOmoz API / Linkscape Schedule
Google can take quite a while to update its index, and sometimes four months or more to reassess penalties for violation of their policies (spam / paid links). You might consider filling out this form, which could possibly give you a "fresh start" by devaluing the numerous inbound links with identical anchor text. I haven't had any experience with the results, but I've read it can expedite a return to pre-Penguin organic rankings and traffic. It's just an option; I'm not suggesting that you submit the feedback form without further research.
Hope this helps and good luck!
Anthony
Hi Nick,
I agree with you, deleting the pages and starting fresh is probably the best bet. Once they've been deleted and return a 404 code, I'd go ahead and have Google remove them from the index via the GWT URL removal tool.
I'd say the risk of having those in-bound links sticking around outweighs the reward that 301s might yield.
Good luck.
-Anthony
If your site is still serving both the URL with a trailing slash and a duplicate URL without it, that's definitely something that should be remedied as soon as possible, as your authority from external links is likely being split between the two URLs.
There are two ways to do this: 301 redirects and rel canonical tags. In my opinion, a site-wide 301 redirect rule is absolutely the best solution to this problem. The code below uses Apache's Rewrite Engine to redirect traffic and "link juice" to the URL with a trailing slash, which is what I would suggest.
Just add this to the .htaccess file located in the root directory of your domain.
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://yourdomain.com/$1/ [L,R=301]
It's possible that the first two lines are already present, in which case you should paste only lines 3 - 5 as they are above. Replace 'yourdomain.com' with your domain.
You can be sure that the rule is working correctly by going to 'yourdomain.com/page' and confirming that you're redirected to 'yourdomain.com/page/'.
I also noticed in your question that you used both 'WWW.yourdomain.com' and 'yourdomain.com' when describing your issue. If you're seeing both WWW and non-WWW versions of your pages in search results, it's possible that you have another duplicate content issue, potentially allowing Google to index four duplicate versions of the same page. It looks like you're good to go, though: I just checked, and all non-WWW URLs are redirecting to the WWW versions. Just make sure you add the WWW to the last line of code above (www.yourdomain.com).
For other potential readers:
Site-wide non-WWW to WWW 301 redirects can be implemented by adding:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [L,R=301]
The opposite (WWW to non-WWW) can be implemented by adding:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.yourdomain\.com [NC]
RewriteRule ^(.*)$ http://yourdomain.com/$1 [L,R=301]
Replace 'yourdomain.com' with your domain and omit Line 1 if it is already present.
It'll take a little time for Google to account for the changes, but if you're redirected when you do the test I mentioned, you can sit back and relax.
Hope this helps and good luck!
I've been developing in Magento for several years, and I can tell you without a doubt that it's better to keep the H1 tag wrapped around the actual product name on your product pages. This is the default on all pages except the home page, where the H1 is wrapped around the IMG ALT for the logo in the header. I prefer to move the H1 on the main page so that it wraps actual relevant text, but that requires a little bit of editing of the Magento core, and like everyone has said, heading hierarchy doesn't count for much in Google's algorithm these days.
The best way to add specific keywords for product-level pages in Magento is via the Short Description, General Description, and the Product Tags. Tags are likely the most effective solution, as this functionality was designed for customers to make notes about products, so "White Business Shirt" would look natural there and make perfect sense.
You might also try very minimal internal linking from your homepage if you have a specific product you're attempting to boost your SERP ranking for. Something like: Our best selling <a href="product-url">business shirt</a>.
To be honest, there are so many factors involved in correctly optimizing a Magento-based site that I'd recommend not wasting your time with H tags at all, especially masking and duplicating them.
You'll be much better off if you concentrate on optimizing your category hierarchy, writing unique product descriptions, dealing with duplicate content, configuring robot instructions, decreasing load times, etc...
Plenty to do... no need to mess w/ the H1 tag.
Hope this helps & good luck!
-Anthony
Me three...
Thinking out loud: Should he 301 a few of the existing pages with higher authority to the consolidated site, or do you guys think that would send an unnatural signal?
Totally forgot about the index updates! (Gotta learn to finish reading posts.)
Edit "argentdata.com/css/main.css" (changes in italics):
Line 107
#header #site-name a, #header #site-name a:link, #header #site-name a:visited,
#header #site-name a:hover, #header #site-name a:active {
text-decoration: none;
color: #CCC;
position: relative;
_ left: 100px; _}
Line 111
#nav {
font: bold 96% arial;
height: 2.09em;
margin: 0 105px 0 40px;
position: relative;
_left: 100px; _}
Line 151
#wrap {
min-width: 770px;
max-width:none !important;
margin: 0 auto;
position: relative; }
Line 152
#content-wrap {
position: relative;
max-width: 1200px;
_left: 100px; _}
This should match the root pages to the osCommerce pages.
-Anthony
Hi Keri,
This isn't a perfect fix but it should do the trick.
On line 377 of "argentdata.com/catalog/stylesheet.css" change max-width: 1200px to max-width: none !important (changed and added lines marked with comments):

#wrap {
min-width: 770px;
max-width: none !important; /* changed */
margin: 0 auto;
position: relative;
}

You could also just delete the max-width value altogether, but just in case there's another width value somewhere else... might as well change it.

On line 348 add position: relative; left: 100px;

#header #site-name a, #header #site-name a:link, #header #site-name a:visited,
#header #site-name a:hover, #header #site-name a:active {
text-decoration: none;
color: #CCC;
position: relative; /* added */
left: 100px; /* added */
}

Same thing on line 352... add position: relative; left: 100px;

#nav {
font: bold 96% arial;
height: 2.09em;
margin: 0 105px 0 40px;
position: relative; /* added */
left: 100px; /* added */
}
/* Hope this helps!
- Anthony */
I'd recommend using a Javascript Lightbox extension to display the certificate in JPG format. That will keep your users on the same page and allow them to zoom and pan over the COA without opening a new window. Free Lightbox extensions are available for most eCommerce platforms, and aren't too hard to implement.
Once the user purchases a product, I'd send the certificate to the buyer in PDF format so it can be easily printed, which I'm sure most customers would appreciate.
Hope this helps.
Thanks!
Anthony
The best way to handle this is via the URL Parameters Setting in Google Webmaster or a robots.txt file.
Google added this functionality to handle the exact issues you're describing, so there's no need to drastically change functionality, which would likely require editing core files in your CMS.
If you click on URL Parameters under Site Configuration in Google Webmaster Tools, you will find a list of query parameters, and for each one there are options that instruct Google how to handle these pages.
To do this:
1. Click Edit for the parameter you'd like to configure (i.e. course, cooking, etc).
2. In the dropdown menu, select "Yes: Changes, reorders, or narrows page content."
3. Choose the option that best describes how the parameter affects the page content.
4. Choose how GoogleBot should crawl these pages.
Dynamic websites are very common these days, and this tool is designed by Google specifically to handle parameters in the best possible way and allow Google to understand the URL structure of your site. The "don't have dynamic URLs" solution isn't a solution at all, as many modern features rely on dynamic URLs, such as layered navigation in Magento or other eCommerce platforms. How do you suggest filtering products by price, size, color, etc. without creating dynamic URLs? These features IMPROVE user experience and navigation. The text in the address bar isn't always the important factor when a user is navigating a site.
Don't overthink it.
Take advantage of the functionality and only de-index pages that are causing duplicate content problems. If you notice specific dynamic URLs are appearing in SERPs too often, then create a 301 redirect from that dynamic URL to a landing page with a more user-friendly URL.
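As a rough sketch of that kind of redirect in .htaccess (the parameter name and landing page here are hypothetical placeholders, not from your site):

```apacheconf
RewriteEngine On
# Send the dynamic URL /classes?course=cooking to a friendlier landing page.
# The trailing "?" in the target drops the query string from the redirect.
RewriteCond %{QUERY_STRING} ^course=cooking$
RewriteRule ^classes$ /cooking-classes/? [L,R=301]
```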
Hope this helps.
Anthony
Hi Jason,
If you've implemented the 301 redirect to the non-www version of your site, then you'll get credit for links that are pointing to the www version.
Open Site Explorer uses SEOmoz's Linkscape index to provide the data you're seeing, and it hasn't been updated since February. If you've done the 301 since then, OSE will still be showing the old data. Remember that OSE doesn't display real-time data for the site; it's based on the last index update.
The next update is scheduled for April 27th, so if you check back then you should see that all your links are pointing to the non-www version of the site.
You can check out the Linkscape Index update schedule here.
Hope this helps.
Anthony
Hi,
I think the reason you're getting this error is that your entire site is essentially "blocked" from agents.
It doesn't look like you've disallowed the path in the report via robots.txt, so I think it's an issue with the <head> section of your document not containing instructions for robots to index and follow the site.
The perfect <head> goes like this:

<title>Title of Page | Free iPad when you buy Cheap Viagra Online Pharmacy</title>
<meta name="description" content="Apple Tablet save money on the best online pharmacy price of Viagra erectile dysfunction ED treatment pill capsule sample." />
<meta name="keywords" content="apple steve jobs ed erectile dysfunction best price on viagra pills pharmacy cheap discounts cialis pill Viagra Pill Viagre Viagrah capusle Canadian Viagra save money ipad ipad2 new ipad free best website for viagra" />
Kidding!
I think it's just the last line you'll need to add that lets robots know to index and follow the page:

<meta name="robots" content="index, follow" />
You might possibly want to use the entire URL in the first line...

<meta http-equiv="refresh" content="0; URL=http://yoursite.com//shop/searchresult.seam" />
But I'm not positive about that. Like Red Clay said, the 0 second refresh could definitely be the problem too.
Hope this helps!
Anthony
Hi Pete,
Good question.
First thing, don't worry too much about keywords in your meta descriptions. Google will make them bold, which might draw some attention to your listing on the results page, but keywords in meta descriptions don't actually improve rankings.
Instead, use the meta description area to entice the searcher to visit your site. What sets you apart from the competition... do you offer free shipping?... any special promotions? Using the meta description as an informative snippet and a call to action is much more effective at increasing CTR (click-through rate) and conversions than packing it with keywords.
Secondly, some pages (like the shopping cart or other customer-specific pages) you don't really want showing up in Google, so it's best to instruct Google not to index these pages (either via the robots.txt file or the meta robots tag). If Google doesn't index the page, then there's no need for a meta description.
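For reference, the meta robots version is just one tag in the page's <head>; noindex keeps the page out of results, while follow still lets crawlers pass through its links:

```html
<meta name="robots" content="noindex, follow" />
```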
Lastly, I'd recommend investing the time in writing good Meta Descriptions for the pages that you anticipate will be receiving the most impressions and traffic, and use a default description of the site for the pages that will likely rarely appear in results.
Hope this helps!
Thanks,
Anthony
HI Pol,
You can check out the Linkscape Update Schedule here:
http://apiwiki.seomoz.org/w/page/25141119/Linkscape%20Schedule
Looks like April 27th is the day. You're not the only one holding your breath, I promise! But with the next index including over 100 billion URLs for the first time, I'm sure SEOmoz has a few kinks to work out.
Also here's a very in depth explanation of the reason for the delayed update, which was originally planned for April 6th:
http://www.seomoz.org/blog/linkscape-index-delay-explained
Hope this helps.
Thanks,
Anthony
Hi Guys,
I've got a friend / client / business associate whose website I helped develop. It's a three-letter dot-com, so good trust, and an eCommerce site, so lots of pages.
When I launched my new site about 6 weeks ago I put "Official IT Partner of MySite.com" in the footer. No keywords in the anchor text, just the domain URL...
There are no other external links like that on the site whatsoever, and I haven't been hit by Penguin. I'm ranking well for local targeted keywords a few weeks after launch, and traffic continues to increase...
I am worried that Google will see this as unnatural, but I've received no warning or experienced any decline in rankings. There are about 2800 pages linking from that site to my site, all in the footer of course.
Would it be better to remove the link from the footer and add it just to the home page and a couple of other high-authority pages, or should I leave it be? It's not "unnatural": I am affiliated with the site and work in partnership with it, but it does fit that profile.
I'm thinking about removing the footer link and adding a small graphic on the home page of the linking site which links to my root domain, with a couple of broad keyword anchored links in a description underneath that also link to relevant pages on my site...
What do you think?
2800 links w/ my URL as anchor text from high Domain Authority / Low Page Authority pages (the homepage and a few other pages have decent authority) to my root domain
OR
Three different links from one High DA/ High PA homepage (one image alt, two anchored w/ broad keywords) to three different pages on my site.
Again, there are no other site-wide external links on the domain, and I'm pretty sure I escaped the Penguin.
Looking forward to hearing the different points of view.
Thanks,
Anthony
I'm getting the same issue.
The problem seems to be a "referer exploit". I found this on a Google Groups forum:
From Google Groups (http://groups.google.com/group/Google_Web_Search_Help-UsingWS/browse_thread/thread/2219b91fc28ecb51?pli=1)
_"This sounds like the "referer exploit" where web-sites are hacked and files, such as .htaccess have been modified. When trying to access specific sites through a link in search results, the redirect occurs. When typing the URL manually into the address bar, the redirect does not occur. _
_Here is a discussion on this from the Internet Storm Center: _
_http://isc.sans.org/diary.html?storyid=5150 _
_These are discussions from Google Web Search Help: _
_http://groups.google.com/group/Google_Web_Search_Help/search?hl=en&gr... _
_This is a similar discussion in the Webmaster Help Group: _
_http://groups.google.com/group/Google_Webmaster_Help-Indexing/browse_... _
_And here is one from Ask Dave Taylor: _
_http://www.askdavetaylor.com/how_people_hack_apache_web_server_rewrit... _
_Hopefully, these documents will help you resolve the issue with your _
_web-site (and/or provide enough information for the web-hosting _
_provider to do so.) _
_Another user recently reported that it was necessary to scroll down _
_beyond the blank lines that had been inserted at the top of _
the .htaccess files, to find the code that was added. "
Hope this helps.
Hi Luke,
I don't believe that this would be considered over-optimization, but there's probably a better solution for your titles, for a couple of reasons.
I'd recommend XXXX Pizzeria | Pizza Restaurant in Birmingham AL and Coventry AL
XXXX = Brand Name
"XXXX Pizzeria is conveniently located near Birmingham and Coventry at 1234 Whatever St, Birmingham, AL. Delivery, Carryout, and Dine In available... so on and so on.
Hope this helps.
Thanks!
Anthony
Hi David,
Great question. I'm not sure if that's possible or not. I know there are a lot of options via Linkscape's API, but you're talking about Crawl Reports, and that's a separate issue entirely. I'd suggest asking the SEOmoz staff about it... I'd love to know myself.
To submit a ticket: https://seomoz.zendesk.com/home
Contact SeoMoz Help: help@seomoz.org
If it's not currently possible you might want to add it to the Request a Feature page.
Thanks,
Anthony
Two directions I'd go:
If you're looking to attract young writers:
Journalism, Freelance Writing, etc
If you're looking to attract readers:
News, Entertainment News, Opinions, Columns, Blog, etc.
You're looking at very broad keywords, and you're right, the individual article pages won't really require much keyword optimization. However, for your home page and root domain in general, you should do some research about what phrases readers or freelance writers search for when submitting or reading news, and concentrate on those.
Hope this helps.
Thanks,
Anthony