Sorry, I can't give you a direct answer. I would suggest running a test: upload a new one to see if it overwrites the old data. Then maybe go on to a small group to see how that works. If everything looks good... upload away!
Posts made by STPseo
-
RE: Google Places Local Submission
-
RE: Capital Letters in Title Tag, or not?
I don't think Google will treat a capitalized keyword any differently for ranking and such. In this case I would really focus on what looks best to the customer.
Remember, your first priority should be usability, and small tricks usually don't work.
-
RE: How to change a url for google analytics account
Is it a new domain? You could try using the same tracking code for the new domain. If that doesn't work you may have to start new tracking. The old domain's info will still be around if you need to compare it to the new one.
-
RE: Opinion regarding SEO Sabotage
In this case you may want to shoot Google a note with some examples and see what they say. At the very least you'll have some documentation that it was something you were worried about.
-
RE: Corporate Client Won't Approve Landing Pages (of any sort)
Landing pages... or rather doorway pages... are a no-no. You should move away from them altogether if you are trying to rank. Packing a page with keywords to get it ranking can get you penalized and dropped from Google.
Instead, make a site that includes relevant content, and work your relevant keywords into that content. Make your page titles and page descriptions relevant to the content on the page and the keywords included. Organize the site in a logical manner and use H1, H2, and H3 tags, but not too heavily.
That's it in a nutshell! Hope it helps!
-
RE: Why are old versions of images still showing for my site in Google Image Search?
I don't think it would be too bad. A cleaner name would be better but having the new images show is important too. More important in my opinion.
-
RE: To block with robots.txt or canonicalize?
I think the canonical idea is better than blocking the pages altogether. Depending on how the site is laid out, you may try to make the pages more specific to the location being talked about: maybe adding header tags with the location information, as well as adding that info to the page title and meta description. If it is not too time consuming, I'd try to make those pages more unique, especially since you might be getting searches based on a location. Location-specific pages may help in that regard.
-
RE: Google Maps results doesn't show my site url but rather the maps url, why is this?
Hi.
Maybe it just takes a bit to cycle through and update. I see the URL now when I search for "movers novi mi".
<cite>www.mynovi****movers.com</cite>
-
RE: Non existant URLs being generated in index
If you search for this in Google: site:www.applicablejobs.com
You'll see 43 URLs and none of the bad ones.
-
RE: Non existant URLs being generated in index
Okay. Well in that case I cannot speak to why they are happening in the first place. To keep them out of the index you could exclude the entire /jobs/ directory using the robots.txt. If the /jobs/ directory is needed then you'll have to track down the source of the URL generation. Sorry I can't be of more help.
-
RE: Use of the tilde in URLs
We use tildes pretty heavily on our new site. They seem to be okay with Google. However, I did not want to use them because some foreign keyboards, like Mexico's, do not include the character.
So... do folks in Mexico type in our URLs by hand? Probably not common... but it is a potential problem. It is missing from other keyboards as well.
We use the tilde because we think it helps break up words we do not want to be seen as "together" in a string. All my product URLs have the product name, separated by dashes, then the tilde, then the product number. We think it may help Google see the product title as a complete string that does not include the product number. Not sure if it works or not.
-
RE: 400 errors and URL parameters in Google Webmaster Tools
The easiest way would be to add a disallow line to your robots.txt file.
From Google:
- To block access to all URLs that include a question mark (?) (more specifically, any URL that begins with your domain name, followed by any string, followed by a question mark, followed by any string):

```
User-agent: Googlebot
Disallow: /*?
```

More info: http://www.google.com/support/webmasters/bin/answer.py?answer=156449
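A quick way to sanity-check wildcard rules like that is to translate a Disallow pattern into a regular expression and test URLs against it. This is only a sketch of the documented matching behavior; the translation function and sample URLs are my own, not from Google:

```python
import re

def disallow_to_regex(pattern):
    # Translate a Googlebot-style Disallow pattern to a regex:
    # '*' matches any run of characters, a trailing '$' anchors the end.
    # This translation is my own sketch of the documented behavior.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + regex + ("$" if anchored else ""))

rule = disallow_to_regex("/*?")
print(bool(rule.match("/search?q=shoes")))   # True  (would be blocked)
print(bool(rule.match("/search/shoes")))     # False (would be crawled)
```

Note that Python's standard-library `urllib.robotparser` follows the original robots.txt draft, so checking wildcard extensions by hand like this can be a useful cross-check.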
-
RE: Collapsible FAQ guides
I agree: if you can see the content in the code and it is available to the customer, I think you will be okay. I think hidden content means content someone on the outside cannot see or access.
-
RE: How important are unique titles and descriptions?
It is very important. We also have dynamic pages... a ton of them. However, we found a solution by bringing other data already on the page into the description and title. Let's say each page offers a different vacation package. See if the name of the package can be imported into the title and description. Since the data is already there... it can be done.
Come up with a strategy that works with the data you have. For example, with our product pages we did something like: [product title] - save 35 - 70% at Sierra Trading Post.
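As a sketch of the template idea above (the function names and exact wording are made up for illustration, not our actual implementation):

```python
def build_title(product_title, site_name="Sierra Trading Post"):
    # Pull the product title (data already on the page) into the title tag.
    # The discount wording is just a placeholder.
    return f"{product_title} - save 35 - 70% at {site_name}"

def build_description(package_name):
    # Same idea for a hypothetical vacation-package page.
    return f"Explore the {package_name} vacation package - details, dates, and pricing."

print(build_title("Merino Wool Socks"))
# Merino Wool Socks - save 35 - 70% at Sierra Trading Post
```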
-
RE: Non existant URLs being generated in index
Is your domain "www.applicablejobs.com"? If not, it sounds like you may have been hacked and someone added a code snippet to your website. I host some personal sites on Network Solutions, and one day I found a strange code snippet on just about every page of the sites I run. After removing the code I had to upload every page again, but only after changing all my passwords.
As for removing them? Google has a tool to remove them. However if this is not your domain - you may want to email Google and inform them of the malicious happenings.
-
RE: Why are old versions of images still showing for my site in Google Image Search?
Google may have the images cached and not realize you have new ones. One rule of thumb I try to promote is to always use unique image names. You may find your customers also see old images if an image of the same name already resides in the browser cache. I encourage our marketing folks to add a date stamp to the end of an image name so it will be seen as new. Versioning will work as well.
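A minimal sketch of the date-stamp idea (the function name and naming scheme are my own suggestion, not a standard):

```python
import datetime

def stamp_image_name(filename, date=None):
    # Append a date stamp before the extension so browsers and crawlers
    # treat the file as new, e.g. hero.jpg -> hero-20120115.jpg.
    date = date or datetime.date.today()
    stem, dot, ext = filename.rpartition(".")
    stamp = date.strftime("%Y%m%d")
    return f"{stem}-{stamp}{dot}{ext}" if dot else f"{filename}-{stamp}"

print(stamp_image_name("hero.jpg", datetime.date(2012, 1, 15)))  # hero-20120115.jpg
```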
-
RE: Root or subfolder campaign for e-commerce
I would campaign on both, I guess. You want to build your main brand while also increasing awareness of the brands you carry.
In SEOmoz you can enter the root, but your whole site will be tracked. Once you decide what keywords you are trying to rank on, keep your eye on which page is being ranked and go from there. So... if you track the keyword "cruffy" and you rank in the top 50, see which page is ranking. Maybe it is your homepage, or a category page, or even a specific product page.
I would concentrate on the categories to promote those specific brands. However, don't ignore your homepage, and remember to build your site name up as well.
So... get your site set up in a good, logical, simple structure. Then have SEOmoz crawl it, fix any errors, and consider their suggestions. Enter the words you are trying to rank on and see how you do. Make adjustments to try and improve ranking.
Anyway... SEOmoz will crawl your entire site including all folders. You will have to track subdomains separately, but not subfolders.
-
RE: Root or subfolder campaign for e-commerce
If you also care about the branding of your root domain (store name), you would want to use your root domain with subfolders off of it. If you add more brands and make separate domains for each, you'll find yourself managing multiple websites.
As for using the competitor analysis tool in SEOmoz, you can only use the root domain - www.competitor.com.
Also if you can take the "all clothing" layer out... that would be beneficial to your SEO.
Hope this helps.
-
RE: If we add noindex to a subdomain, will the traffic to that subdomain still generate domain authority for the primary domain?
Not totally sure. You may find it will help the root domain if the sub-domain links back to the root domain and you still let the pages get followed.
-
RE: Solving Printer Friendly version
In summary: canonical the print-friendly page back to the main page. If you don't want to use the canonical and want to keep it as its own page, make sure you have a unique title tag and description - you could add "Print Version" to them.
I'd recommend using the canonical, however, as that will probably help page A and you won't have A and B competing with each other.
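For reference, the canonical tag on the print-friendly page (page B) would look something like this, with placeholder URLs:

```html
<!-- In the <head> of the print-friendly page (B),
     pointing back to the main page (A). URLs are hypothetical. -->
<link rel="canonical" href="http://www.example.com/page-a/" />
```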
-
RE: Meta Description Template
Hi. You want to make sure they are unique and the interesting stuff is up front. These descriptions can help with click-through, as that is what the searcher will first see after the title.
I'd suggest something like: [blog title] - [blog category] - by [blog author] - [blog date]
This will keep them unique, and if it gets truncated the important stuff is up front. With this in mind, you could tell the bloggers to make sure they have relevant and catchy blog titles to garner that click!
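A rough sketch of that template as code (the field names, separator, and the 155-character cutoff are assumptions; the cutoff is a common rule of thumb, not a Google rule):

```python
def build_meta_description(title, category, author, date, limit=155):
    # Front-load the interesting parts; truncate to roughly what
    # search results tend to display.
    desc = f"{title} - {category} - by {author} - {date}"
    return desc if len(desc) <= limit else desc[:limit - 3].rstrip() + "..."

print(build_meta_description("10 Layering Tips", "Outdoor Gear", "Jane Doe", "2012-02-01"))
# 10 Layering Tips - Outdoor Gear - by Jane Doe - 2012-02-01
```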
-
RE: 61.45% Shopping Cart Abandon Rate - How to troubleshoot?
A couple of things we looked at when it comes to abandoned carts were shipping and tax. We'd often get comments that our shipping rates were too high. If you wait until the last minute to show customers these additional costs... you can expect more carts to be abandoned.
Also - if you run promos you want to make sure the customer can see any discount. If there are problems with a promo or if you apply the discount after the order is finalized, the customer may think it is not working and will abandon the cart.
Another thing: if you suspect orders are being dropped during authorization, try to pull the data from the cart as soon as the button is clicked, if possible. Make this a server-side collection so you don't need to wait for or rely on any third-party connection. Then you can compare your data with the final order data to make sure they are all going through.
Hope this helps!
-
Not using a robot command meta tag
Hi SEOmoz peeps.
Was doing some research on robot commands and found a couple major sites that are not using them.
If you check out the code for these:
http://www.zappos.com/product/7787787/color/92100
You will not find a meta robot command line.
Of course you need the line for any noindex, nofollow, noarchive pages.
However for pages you want crawled and indexed, is there any benefit for not having the line at all?
Thanks!
-
RE: How best to optomise a .co.uk to rank on google.com.au
I think that you have to have a physical business presence in Australia to be able to get an .au domain name. I know that we have not been able to buy one.
-
RE: New, Used, Refurbished Ecommerce Products
Yeah... I think that is the best approach, as people search for the fruit first and foremost and would probably then decide on refurbished or not.
-
RE: High number of items per page or low number with more category pages?
In this case I would focus on usability first. You may find that you help SEO but your conversion rates go down. I would do A/B testing with both formats and use the one that performs best. Customer usability is more important in my opinion, so find that out first and optimize the best-performing layout.
-
RE: Site speed - query
There are quite a few factors that can be measured: initial response time, how long it takes the page to render, how long it takes the entire page to load, and many others.
Here is a tool we use that may get you started at being familiar with what can be measured and hopefully improved.
This tool can measure any page on your site and you can also plug in competitor URLs to measure yourself against them. http://www.webpagetest.org/
I get a median of 5 runs because load time can fluctuate depending on location, time of day, and the general mood of the universe.
-
RE: May not have a /path after the host
I think you are limited to a root domain or subdomain and not specific directories. You can only have www.competitor.com or subdomain.competitor.com.
-
RE: Duplicate content on video pages
The first thing would be to ensure the meta data - page titles and page descriptions - is unique. It's hard to give you an example without knowing more about your site, but if you have a page that is "Subject 1" you could have the corresponding video page be "Video of Subject 1". Same for the page description. If you make those unique, that will help.
You can also consider some type of pagination: "Video 1", "Video 2", etc. or "Video - pg 1", "Video - pg 2", etc.
-
RE: RSS feeds- What are the secrets to getting them, and the links inside then, indexed and counted for SEO purposes?
We use an RSS feed for new product lists. We may have some lag time before a new product gets put into a category and becomes browsable on our site. The RSS feed gives us a few days' head start getting these new products into the search engines. We redirect all RSS links back to the main site links, which include canonical tags for the main product pages.
-
RE: New, Used, Refurbished Ecommerce Products
I would focus on the fruit. I'd have fruit categories and put the refurbished at the top of that structure. People are searching for the fruit first and foremost, not "refurbished" in general. I would not canonical the new product URLs back to the refurbished ones. Keep them pure. However, to push the refurbished fruit you could add links and advertising on the new fruit pages pointing back to the refurbished fruit pages.
I would think you would want to link build back to the category unless the content is really specific to a model. I am thinking the category will be more permanent, and you wouldn't want a link back to a product that is sold out or no longer available.
Hope this helps - good luck!
-
RE: Robot.txt pattern matching
John, the article was a real eye-opener! Thanks again!
-
RE: Robot.txt pattern matching
Great point! I will remember that. However, I have both the disallow line in the robots.txt file and the noindex meta command. Yet Google shows 3,000 of them!?
http://www.google.com/search?q=site%3Awww.sierratradingpost.com+keycodebypid
-
RE: /$1 URL Showing Up
If you can't find them, you could put a disallow in your robots.txt file to keep them from being crawled.
-
Robot.txt pattern matching
Hola fellow SEO peoples!
Site: http://www.sierratradingpost.com
robot: http://www.sierratradingpost.com/robots.txt
Please see the following line: Disallow: /keycodebypid~*
We are trying to block URLs like this:
http://www.sierratradingpost.com/keycodebypid~8855/for-the-home~d~3/kitchen~d~24/
but we still find them in the Google index.
1. We are not sure if we need to specify that the robot should use pattern matching.
2. We are not sure if the format is correct. Should we use Disallow: /keycodebypid*/ or /*keycodebypid/ or even /*keycodebypid~/?
What is even more confusing is that the meta robot command line says "noindex" - yet they still show up. <meta name="robots" content="noindex, follow, noarchive" />
Thank you!
-
RE: Best approach to launch a new site with new urls - same domain
Thank you very much for the insight!
-
RE: Best approach to launch a new site with new urls - same domain
Sorry this is so confusing, and thank you so much for your responses... there would be no subdomain when we do the soft launch... it would be http://www.sierratradingpost.com/Mens-Clothing.html (old site) vs. http://www.sierratradingpost.com/mens-clothing~d~15/ (new site)...
-
RE: Best approach to launch a new site with new urls - same domain
We do plan to do that... it is just that since we plan a soft launch, we will essentially have 2 sites out there. We are wondering when to remove the noindex from the new site. We will have 2 sites for about a month... should we let the bots crawl the new site (new URLs, same domain) only when we take down the old site and have the 301s in place, or let Google crawl earlier to give the new site a head start on indexing?
-
RE: Best approach to launch a new site with new urls - same domain
We would drop the subdomain - so we would have 2 "Men's Clothing" department pages - different URLs, slightly different content...
-
RE: Best approach to launch a new site with new urls - same domain
But the URL structure is different... does that matter?
-
RE: Best approach to launch a new site with new urls - same domain
Since all of the URLs except for the homepage are different - what do you think about letting the new site get crawled maybe 2 weeks before it is 100% launched? We would have some duplicate content issues, but I am hoping this would give us a head start with the new site... then when we go 100% we add the 301s and the new sitemap. It is my understanding we will be dropping the subdomain for the soft launch.
Thank you so much!
-
RE: Best approach to launch a new site with new urls - same domain
So with the service - the new site is not crawled until we launch it?
-
RE: Best approach to launch a new site with new urls - same domain
The new site is beta.sierratradingpost.com, where we will be dropping the beta. The old one has catalog departments, i.e. Men's Classics, which, at this time, are not being carried over to the new site. I guess we are wondering when we should allow the robots to crawl the new site?
-
Best approach to launch a new site with new urls - same domain
We have a high volume e-commerce website with over 15K items, an average of 150K visits per day and 12.6 pages per visit. We are launching a new website this spring which is currently on a beta sub domain and we are looking for the best strategy that preserves our current search rankings while throttling traffic (possibly 25% per week) to measure results.
The new site will be soft launched as we plan to slowly migrate traffic to it via a load balancer. This way we can monitor performance of the new site while still having the old site as a backup. Only when we are fully comfortable with the new site will we submit the 301 redirects and migrate everyone over to the new site. We will have a month or so of running both sites.
Except for the homepage the URL structure for the new site is different than the old site.
What is our best strategy so we don’t lose ranking on the old site and start earning ranking on the new site, while avoiding duplicate content and cloaking issues?
Here is what we got back from a Google post which may highlight our concerns better:
Thank You,
sincerely,
Stephan Woo Cude
SEO Specialist