Thanks, Sha. It does not happen with other automated email from Moz such as ranking reports. It's only the email from the Q&A responses.
Best,
Christopher
When I try and open an email message from a Moz Q&A response, it hangs Microsoft Outlook for more than 30 seconds. Is anyone else having this problem?
I'm sure I can outrank the page with modest effort. However, that won't solve the problem. The sale price on the page gives value shoppers a false sense of the actual price, and the description that includes "discontinued" gives the impression the product is no longer in production. I need to get the page removed from their database or de-indexed from the SERPs.
A large online retailer in Europe used to sell a product that we sell in the US. They have not sold the product for more than a year but have not removed the item from their product listings. The price is marked down and the description says the product has been discontinued. They sell a very large number of items and have a high DA and this product listing ranks high in Google SERP.
As you can imagine, this causes significant problems for us. Potential customers are given the wrong price and are also being told that the product has been discontinued. I have sent numerous requests to the retailer asking them to delete the product from their database with no success. Is it possible to send a notice to Google requesting that this product page be de-indexed? Any other suggestions?
Best,
Christopher
In addition to EGOL's excellent comments, you will also have to compete with all the Amazon affiliates that sell your product, and even worse, they will create crappy ads on the web and YouTube showing your product. There are countless YouTube videos of Amazon products that are nothing more than a pan and zoom of an affiliate product set to elevator music.
I have a modest number of products that don't change often so I don't use a product data feed service, but I do use Volusion and some Volusion store owners with a large number of products do use these services. Here is a recent discussion that may prove helpful.
Best,
Christopher
Regarding 170 words, are you referring to the meta description or the actual content on a product page? If you are referring to the latter, you can (and should) have much more content than 170 words.
Best,
Christopher
It's perfectly natural for each page on a site to have a different keyword. Write good content for each page and both users and Google will give you two thumbs up.
Best,
Christopher
Is it OK to split a key phrase into a slug and file name, or should the entire key phrase be in the file name? For example, consider the following articles:
How to wash your car.
How to change a tire.
How to replace a windshield wiper.
Will search engines recognize the "how to" in the following taxonomy:
www.domain.com/how-to/wash-your-car/
www.domain.com/how-to/change-a-tire/
www.domain.com/how-to/replace-a-windshield-wiper/
Or, should the "how-to" be included in the file name?
Best,
Christopher
I am finishing up a project to move a site that is two years old from HTML to WordPress and from an old domain to a new domain. Although the URLs are not identical, there is pretty much a 1 to 1 mapping using 301 redirects in the .htaccess file. I also informed GWT of the move. It's a bit early, but so far, I have not noticed any drop in traffic.
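For anyone doing a similar move, the per-URL 301s in the old domain's .htaccess look roughly like this (the page names and new domain here are placeholders, not my actual URLs):

```apache
# Map each old HTML page to its new WordPress permalink on the new domain
RewriteEngine On
RewriteRule ^about-us\.html$ http://www.newdomain.com/about/ [R=301,L]
RewriteRule ^old-products\.html$ http://www.newdomain.com/products/ [R=301,L]
```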
But here's the best part. The site looks much, much better, and it is also responsive (about 25% of our viewers are mobile, and growing). As a result, people are viewing more pages and spending more time on the site. I expect Google will notice that.
If the rankings hold, I plan to update several sites that are 20 years old from HTML to WP.
Best,
Christopher
How did you build the video sitemap? Can you post a link to the video sitemap?
Best,
Christopher
Do you have an affiliate application process in place such that you can accept or reject applications?
Best,
Christopher
Thanks for the quick response. Is SEO one of the reasons your companies are BBB members? What are the PA for your BBB listings?
Best,
Christopher
Two questions about BBB.
1. The PA for some company listings is 0 and the PA for others is much higher. Is this because some companies have a link back to their listing and others do not?
2. I checked a random sample of companies with links from BBB to their company sites, but the links are not listed in OSE. Why is that?
Best,
Christopher
Excellent point on the anchor text. I reviewed a number of high ranking sites with DA of 95+ that use these links in the footer and they only use the site name in the anchor text.
Best,
Christopher
What are your thoughts on SEOs and web designers putting their own links in the footers of client sites?
When considering the actual number of links, this practice is perhaps even more common in online software suites (e.g. phpBB) and online services (e.g. Volusion eCommerce), neither of which uses nofollow. Regarding user experience, I'm fine with these links if they are in the footer, because there are times when I really do want to know how a particular site was built. In other words, that's what good links are for: taking the user where they want to go.
Best,
Christopher
Hi Dana,
" ... you are right, one of the fundamental questions I still have is how does a bot behave when it finds an orphaned page like one of these? Does it just revert back to the sitemap and move on? Does it automatically go back to the last non-dead-end page and move on from there? What does it do?"
Bots are not really like a single spider crawling the web that can get trapped when it enters an orphaned page with no back button. When a bot enters a site, it creates a list of all the internal pages linked from the home page. It then visits each page on that list, adding any newly linked pages it finds to the list. Each time it adds pages, it adds only new, unique pages, skipping duplicates, and it keeps track of which pages it has already visited. When every page on the list has been visited and no new pages are being discovered, all of the pages have been crawled.
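In other words, it's a standard breadth-first crawl with a list of seen pages. Here is a minimal sketch in Python (the toy site map is made up for illustration):

```python
from collections import deque

def crawl(start, get_links):
    """Breadth-first crawl: visit every page reachable from `start`,
    queueing newly discovered links and skipping duplicates."""
    seen = {start}          # pages already discovered (unique list)
    queue = deque([start])  # pages waiting to be visited
    order = []              # order in which pages were crawled
    while queue:
        page = queue.popleft()
        order.append(page)
        for link in get_links(page):
            if link not in seen:   # only add new, unique pages
                seen.add(link)
                queue.append(link)
    return order

# A toy site: /orphan has no outbound links (a "dead end"),
# but the crawler simply continues with the rest of its list.
site = {
    "/": ["/a", "/b"],
    "/a": ["/orphan", "/"],
    "/b": ["/a"],
    "/orphan": [],
}
print(crawl("/", lambda p: site.get(p, [])))  # → ['/', '/a', '/b', '/orphan']
```

Notice that the orphaned (dead-end) page does not trap anything; the crawler just moves on to the next page in its queue.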
Best,
Christopher
As you probably know, the shared sites are set up as sub folders within the main site. You need to make sure that Google sees these sub folders only through their respective domain names, and does not see them as sub folders within the main domain. Otherwise, Google will think the pages are duplicates. You can control this in your robots.txt file of the main domain (i.e. instruct Google not to look at those sub folders when crawling the main site).
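For example, if the shared sites live in subfolders of the main domain (the folder names here are hypothetical), the main domain's robots.txt could exclude them from the main site's crawl:

```
User-agent: *
Disallow: /client-site-1/
Disallow: /client-site-2/
```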
Best,
Christopher
Yes, I misread your post.
Best,
Christopher
Have you considered using Google Tag Manager? It's much cleaner, and the scripts run asynchronously to speed up page load times.
Best,
Christopher
I noticed that the SEOmoz bar was particularly slow yesterday to populate the PA/DA numbers. In addition, the fields are not left empty while the final numbers load; sometimes the numbers update a couple of times before the final numbers are written. So, when it's really slow, an intermediate number can be mistaken for the final answer. On the occasions the numbers look odd to me, I refresh the page, and that often resolves the issue. This is rare, however. Most of the time it works just fine.
Best,
Christopher
I'm not familiar with the A1 Sitemap generator, but regarding the sitemap protocol, there is a limit on the size of a single sitemap.xml file, so for large sites, the sitemap must be split into multiple sitemap.xml files. And, the protocol has a method for indexing these multiple sitemap.xml files. It's sort of like an index to an index. None of my sites exceed the sitemap file limit, so I don't know which sitemap generators use this approach, but I would guess many of them do.
Sitemap generators I have used include DMXZone which is a Dreamweaver plugin, and xml-sitemaps.com which includes a video sitemap generator.
Best,
Christopher
EDIT: PS: Your current sitemap looks fine to me.
There is a limit on the size of a sitemap, so to allow large sitemaps to be split into smaller ones, the sitemap protocol includes a sitemapindex. See "Using Sitemap index files (to group multiple sitemap files)" here: http://www.sitemaps.org/protocol.html. Of course, it's also possible to list the multiple sitemaps in the robots.txt file, but automated sitemap generators will likely use the sitemapindex feature so that the robots.txt file does not have to be modified as the size of the site changes.
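For reference, a sitemapindex file following the protocol looks like this (the example.com URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml</loc>
    <lastmod>2012-11-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>
```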
Best,
Christopher
Thanks, that's good to know. Just wanted to confirm I was not missing something.
Best,
Christopher
Thanks, that's another useful tool, but it does not address the links to which I'm referring.
If someone posts a link to my site (they don't use any FB tools, just a raw URL) I don't know of any way to track that. I see the hits on my site via GA and other tools, but all I know is the link came from facebook.com and that's all. I don't know of any way to see who posted a link to my site on FB. It would seem to me that many SEOs would want to know this. I'm surprised this has not been discussed before on SEOmoz. Unless I'm missing something, this seems like a huge hole in data collection, given that FB is so pervasive and links from FB are so common.
Best,
Christopher
Thanks, the URL Builder will come in handy for my posts. When other people post FB links to my site, apparently there is no way to track that.
Does FB or a third party offer an app to track the source of these links? Given the sheer scope of FB, this seems like a huge oversight. I can't believe someone has not solved this.
Best,
Christopher
The back-link is not available in Woopra either. I'm guessing FB obfuscates the link.
PS: On a parallel topic, a large and growing percentage of my organic searches are encrypted. So, I don't know a large fraction of the search terms people are using to get to my site, and due to this FB issue, I don't know a large fraction of the back links people are using to get to my site. It's like I'm going backwards at an accelerating pace.
Best,
Christopher
It was Olympic BMX rider Caroline Buchanan on November 13. But that is working backwards. Suppose I got a hit from FB and did not know why. What then? Given that social media is such a huge part of SEO, I'm surprised this data is not available, but apparently it isn't.
Best,
Christopher
Thanks for the quick response. I'm a GA novice, so maybe it's there and I don't see it, but all I see under Traffic Sources > Sources > Referrals is m.facebook.com for the source and l.php for the referral. I'm pretty sure I know where the traffic came from due to the date that an Olympic athlete posted, but how would I find that post using GA if I had not known the athlete? Does FB intentionally hide this information?
Best,
Christopher
Olympic athletes in track & field and BMX use our product in their training. When they discuss how they use our product on Facebook, we get a boost in traffic. However, all I see in programs like Woopra is facebook.com in the URL, but not the actual page. How do I determine which FB post with a link to our site is causing the increase in traffic?
Best,
Christopher
"If Google took action against your YouTube account, they may have taken action against your website for the same reason."
My thoughts exactly. Someone may have reported, or Google may have discovered, something on the site, and the YouTube account may have just been a tentacle. As Ryan suggests, can you provide more information?
Best,
Christopher
It is my understanding that search volume and keyword competition are independent. For example, many people purchase socks, but the potential for profit is relatively low, which limits the pricing competition. In contrast, awards for medical malpractice can be quite high, but the search volume for a rare medical procedure can be very low.
Best,
Christopher
You're right, the codes are unique. They are so small (4 characters) I did not recognize them as unique IDs.
Thanks for the info on the hostname filter as well.
Best,
Christopher
I recently discovered the Google Tag Manager and I am in the process of updating many of my websites with this feature. I am using Tag Manager to manage Google Analytics, Google Remarketing, Alive Chat, Woopra, etc. I have one question about how Tag Manager actually works.
As best I can tell, the Tag Manager code snippet that I insert into my web pages is the same for all my websites and does not include a unique ID. If that is the case, then Tag Manager must search all the URLs in the TM database to find a match. What is to stop someone else from adding some rules for my URLs to their containers? I expect Google has a method to ensure proper matching, but I'm not clear on how that is enforced.
Best,
Christopher
Is the code snippet for a Google Remarketing Tag specific to one domain, or will it collect the audience list for any webpage(s) of any domain?
Best,
Christopher
Thanks, Andy, that was my thinking too. I wanted to confirm before responding to Volusion technical support. Much appreciated.
Best,
Christopher
Affiliate links for Volusion ecommerce shops are of the form mydomain.com/?Click=XX, where XX is the affiliate ID. Volusion uses rel=canonical to point the affiliate links back to mydomain.com. Is this a good solution? I used iDevAffiliate for another online store, and their solution was to use 301 redirects to strip off the query string. Comments?
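For context, the rel=canonical approach just means the page served at the affiliate URL carries a head tag pointing back at the clean URL (mydomain.com is a placeholder here):

```html
<!-- In the <head> of the page served at mydomain.com/?Click=XX -->
<link rel="canonical" href="http://mydomain.com/" />
```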
Best,
Christopher
Thanks. I purchased the xml-sitemap generator with video option and it seems to do the trick.
Best,
Christopher
Do you have a favorite tool for generating video sitemaps?
Best,
Christopher
You should only seek links from quality sites. Quality sites typically won't link to your site for money, but they do want quality content, and you can pay a well-known expert in the field to write quality content for that site, and the article can include a link to your site. Everything should be relevant to your site (i.e. both the other site and the article). So, if you find a site you would like to have link to your site, rather than asking them for a link, ask them if they would post an article written by an expert in the field.
Best,
Christopher
Thanks for the quick response. I'm not sure I worded my question well. I'm not asking about follow versus nofollow or any other attribute that can be determined by crawling the web. I'm asking about user behavior. Does Google monitor user behavior to help determine page and domain rank?
Best,
Christopher
If a link to my site is posted on another site and many people start following that link and they stay on my site for a while, and I also have GA set up, does Google factor that information back into the Google rankings? In other words, are there ways that Google tries to use clicks and bounce rates and other user behavior to determine page rank and domain rank?
Best,
Christopher
Does Google look at meta tags in images?
Best,
Christopher
Can you please explain what it means to "be careful about your anchor text"? What is the difference between good and bad anchor text?
Best,
Christopher
Thanks for the info and the link to the testing tool.
To be fair, I should add that Volusion has added rich snippets for reviews, and that's certainly a step in the right direction. However, companies rolling out a new shop typically have products and no reviews, so the product snippets could be used from day one if they were implemented.
I did come up with a hack to add a schema.org description to each of my products. I'll continue to look for a hack that allows me to put meta data around all product features like price and availability, but product description will have to do for now.
Best,
Christopher
Last year I rolled my own online shop and I included schema.org microdata for all the products. My Google ranking continued to improve during that period, but I was doing a number of improvements, so I can't say how much was due to the microdata.
My store continues to grow, so I moved to an eCommerce solution. I opted to go with Volusion. Much to my surprise, about the only SEO feature they have implemented is SEO-friendly URLs. They have not implemented microdata, which is pretty surprising given that all their sites are geared toward products. I would think it would be very easy for them to wrap the product name, description, price, etc., in a product schema.
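For reference, the kind of markup I mean is a short schema.org Product block around fields the platform already renders (the product values here are placeholders):

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Product</span>
  <span itemprop="description">A short product description.</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="USD" />
    <link itemprop="availability" href="http://schema.org/InStock" />
  </div>
</div>
```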
I called CoreCommerce, a Volusion competitor, and they have not implemented microdata either.
Why are these large eCommerce providers ignoring microdata?
Are there eCommerce solutions that have implemented microdata?
Do large online retailers like Amazon and Buy use microdata?
Is there any data that shows the SEO benefits of implementing microdata for an online store?
Best,
Christopher
Thanks for the quick response. Yes, I have access to edit the .htaccess file.
I'll try using the [L] tag for the virtual domains.
Best,
Christopher
My web host account allows me to have multiple domain names. Internally, the first domain is the main domain, and the additional domains are virtual domains, but externally, the intent is for each domain to appear as a unique domain. When accessing a virtual domain, the server first processes the main .htaccess file, and then processes the .htaccess file for that virtual domain. I'm sure this is a common setup, and this is not unique to my web host.
Due to the main .htaccess file, references to virtual.com are rewritten as main.com/virtual. The web pages are displayed correctly, but of course, this rewrite is not what is desired. What is the common solution? For example, is there a conditional rewrite rule that says to ignore the rest of the rewrite rules in this .htaccess file?
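For example, the kind of conditional rule I have in mind (the domain name is a placeholder) is a host check in the main .htaccess that short-circuits the remaining rules with the [L] flag:

```apache
RewriteEngine On
# If the request is for the virtual domain, do nothing and stop here,
# skipping the rest of the main domain's rewrite rules.
RewriteCond %{HTTP_HOST} ^(www\.)?virtual\.com$ [NC]
RewriteRule ^ - [L]
```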
Best,
Christopher