You're not giving the search engines different content, so it's not deceptive. I can't think of any way it would harm you.

Posts made by Kurt_Steinbrueck
-
RE: Session IDs and crawlers
I can't speak to the technical side of setting up session IDs; however, you can deal with the URL issue with canonical tags and by setting up URL parameters in Google and Bing Webmaster Tools. That should prevent the search engines from indexing every URL with a different session ID and keep all the page authority on the main URL.
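In case it helps, here's a minimal sketch of the canonical-tag side of that, assuming the session ID shows up as a query parameter. The parameter names below (sessionid, sid, phpsessid) are just common examples, not necessarily what your platform actually uses:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit

# Common session-parameter names; your platform's actual name may differ.
SESSION_PARAMS = {"sessionid", "sid", "phpsessid"}

def canonical_url(requested_url: str) -> str:
    """Strip session-ID parameters so every session variant of a page
    points its canonical tag at the same clean URL."""
    parts = urlsplit(requested_url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    query = urlencode(kept)
    return f"{parts.scheme}://{parts.netloc}{parts.path}" + (f"?{query}" if query else "")

def canonical_tag(requested_url: str) -> str:
    # Emit this in the <head> of the page.
    return f'<link rel="canonical" href="{canonical_url(requested_url)}">'

# canonical_tag("https://example.com/shoes?sid=abc123&page=2")
# -> '<link rel="canonical" href="https://example.com/shoes?page=2">'
```

That way every session-specific URL declares the clean URL as its canonical, and the parameter settings in Webmaster Tools reinforce the same thing.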
Kurt Steinbrueck
OurChurch.Com
-
RE: After reading of Google's so called "over-optimization" penalty, is there a penalty for changing title tags too frequently?
I wouldn't say that it would hurt your site, but it may confuse the search engines a bit and you may see a good bit of movement for a while.
That said, I think you might want to rethink what you're planning. You indicated that when you changed your title tags you got an increase in rankings and CTR. That's awesome! If the number of impressions decreased during a time when your rankings improved, then the decrease would, by definition, have to be due to a decrease in overall searches. Normally, when you rank higher, a greater percentage of people see your listing, so a ranking improvement leads to an increase in impressions if the traffic for the keyword remains constant. So, if you saw a decrease in impressions, it would indicate that search traffic dropped, and you can't improve that with a title tag change.
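To put some made-up numbers on that reasoning (purely illustrative, not your actual data):

```python
# Impressions are roughly: searches for the keyword x share of searches where your listing appears.
searches_before = 1000
appearance_before = 0.60                     # share of searches where you showed up before the change
impressions_before = searches_before * appearance_before   # 600 impressions

appearance_after = 0.80                      # rankings improved, so you appear more often...
impressions_after = 400                      # ...yet impressions fell

# The only way both can be true is if search volume itself dropped:
implied_searches_after = impressions_after / appearance_after   # 500 searches
print(implied_searches_after)
```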
Kurt Steinbrueck
OurChurch.Com
-
RE: 302 redirect used, submit old sitemap?
You're right that the search engines are treating the new pages like...well...new pages. It has nothing to do with how much code has changed and everything to do with the fact that they simply have new URLs.
I agree with Alan. Two weeks isn't a terribly long time. Obviously, it's best to have all your ducks in a row from the start, but I think there's a good chance that, just from setting up the proper redirects for the pages, the site's authority should now transfer, though it may take a few weeks and you may not get completely back to where you were.
As far as submitting the sitemap for all the old pages, I'm not sure what that would do. It's possible it may do exactly what you want, basically telling Google about all the redirects, but then again, Google may think it's a bit odd to see a sitemap full of redirected pages.
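For what it's worth, "proper redirects" here means permanent (301) redirects rather than the temporary 302s. A rough sketch of the idea, assuming a Python/Flask-style setup (your platform and URL paths will obviously differ):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical old-URL -> new-URL mapping; use your site's real paths.
REDIRECTS = {
    "/old-about.html": "/about/",
    "/old-services.html": "/services/",
}

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    target = REDIRECTS.get("/" + old_path)
    if target:
        # 301 (permanent) tells search engines the move is for good and to
        # transfer the old URL's authority; 302 (temporary) generally does not.
        return redirect(target, code=301)
    return "Not Found", 404
```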
-
RE: Do you have to pay Yext at this point?
Thanks Miriam and Vadim,
Thank you for your responses. My question was a reaction to seeing Yext go from being one way to distribute local info to sites, to seeing sites like American Towns and Local.com have their "claim your listing" option take you to Yext. Maybe I clicked on the wrong link on Hotfrog, but I thought they had gone over to the Yext dark side, too. It's not that these sites are super important; I just like to get as many of the local players as possible to have the correct info. Seeing multiple local listing sites move to Yext seemed like a trend, but I'm now thinking my question was an over-reaction.
Also, thanks for pointing me to David Mihm's post. I had missed that.
Thanks!
Kurt
-
RE: Link Juice + multiple links pointing to the same page
Remus's answer is good. I would add that Google has a "first link" filter. If you have two links pointing from page A to page B, Google only passes link authority (PageRank) and reputation (keywords in the anchor text and relevant surrounding text) through the first link that appears in the code. The second link does not pass anything. So, whatever the anchor text of the first link in the code is, that's the anchor text Google is going to use (Remus is right that anchor text has become less important).
The second link does, however, still dilute the amount of PageRank passed. So, like Remus pointed out, each link in your scenario only passes 20% of the PageRank. Since Google ignores the second link to the shoe page, that 20% of PageRank does not get passed. I'm not sure if it stays on the page or just gets lost.
So, what does this all mean? From an SEO standpoint, you want the link with the targeted keyword to be first in the code if you have more than one link to a page. Also, you don't really want to have two links to the same page on that one page. Now, that's from an SEO perspective. From a user perspective, it may make perfect sense to have that second link and the page may convert better. So, you'd just have to decide which is more important...and it's probably the user perspective that's more important.
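To put rough numbers on that, treating the scenario as five links on the page, two of which point at the shoe page:

```python
# Back-of-the-envelope illustration of the dilution point above; Google's actual
# PageRank math is more involved, but the proportions are the idea.
links_on_page = 5
share_per_link = 1 / links_on_page        # each link gets 20% of the passable authority

passed_to_shoe_page = share_per_link      # only the first link to the shoe page counts
wasted_on_duplicate = share_per_link      # the second link's 20% is effectively lost

print(f"Shoe page receives {passed_to_shoe_page:.0%}; duplicate link wastes {wasted_on_duplicate:.0%}")
```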
Kurt Steinbrueck
OurChurch.Com
-
Do you have to pay Yext at this point?
Over the past several months it seems more and more local listing sites are now using Yext for their listing information. Some of these include Local.com, American Towns, Hot Frog, etc. I'm not even seeing a way to claim listings anymore with these sites without going through Yext.
If Yext has the wrong information, is there any way to correct these listings without paying Yext? I used to be able to claim listings with the actual listing sites. It was more labor intensive, but I didn't have to pay Yext $500/year. I could pay an assistant a lot less and they could do it. It seems that option is going away.
Do any of you know of another way of correcting listings without using Yext (or at least without paying Yext)? If not, do you know if Yext has an enterprise solution for SEOs so we don't have to pay the $500 for every client?
Thanks.
Kurt Steinbrueck
-
RE: How can I block incoming links from a bad web site ?
Hi Yiannis,
As far as I'm aware, there isn't really a way to "block" a link. The link lives on the other site, and returning a 404 for the page being linked to doesn't change the fact that there are 100K links from one site pointing at your site. The only options I'm aware of are to 1) contact the owner of the website and ask them to remove the links, and 2) if that doesn't work, disavow the links.
I understand your hesitancy to use the disavow tool, but quite frankly, this is exactly what it is intended for.
If you feel comfortable with the links being there and think Google has already dealt with them, then do nothing, but if you want to do something about the links, you either have to get them removed or disavow them.
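If you do go the disavow route, the file itself is simple: plain text, one entry per line, with "#" lines as comments. A minimal sketch of generating one (the domain below is a placeholder for the actual site linking to you):

```python
# Minimal sketch of building a disavow file for Google's Disavow Links tool.
# "spammy-links-example.com" is a placeholder; use the actual linking domain.
bad_domains = ["spammy-links-example.com"]

lines = ["# Contacted the site owner; links were not removed"]
lines += [f"domain:{d}" for d in bad_domains]   # "domain:" disavows every link from that site

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

# Upload disavow.txt through the Disavow Links tool in Google Search Console.
```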
BTW - My understanding of partial manual actions is that oftentimes Google not only deals with the suspicious links (devaluing them), but also penalizes the pages/keywords it thinks you were attempting to manipulate. So, just because it was a partial action and not a full site action doesn't mean it's not affecting some of your rankings. It's just not going to affect all your rankings for all your pages.
Kurt Steinbrueck
OurChurch.Com