Zip Code Blocks the Search Engines!
-
I have a site where, when you visit the product pages, it asks for your zip code. This is obviously blocking the bots from crawling the site.
I know you can basically let the bots bypass the zip-code feature, but I am not exactly sure how to do this.
Any help would be appreciated.
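One common pattern for this situation is to keep the full product content in the HTML and treat the zip-code prompt as an optional overlay shown only to visitors who haven't supplied a zip yet. Crawlers send no cookies, so they never see the gate. A minimal sketch of the server-side decision (all function and cookie names here are illustrative, not from the original site):

```python
def should_show_zip_prompt(cookies):
    """Show the zip-code overlay only when the visitor has not
    already supplied a zip code."""
    return "zip_code" not in cookies

def render_product_page(product_html, cookies):
    # The full product content is always rendered, so crawlers
    # (which send no cookies) still see everything; the prompt
    # is purely additive.
    page = product_html
    if should_show_zip_prompt(cookies):
        page += '<div class="zip-overlay">Enter your zip code</div>'
    return page
```

The key design choice is that the prompt never replaces the content; serving different content to bots than to users would risk being treated as cloaking.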
-
You can also use a private Q&A question credit to ask this in private Q&A. It can take a few days for a response, but your question will be confidential, not indexed in Google, and only viewable to SEOmoz staff and associates -- and we're all under NDA to not discuss the contents.
-
I know that may help, but unfortunately I cannot post their site.
-
Kindly post a link to your site and I will try to help.
Related Questions
-
Which search engines should we submit our sitemap to?
Other than Google and Bing, which search engines should we submit our sitemap to?
Intermediate & Advanced SEO | NicheSocial
-
Where does Movie Theater schema markup code live?
What I am trying to accomplish:
I want what AMC has. When searching Google for a movie at an AMC near me, Google loads the movie times right at the top of the first page. When you click a movie time, it links to a pop-up window that gives you the option to purchase from MovieTickets.com, Fandango, or AMC.com.

Info about my theater:
My theater hosts theater info and movie-time info on its website. Once you click the time you want, it takes you to a third-party ticket fulfillment site via a subdomain that I have little control over. Currently, Fandango tickets show up in Google like AMC's, but the option to buy on my theater's site does not.

Questions:
Generally, how do I accomplish this? Does the schema code get implemented on the third-party ticket purchasing site or on my site? How can I ensure that the Google pop-up occurs so that users have a choice to purchase via Fandango or on my theater's website?
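Showtime listings like AMC's are driven by schema.org ScreeningEvent markup, which lives on the theater's own pages (the ticket site only needs to be referenced in the offer URL). A hedged sketch of the JSON-LD, built here in Python so the structure is easy to inspect; every value is a placeholder, not data from the actual theater:

```python
import json

# Illustrative ScreeningEvent markup; all names, dates, and URLs
# are placeholders to show the shape of the JSON-LD.
screening = {
    "@context": "https://schema.org",
    "@type": "ScreeningEvent",
    "name": "Example Movie - 7:30 PM Screening",
    "startDate": "2024-06-01T19:30:00-05:00",
    "workPresented": {"@type": "Movie", "name": "Example Movie"},
    "location": {
        "@type": "MovieTheater",
        "name": "Example Theater",
        "address": "123 Main St, Anytown",
    },
    "offers": {
        "@type": "Offer",
        "url": "https://example.com/tickets/example-movie",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the serialized result on the theater's own page inside a
# <script type="application/ld+json"> tag.
json_ld = json.dumps(screening, indent=2)
```

Whether Google shows your theater's purchase option alongside Fandango's is ultimately Google's call; valid markup on your pages is the prerequisite, not a guarantee.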
Intermediate & Advanced SEO | ColeBField
-
Google Search Console > Security Issues
Hi all,

Admin: please feel free to remove this or add it to any existing post; I have searched the community for any similar questions.

While checking Google Search Console, under the lone "Security Issues" section I have found Google pointing out specific pages of our website with the message "Content injection - These pages appear to be modified by a hacker with the intent of spamming search results." The Learn More link takes us to https://developers.google.com/webmasters/hacked/docs/hacked_with_spam?ctx=SI&ctx=BHspam&rd=1

We've never injected spam code and haven't been injected with any spammy code, so what baffles me is why Google would pick this up when we have made it very clear to them that our code is secure and not hacked.

Has anyone received a similar message and had any luck removing it correctly? Thanks in advance!
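Injected spam is often cloaked, meaning it is only served to search-engine user agents, so a site can look clean in a normal browser while Googlebot sees extra links. One way to check is to fetch a flagged URL once as Googlebot and once as a regular browser, then diff the outbound links. A minimal stdlib sketch of the diff step (the fetching itself is omitted; the HTML strings here stand in for the two responses):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href found in anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def injected_links(clean_html, served_html):
    """Return links present in the served page but absent from the
    known-good copy -- candidates for injected spam."""
    clean, served = LinkExtractor(), LinkExtractor()
    clean.feed(clean_html)
    served.feed(served_html)
    return sorted(set(served.links) - set(clean.links))
```

If the diff is empty for both user agents, the next suspects are server logs and recently modified files, since Search Console rarely flags content injection without cause.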
Intermediate & Advanced SEO | SP1
-
Is this organic search sketchiness worth unwinding?
Started working on a site and learned that the person before me had done a fairly sketchy maneuver, and I am wondering if it's a net gain to fix it.

The site has pages that it wanted to get third-party links pointing to. Thing is, the pages are not easy to naturally link to in order to boost them in search. So, the woman before me started a new blog site in the same general topic area as the first/main site. The idea was to build up even the smallest bit of authority for the new blog without tipping Google off to shared ownership. So, the new blog has a different owner/address/registrar/host and no Google Analytics or Webmaster Tools account to share access to.

Then, as one method of adding links to the new blog, she took some links that originally pointed to the main site and redirected them to the blog site. And voila! ...a totally controllable blog site with a bit of authority linking to select pages on the main site!

At this point, I could un-redirect those links that give the blog site some of its authority. I could delete the links to the main site on the blog pages. However, on some level it may have actually helped the pages linked to on the main site. The whole thing is so sketchy I wonder if I should reverse it. I could also just leave it alone and not risk hurting the pages that the blog currently links to.

What do you think? Is there a serious risk to the main site in this existing setup? The main site has hundreds of other links pointing to it, a Moz domain authority of 43, thousands of pages of content, 8 years of age, and an Open Site Explorer spam score of 1. So, not a trainwreck of sketchiness besides this issue.

To me, the weird connection for Google is that third-party sites have links that (on-page-code-wise) still point to the main site, but that resolve via the main site's redirects to the blog site. BTW, the blog site points to other established sites besides the main site, so it's not the exclusive slave to the main site.

Please let me know what you think.
Thanks!
Intermediate & Advanced SEO | 94501
-
Will blocking a country affect my ranking?
Dear Mozzers, I intend to block certain countries from viewing my website (including proxies). Will this affect my Google ranking? Thank you for your help. BR/Tran
Intermediate & Advanced SEO | SteveTran2013
-
Blocking poor quality content areas with robots.txt
I found an interesting discussion on Search Engine Roundtable where Barry Schwartz and others were discussing using robots.txt to block low-quality content areas affected by Panda: http://www.seroundtable.com/google-farmer-advice-13090.html The article is a bit dated, so I was wondering what current opinions are on this.

We have some dynamically generated content pages which we tried to improve after Panda. Resources have been limited and, alas, they are still there. Until we can officially remove them, I thought it may be a good idea to just block the entire directory. I would also remove those pages from my sitemaps and resubmit. There are links coming in, but I could redirect the important ones (I was going to do that anyway). Thoughts?
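Blocking a directory is a one-line robots.txt rule, and the Python standard library's `urllib.robotparser` can sanity-check the rule before it is deployed. A small sketch, with an illustrative directory name standing in for the real one:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rule: block the low-quality dynamic pages for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /dynamic-content/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Verify the rule does what we intend before uploading it.
blocked = not parser.can_fetch("Googlebot", "https://example.com/dynamic-content/page.php")
allowed = parser.can_fetch("Googlebot", "https://example.com/products/widget")
```

One caveat worth keeping in mind: robots.txt prevents crawling, not indexing, so URLs with inbound links can still appear in results; a noindex directive (served before the block takes effect) is the surer way to remove the pages themselves.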
Intermediate & Advanced SEO | Eric_edvisors
-
Site Search Tracking Of Non Existing Products
I am working towards optimizing the site search box of an ecommerce website, and I wish to track the keywords users are searching for that yield no results. I wish to assemble data on these searches, which would then allow me to add products that users are looking for but which the site doesn't carry.

However, my problem is that I don't know how to obtain this data in Analytics, because all of these results surface through searchresults.php. I know that analyzing search refinements and percentage of exits in Google Analytics is an option, but I want a more compact and simpler solution where I could see all the data in one place.

Does anyone have suggestions on how this can be done? Thanks in advance!
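Since every search already passes through searchresults.php, the most compact option may be to log zero-result queries server-side at that point rather than reconstructing them in Analytics. A minimal sketch of the tallying logic (function names are illustrative; in production the counter would live in a database rather than in memory):

```python
from collections import Counter

# In-memory tally for illustration; persist this in a real deployment.
zero_result_queries = Counter()

def log_search(query, result_count):
    """Call wherever search results are computed; tallies only the
    queries that returned nothing."""
    normalized = query.strip().lower()
    if result_count == 0 and normalized:
        zero_result_queries[normalized] += 1

def top_missing_products(n=10):
    """Most-requested queries with no matching products."""
    return zero_result_queries.most_common(n)
```

Reviewing `top_missing_products()` periodically gives exactly the one-place view described: the demand the catalog currently can't serve, ranked by frequency.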
Intermediate & Advanced SEO | pulseseo
-
Why do I get better results with exact-match searches?
Hi, I completed the optimization of my website, and while my rankings have improved, I seem to have consistently better results with exact-match search terms: e.g., I'm ranked better on "word1 word2" than on word1 word2 without the quotation marks. Do you have any idea why? As most people search without quotation marks, I would like to improve my rankings for those searches too. Thanks, Cedric
Intermediate & Advanced SEO | smartgrains