Search box within search results question
-
I work for a theater news company. We have two sister sites, theatermania.com in the US and whatsonstage.com in London.
Both sites share largely the same codebase and page layouts. We've implemented the markup that allows Google to show a search box for our site on its results page.
For some reason, the search box is showing for one site but not the other: http://screencast.com/t/CSA62NT8
We're scratching our heads. Does anyone have any ideas?
-
Depending on how long ago you added the search box feature, it could well be that Google has recently recrawled one site but not the other. Since crawl timing is more or less out of your control, I'd suggest going through the sitelinks search box markup on both sites and making sure it's 100% accurate on each.
Beyond that, I can't think of a reason why the search box would appear for one site but not the other.
Hope that helps!
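For anyone comparing implementations: the markup Google documents for the sitelinks search box is a `WebSite` + `SearchAction` JSON-LD block along these lines (a sketch with placeholder URLs; the `target` pattern must match your site's real search-results URL, and its structure should be identical on both sites):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://www.example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>
```

Diffing the two sites' blocks side by side, and running each through Google's structured data testing tool, is a quick way to spot a discrepancy between them.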
Related Questions
-
Mapping ALL search data for a broad topic
Hi all, As our company becomes a bigger and bigger entity, I'm trying to figure out how I can create more autonomy. One of the key areas that needs fixing is briefing the writers on articles based on keywords. We're not just trying to go after the low-hanging fruit or the big-money keywords but to comprehensively cover every topic and provide genuinely good-quality, up-to-date info (surprisingly rare in a competitive niche), eventually covering pretty much every topic there is. We generally work on a three-tier system at the folder level: topics and then sub-topics. The challenge is getting an agency to: a) be able to pull all of the data without being knowledgeable in our specific industry (we're specialists and thus target people that need specialist expertise, as well as the more mainstream stuff that run-of-the-mill people wouldn't know about); b) know where it all fits topically, as we organise the content on a hierarchy basis and generally cover multiple smaller topics within articles. Am I asking for the impossible here? It's the one area of the business I feel most nervous about creating autonomy in. Can we be as extensive and comprehensive as a wiki-type website without somebody within the business who knows it providing the keyword research? I did a search for all data using the main two seed keywords for this subject on Ahrefs and it came up with 168,000 rows of spreadsheet data, which went way beyond the maximum I was allowed to export. Interested in feedback, and if any agencies are up for the challenge, do let me know! I've been using Moz Pro for a long time but have never posted, and I apologise if I'm explaining this badly. Requirements: keywords to cover all (broad niche) related queries in the UK, with no relevant UK (broad niche) keywords missed; organised in a way that can be interpreted as article briefs and folder-structure instructions.
Questions: How would you ensure you cover every single keyword? Assuming no specialist X knowledge, how will you be able to map content and know which search queries belong in which topics, and in what order? Also, where there is keyword leakage from other regions, how will you know which are UK terms and which aren't? With minimal X knowledge, how will you know whether you've missed an opportunity (what you don't know you don't know)? What specific resources will you require from us for this to work? What format will the data be provided in, and how will you present the finished work so that it can be turned into article briefs?
Intermediate & Advanced SEO | d.bird
-
Search console site verification
I've been going on the assumption that when verifying a website in Search Console, it's always good to register and verify all variants of the site URL: http, https, www, non-www. However, if you create redirects to the preferred URL, is it really necessary to register/verify the other three? If so, why?
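One part of the "why" is that verifying every variant lets you see data reported against stray URLs before the redirects catch them. Separately, the redirects themselves should collapse all four variants onto one preferred form, which can be sketched like this (placeholder domain; swap in your own preferred host):

```python
from urllib.parse import urlsplit, urlunsplit

# Placeholder: the single host all variants should redirect to.
PREFERRED_HOST = "www.example.com"

def redirect_target(url: str) -> str:
    """Where a site-wide 301 should send any protocol/host variant:
    the same path and query on the one preferred https://www origin."""
    parts = urlsplit(url)
    return urlunsplit(("https", PREFERRED_HOST, parts.path or "/", parts.query, ""))

variants = [
    "http://example.com/tickets?show=hamlet",
    "http://www.example.com/tickets?show=hamlet",
    "https://example.com/tickets?show=hamlet",
]
for v in variants:
    print(redirect_target(v))  # all three print the same preferred URL
```

If every variant resolves to the same final URL like this, registering the other three properties is mainly about monitoring, not ranking.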
Intermediate & Advanced SEO | muzzmoz
-
Slideshare - Links within
Hi guys, I am going to be putting some PowerPoint presentations up over time, and I have a couple of questions regarding SlideShare. If I add links to the SlideShare deck, are these crawlable by Google etc.? If I place the PowerPoint presentation on both our website and SlideShare, would this be counterproductive, i.e. duplicate content? Would love to hear your suggestions.
Intermediate & Advanced SEO | Cocoonfxmedia
-
Crawled page count in Search console
Hi guys, I'm working on a project (premium-hookahs.nl) where I've stumbled upon a situation I can't address. Attached is a screenshot of the crawled pages in Search Console. History: due to technical difficulties, this webshop didn't always noindex filter pages, resulting in thousands of duplicated pages. In reality this webshop has fewer than 1,000 individual pages. At this point we took the following steps to resolve this: noindex the filter pages; exclude those filter pages in Search Console and robots.txt; canonical the filter pages to the relevant category pages. This, however, didn't result in Google crawling fewer pages. Although the implementation wasn't always sound (technical problems during updates), I'm sure this setup has been the same for the last two weeks. Personally, I expected a drop in crawled pages, but they are still sky-high. I can't imagine Google visits this site 40 times a day. To complicate the situation: we're running an experiment to gain positions on around 250 long-tail searches. A few filters will be indexed (size, color, number of hoses, and flavors) and three of them can be combined, resulting in around 250 extra pages. Meta titles, descriptions, h1s, and texts are unique as well. Questions: Excluding in robots.txt should result in Google not crawling those pages, right? Is this number of crawled pages normal for a website with around 1,000 unique pages? What am I missing?
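On the first question: yes, a matching Disallow rule stops compliant crawlers from fetching those URLs, and you can sanity-check a rule locally with Python's stdlib robots.txt parser (the rule and URLs below are hypothetical, and note the stdlib parser only does prefix matching, not `*` wildcards). The catch is that a robots.txt-blocked page can no longer be crawled at all, so Google never sees the noindex or canonical you placed on it, and already-known URLs can linger in the index:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block a filter subfolder, allow everything else.
rules = """\
User-agent: *
Disallow: /filter/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Blocked: crawlers may not fetch filter pages...
print(rp.can_fetch("Googlebot", "https://example.com/filter/color-blue"))
# ...but ordinary category pages remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/waterpipes"))
```

Because of that interaction, the usual advice is to pick one mechanism per URL set: noindex/canonical for pages Google should still crawl, robots.txt only for pages it should never request.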
Intermediate & Advanced SEO | Bob_van_Biezen
-
Is un-searched content worth writing?
Hi, is every post you write on your site SERP-worthy? I'll give an example: we often cover industry-related news items. They are written very well, with personal opinions, comments, and detailed explanations. Our readers find them interesting and "like" and "plus" them. However, these items will never appear in the SERPs, simply because they won't be searched for. Needless to say, these are not evergreen pieces. If by chance one lands on a subject that may be searched for in the future, it usually still won't appear, because that means the item was also covered by major sites like CNN, Forbes, Bloomberg, etc. Is it worth our time to keep "investing" in these types of articles? Thanks
Intermediate & Advanced SEO | BeytzNet
-
Sitemap Folders on Search Results
Hello! We are managing the SEO campaign of a video website and have an issue with sitemap folders. I have sitemaps like /xml/sitemap-name.xml, but Google is indexing my /xml/ folder and the sitemaps themselves, and they appear in search results. If I add Disallow: /xml/ to my robots.txt and remove the /xml/ folder from Webmaster Tools, will Google still be able to see my sitemaps, or will it ignore them? Will my site be negatively affected after removing the /xml/ folder completely from search results? What should I do?
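One approach that sidesteps the trade-off: a Disallow: /xml/ would likely also block Google from fetching the sitemap files themselves, whereas an `X-Robots-Tag: noindex` response header keeps them fetchable but out of the search results. A sketch for Apache (assumes mod_headers is enabled; adjust the filename pattern to match your sitemaps):

```apache
<Files ~ "sitemap.*\.xml$">
  Header set X-Robots-Tag "noindex"
</Files>
```

With this in place, the sitemaps can stay submitted in Webmaster Tools and keep doing their job, while the /xml/ URLs drop out of the results.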
Intermediate & Advanced SEO | roipublic
-
Duplicate Content/ Indexing Question
I have a real estate WordPress site that uses an IDX provider to add real estate listings to my site. A new page is created as a new property comes to market, and then the page is deleted when the property is sold. I like the functionality of the service, but it creates a significant number of 404s, and I'm also concerned about duplicate content, because anyone else using the same service here in Las Vegas will have thousands of the exact same property pages that I do. Any thoughts on this, and is there a way I can have the search engines index only the core 20 pages of my site and ignore future property pages? Your advice is greatly appreciated. See link for example: http://www.mylvcondosales.com/mandarin-las-vegas/
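Rather than trying to enumerate the core 20 pages, the usual lever is to let the IDX pages exist but mark them noindex, so search engines ignore them while still following their links. A sketch of the tag that would go in the property-page template (how you hook into the template depends on your IDX plugin, so treat this as an assumption about where it goes):

```html
<!-- On every auto-generated IDX property page: -->
<meta name="robots" content="noindex, follow">
```

Once the property pages are noindexed, the duplicate-content concern largely goes away, and the 404s from sold listings matter less too, since those URLs were never in the index.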
Intermediate & Advanced SEO | AnthonyLasVegas
-
Keyword Ranking Question
I have recently hired an SEO company to help with our keywords. My question is: what are the best tools to verify what they are reporting? I can do an unpersonalized search, but I am likely still getting my local results. I have been using the SEOmoz rank tracker in the past, but for some reason it has not been able to retrieve results over the past day or so. Are there any other good tools to check the ranking of an exact URL with non-localized, non-personalized results? Thanks for the suggestions.
Intermediate & Advanced SEO | fertilityhealth