Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How to fix non-crawlable pages affected by CSS modals?
-
I stumbled across something new while doing a site audit in SEMrush today: modals.
The case: several pages could not be crawled because of (modal:) in the URL.
What I know: a modal is a dialog box/popup window displayed on top of the current page, built with CSS and JavaScript.
What I don't know: how to prevent crawlers from finding these modal URLs.
-
Hi Dan-Louis – it all depends on what content these modals are serving, and how they are being served.
- Have you checked to make sure that some resource your modals rely on isn't blocked in /robots.txt? (You can also plug a URL with a modal into Search Console's robots.txt checker, or script the check as in the sketch after this list.) What about meta robots noindex?
- On mobile, do you have pop ups that are obscuring content on the page?
- Have you checked any pages with these pop ups/modals in Search Console's fetch and render tool? What resources, if any, are blocked?
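If you'd rather script that robots.txt check than test URLs one by one, something along these lines works. It's only a rough Python sketch: the domain, the modal-style URL, and the resource paths are placeholders, so swap in your own.

```python
# Rough sketch: check whether modal URLs, or the resources the modals rely on,
# are blocked by robots.txt. The domain and all paths below are placeholders.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"
USER_AGENT = "Googlebot"

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetches and parses the live robots.txt

urls_to_check = [
    f"{SITE}/products/some-page(modal:quick-view)",  # hypothetical modal-style URL
    f"{SITE}/assets/js/modal.js",                    # hypothetical JS the modal needs
    f"{SITE}/assets/css/modal.css",                  # hypothetical CSS the modal needs
]

for url in urls_to_check:
    status = "allowed" if parser.can_fetch(USER_AGENT, url) else "BLOCKED"
    print(f"{status:8} {url}")
```

If any of the JS/CSS the modals depend on comes back BLOCKED, that would go a long way toward explaining why the rendered pages look incomplete to a crawler.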
TBH I don't like relying on SEMrush (or any single SEO tool, for that matter) for identifying technical SEO problems. When was the last time you performed a tech audit?
Keep me posted on what you uncover...
Related Questions
Do things like using labels on an element that is not a form input affect how Google sees us with regard to accessibility?
Do things like using labels on an element that is not a form input affect how Google sees us? It's an accessibility error that our devs have made: using a label element because it looks good, not because it's an actual label on a form field. Just wondering how that affects accessibility in Google's eyes.
Web Design | GregLB
HTTPS pages indexed but all web pages are HTTP - please can you offer some help?
Dear Moz Community, please could you see what you think and offer some definite steps or advice?

I contacted the host provider and his initial thought was that WordPress was causing the HTTPS problem: e.g. when an HTTPS version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website can allow pages to load over HTTPS. The host said that there is no active, configured SSL; it's just waiting as part of the hosting package, just in case. But I found that the SSL certificate is still showing up during a crawl.

It's important to eliminate the HTTPS problem before external backlinks link to any of the unwanted HTTPS pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been the HTTP version.

I checked a few more URLs to see if it's necessary to create a permanent redirect from HTTPS to HTTP. For example, I tried requesting domain.co.uk using the https:// prefix, and the https:// page loaded instead of redirecting automatically to the http:// version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be: search engines and visitors will stay on the HTTP version of the site and not get lost anywhere in HTTPS, which also helps to eliminate duplicate content and preserve link juice. What are your thoughts regarding that?

As I understand it, most server configurations should redirect by default when HTTPS isn't configured, and from my experience I've seen cases where pages requested via HTTPS return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this.

One suggestion would be to disable all HTTPS, since there is no need to have any traces of SSL when the site is crawled. I don't want to enable HTTPS in the .htaccess only to then create an HTTPS-to-HTTP rewrite rule; HTTPS shouldn't even be a crawlable function of the site at all. So the options seem to be either a rule along the lines of

RewriteEngine On
RewriteCond %{HTTPS} off

or disabling the SSL completely for now, until it becomes a necessity for the website.

I would really welcome your thoughts, as I'm really stuck as to what to do for the best, short term and long term. Kind regards
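A quick way to test the redirect behaviour across several URLs at once is a short script along these lines. This is only a sketch; the domain and paths are placeholders for your own.

```python
# Rough sketch: request the https:// version of a few pages and report whether
# each one redirects to http://. The domain and paths are placeholders.
import urllib.request

URLS = [
    "https://www.example.co.uk/",
    "https://www.example.co.uk/about/",
]

for url in URLS:
    with urllib.request.urlopen(url) as response:  # redirects are followed automatically
        final_url = response.geturl()              # the URL we actually ended up on
    if final_url.startswith("http://"):
        print(f"OK   {url} redirects to {final_url}")
    else:
        print(f"WARN {url} is still served over https (final URL: {final_url})")
```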
Web Design | SEOguy1
Drop Down Menus and Crawlability
Hello, we are working on a complete site redesign. One of the mock-ups being reviewed is of a page that encompasses an entire category of products, but the only way the user can see the products is to fill out several drop-down menus; a subset of products that match those criteria will then appear. Once that list appears, the user can click on each of the products and be taken to the product page.

I'm concerned that this layout will pose a crawlability issue, since click activity and drop-down menus have always been a problem for bots in the past. Has anything changed? Will the bot be able to follow the links to these product pages if it can't see them because it can't fill out the form? Also, depending on the functionality of this 'form', I'm assuming the product listing will be populated dynamically and pulled from another source, which means the product links will not live in the HTML of the page and hence cannot be crawled. Does anyone know how this is normally handled? Do the actual results usually live elsewhere, or do they live in the HTML of that page? Any thoughts or clarity around this would be appreciated.
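One rough way to see what a crawler gets before any JavaScript runs is to pull the raw HTML of a category page and look for the product links in the source. A minimal Python sketch, with a placeholder URL and an assumed /products/ link pattern:

```python
# Rough sketch: fetch the raw (pre-JavaScript) HTML of a category page and list
# the product links present in the source. The URL and link pattern are placeholders.
import re
import urllib.request

CATEGORY_URL = "https://www.example.com/category/widgets"

with urllib.request.urlopen(CATEGORY_URL) as response:
    html = response.read().decode("utf-8", errors="replace")

product_links = re.findall(r'href="(/products/[^"]+)"', html)  # assumed URL pattern
print(f"{len(product_links)} product links found in the raw HTML")
for link in sorted(set(product_links)):
    print(" ", link)
```

If that count comes back as zero while the products are clearly visible in a browser, the links are being injected client-side, which is exactly the crawlability concern described above.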
Web Design | Colbys
Multi-page articles, pagination, best practice...
A couple of months ago we migrated a 12-year-old site, about 2,000 pages, to WordPress. The transition was smooth (301 redirects) and we haven't lost much search juice. We have about 75 multi-page articles (posts); we're using a plugin (Organize Series) to manage the pagination. On the old site, all of the pages in a series had the same title. I've since heard this is not a good SEO practice (duplicate titles). The URLs were the same too, with a number (designating the page number) appended to the title text. Here are my questions:

1. Is there a best practice for titles and URLs of multi-page articles? Let's say we have an article named 'This is an Article'. What if I name the pages like this:
-- This is an Article, Page 1
-- This is an Article, Page 2
-- This is an Article, Page 3
Is that a good idea? Or should each page have a completely different title? Does it matter? I think for usability the examples above are best; they give the reader context. What about URLs? Are these a good idea: /this-is-an-article-01, /this-is-an-article-02, and so on? Does it matter?

2. I've read that maybe multi-page articles are not such a good idea, from both usability and SEO standpoints. We tend to limit our articles to about 800 words per page. So is it better to publish long articles instead of multi-page ones? Does it matter? I think I'm seeing a trend on content sites toward long, one-page articles.

3. Any other gotchas we should be aware of, related to SEO and multi-page articles?

Long post... we've gone back and forth on this a couple of times and need to get it settled. Thanks much! Jim
Web Design | jmueller0823
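For illustration, the ", Page N" title and numbered-slug pattern described in the question above could be generated like this (a minimal Python sketch using the example article name):

```python
# Rough sketch of the ", Page N" title and numbered-slug pattern from the question.
BASE_TITLE = "This is an Article"
BASE_SLUG = "this-is-an-article"

for n in range(1, 4):
    title = f"{BASE_TITLE}, Page {n}"  # e.g. "This is an Article, Page 2"
    slug = f"/{BASE_SLUG}-{n:02d}"     # e.g. "/this-is-an-article-02"
    print(f"{title}  ->  {slug}")
```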
How to make sure category pages rank higher than product pages?
Hi, this question is e-commerce related. We have product categories dividing products by color. Let's say we have the category 'blue toy cars' and a product called 'blue toy car racer'; both of these could rank for the keyword 'blue toy car'. How do we make sure the category 'blue toy cars' ranks above the product 'blue toy car racer'? Or is the category page automatically ranked higher because of its higher page authority? Alex
Web Design | WebmasterAlex
Decreasing Page Load Time with Placeholder Images - Good Idea or Bad Idea?
In an effort to decrease our page load time, we are looking at making a change so that all product images on any page past page 1 load with a placeholder image. When the user clicks to the next page, it then loads all of the images for that page.

Right now, all of the product divs are loaded into a JavaScript array and loaded in chunks into the page display div. Product-heavy pages significantly increase load time, as the browser loads all of the images from the product HTML before the JavaScript can rewrite the display div with page-specific product HTML. To get around this, we are looking at loading the product HTML with a small placeholder image and then substituting the appropriate product image URLs when each page is output to the display div.

From a user-experience standpoint, this change will be seamless; users won't be able to tell the difference, and they will potentially benefit from a shorter wait while the images for the page in question load. However, the page source will then show every product image on a given category page using the same placeholder image. How much of a negative impact will this have on SEO?
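As a rough illustration of the approach described above, one common variant keeps the real image URL in a data-src attribute so it still appears in the product markup, while the visible src points at the shared placeholder. A minimal Python sketch of emitting that markup (the paths and names are made up):

```python
# Rough sketch: emit product <img> tags that use a shared placeholder as src
# while keeping the real image URL in data-src for the client-side swap.
PLACEHOLDER = "/images/placeholder.gif"  # assumed placeholder path

def product_img_tag(real_src: str, alt_text: str) -> str:
    return f'<img src="{PLACEHOLDER}" data-src="{real_src}" alt="{alt_text}">'

print(product_img_tag("/images/products/red-widget.jpg", "Red widget"))
```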
Web Design | airnwater
I have a button that repeats itself many times on the same page. What can I do so the button name does not affect my SEO?
I have a shopping cart button named "Add to cart", but it repeats on many pages of my website. Is this affecting my SEO? If so, what should I do so it doesn't? Should the button only appear on hover? Thanks
Web Design | SeMeAntoja
How would restructuring the navigation of my website affect my rankings?
I want to restructure the navigation of my website for a few reasons:
1. It isn't intuitive/clear to the user.
2. It is way too big; it has so many links that the number of links on many pages exceeds 100.
3. I want to get rid of file extensions (.html, .php) in the URLs.
4. I want to achieve a "tree"-like navigation system, with categories, subcategories and so on.

In the process of cleaning up my website, I had to 301 redirect a lot of duplicate pages, fix broken links, etc. I already have a lot of 301 redirects, and in restructuring the navigation I know I'm going to get more. Will the addition of new 301 redirects have an effect on my rankings? (I'm basically going to be changing all of the URLs.) What kind of SEO effect will restructuring the navigation at the top of the page (reducing the number of links in the main menu) have on my site? What is the best strategy to implement in this situation?
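For the file-extension point above, one way to sanity-check the old-to-new URL mapping before wiring up the 301s is a quick script like this (only a sketch; the example paths are made up):

```python
# Rough sketch: derive extensionless URLs from old .html/.php paths to build a
# 301 redirect map. The example paths are placeholders.
import re

old_paths = ["/services.html", "/about-us.php", "/products/widgets.html"]

redirect_map = {old: re.sub(r"\.(html|php)$", "", old) for old in old_paths}

for old, new in redirect_map.items():
    print(f"301: {old} -> {new}")
```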
Web Design | deuce1s