Facing issue related to my website.
-
I am facing an issue with my website, https://mylovelylawn.com/. Some pages have not been indexed by Google yet. Can anyone help me figure out how to get those pages indexed so I can improve my rankings?
-
@qwaswd hi,
Connect your site to Google Search Console. It can take up to 48 hours for Google to start indexing your pages.
-
Hello!
Well, there is a lot you can do to shorten the time Google takes to index your URLs.
-
Before anything else, you should submit your site's XML sitemap in Google Search Console; it helps Google crawl your site.
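For reference, a minimal XML sitemap looks like the sketch below. The paths and dates are placeholders, not actual pages from the site; host the file somewhere like `https://mylovelylawn.com/sitemap.xml` and submit that URL under "Sitemaps" in Search Console:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to crawl -->
  <url>
    <loc>https://mylovelylawn.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://mylovelylawn.com/services/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```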
-
You can also use Google Search Console's URL Inspection tool: paste in a URL from your site and request indexing for it.
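URL Inspection will also flag pages that carry a noindex signal, which is a common reason pages never get indexed. As a quick local check, here is a minimal Python sketch (the helper names are my own, not any Google API) that scans a page's HTML and response headers for a noindex directive:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Detect <meta name="robots" content="...noindex..."> tags."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            name = (a.get("name") or "").lower()
            content = (a.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True


def is_indexable(html_text: str, headers: dict) -> bool:
    """Return False if the headers or HTML carry a noindex signal."""
    # noindex can be sent as an HTTP header instead of a meta tag
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return False
    parser = RobotsMetaParser()
    parser.feed(html_text)
    return not parser.noindex


blocked = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_indexable(blocked, {}))                          # False
print(is_indexable("<html></html>", {}))                  # True
print(is_indexable("<html></html>", {"X-Robots-Tag": "noindex"}))  # False
```

In practice you would fetch the live page (e.g. with `urllib.request`) and pass its body and headers to `is_indexable`; the hardcoded strings above just make the sketch self-contained.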
-
Last but not least, it's important to double-check the internal linking on your site. Done right, it significantly shortens the time Google takes to index new URLs.
Tip: Link to the new URL from an existing, already-indexed page on your site, as long as the link looks natural rather than forced.
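Concretely, an internal link is just a plain anchor placed in relevant body copy on a page Google already knows about (the URLs and anchor text below are placeholders):

```html
<!-- On an already-indexed page, e.g. a services overview -->
<p>
  We also offer seasonal treatments; see our new
  <a href="https://mylovelylawn.com/lawn-aeration/">lawn aeration guide</a>
  for details.
</p>
```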
-
@bensteel said in Facing issue related to my website.:
Google provides an option called URL Inspection, and with it you can ask Google to index your current article on a priority basis. Within a few hours Google will index that article. The other way is to increase the indexing rate from the Google Search Console settings.
Can you explain how I can increase the indexing rate using Google Search Console, and where I can find that setting for my website: https://tinyurl.com/kb7a2xc7 ? I have had a lot of trouble with slow indexing; my competitor ranks easily for new keywords because Google indexes his articles very rapidly...
-
@qwaswd said in Facing issue related to my website.:
I am facing an issue with my website, https://mylovelylawn.com/. Some pages have not been indexed by Google yet. Can anyone help me figure out how to get those pages indexed so I can improve my rankings?
Google provides an option called URL Inspection, and with it you can ask Google to index your current article on a priority basis. Within a few hours Google will index that article. The other way is to increase the indexing rate from the Google Search Console settings.
-
Hi qwaswd,
This appears to be a similar question to what you have in another thread. Have you tried connecting your website to Google Search Console and submitting an XML sitemap?
Best,
Zack