Submitting URLs to Bing and Google
-
Does submitting URLs to Bing and Google actually do anything? Is it worthwhile?
What I mean is intermittently submitting individual URLs after already having submitted the sitemap.
-
If your sitemap.xml file is up to date and all your links can be reached by Google's crawler, it does not make sense to submit links manually.
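For what it's worth, Bing does expose a URL Submission API for exactly this kind of one-off notification (Google retired its public "ping" endpoint, so there you would use Search Console's URL Inspection tool instead). Below is a minimal sketch of building such a request; the endpoint and JSON shape follow Bing Webmaster Tools' documentation as I understand it, and the API key is a placeholder:

```python
# Sketch: build (but don't send) a Bing URL Submission API request.
# Endpoint/payload shape assumed from Bing Webmaster Tools docs;
# "YOUR_API_KEY" is a placeholder you'd replace with a real key.
import json
from urllib.parse import urlencode

BING_ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl"

def build_submit_request(api_key, site_url, page_url):
    """Return (full_endpoint, json_body) for one URL submission."""
    endpoint = BING_ENDPOINT + "?" + urlencode({"apikey": api_key})
    body = json.dumps({"siteUrl": site_url, "url": page_url})
    return endpoint, body

endpoint, body = build_submit_request(
    "YOUR_API_KEY",
    "https://example.com",
    "https://example.com/blog/new-post")
# POST `body` to `endpoint` with Content-Type: application/json to submit.
print(endpoint)
print(body)
```

Note this only notifies the engine; it doesn't guarantee faster indexing, which matches the anecdotal answers in this thread.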
-
Thanks. I do it quite a lot, especially after updating my blog, but I wondered if it was really useful.
-
I don't believe submitting URLs is a strategy, if that is what you mean. I rarely submit URLs, but if I find or publish a new page that I want crawled as quickly as possible, then I will submit it. I certainly don't have any proof of how much this has helped over the years, but I do believe it notifies Google and Bing to crawl.
-
Yes, it helps them get crawled quicker.
It's like telling them, "please go here and find this URL," as opposed to sitting around doing nothing and crying weeks later, wondering why your website has not been found yet.
I see you edited your post after my response.
See Brad's answer now.
Related Questions
-
Does Plus Sign "+" in url affect SEO and Ranking?
There are customized pages on the client's site; they contain brand pages for Samsung, iPhone, ZTE, LG, Motorola, and HTC mobile phones.
On-Page Optimization | dietsuave
For Example:
https://www.unlockninja.com/unlock-apple+phone
https://www.unlockninja.com/unlock-zte+phone
https://www.unlockninja.com/unlock-samsung+phone
Should I recommend that they change the URL structure?
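Whether that plus sign matters depends partly on where it sits: in a URL path a literal "+" stays a plus sign, while in a query string it traditionally decodes to a space. Python's standard library illustrates the two interpretations, using one of the example slugs as a sketch:

```python
from urllib.parse import unquote, unquote_plus

slug = "unlock-apple+phone"

# Path rules: "+" is just a plus sign; nothing is decoded.
print(unquote(slug))       # unlock-apple+phone

# Query-string rules: "+" encodes a space.
print(unquote_plus(slug))  # unlock-apple phone
```

Since the client's "+" lives in the path, crawlers should treat it literally; the bigger question is consistency, and whether hyphens would be cleaner for readability.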
Fetch as Google is showing this, help!
Our Fetch as Google in Google Webmaster Tools is showing this. What is this? Thanks! https://imgur.com/k6KOQZz
On-Page Optimization | bluejay7878
Google search returns blog homepage, but not article
When I do a Google search for a specific article on our blog, the search results only return the blog homepage, with the article title shown in the meta description, but never the actual article page. I've tried to refine my search by using site: and quotation marks around the article title (e.g. site:www.example.com "article title") but still only get the homepage. Our blog is showing up, so I assume it's not an indexing issue, but I'm not sure how to get the article pages to show in SERPs. Any ideas? Thanks!
On-Page Optimization | STP_SEO
Google Crawl Errors from vbseo change
We have vBSEO set up on our site, and for some reason a setting was changed unexpectedly and went unnoticed. It changed the URLs of all the pages, so none of our pages were being indexed by Google any longer due to 401 errors, and most of our search engine traffic fell off. We discovered the issue a couple of weeks ago and changed the setting back so that the URLs are the same as they were originally, but Google Webmaster Tools is still showing crawl errors and our search engine traffic hasn't recovered at all. We have sitemaps being sent daily.
On-Page Optimization | RudySF
Similar URLs
I'm making a site of LSAT explanations. The content is very meaningful for LSAT students; I'm less sure the URLs and headings are meaningful for Google. I'll give you an example. Here are the URLs and headings for two separate pages: http://lsathacks.com/explanations/lsat-69/logical-reasoning-1/q-10/ - LSAT 69, Logical Reasoning I, Q 10 http://lsathacks.com/explanations/lsat-69/logical-reasoning-2/q10/ - LSAT 69, Logical Reasoning II, Q10 There are two logical reasoning sections on LSAT 69: the first URL is for question 10 from section 1, and the second URL is for question 10 from the second LR section. I noticed that google.com only displays 23 URLs when I search "site:http://lsathacks.com". A couple of days ago it displayed over 120 (i.e. the entire site). 1. Am I hurting myself with this structure, even if it makes sense for users? 2. What could I do to avoid it? I'll eventually have thousands of pages of explanations. They'll all be very similar in terms of how I would categorize them to a human, e.g. "LSAT 52, logic games question 12". I should note that the content of each page is very different, but the URL, title, and h1 are similar. Edit: I could, for example, add a keyword to differentiate titles and URLs (but not the H1). For example: http://lsathacks.com/explanations/lsat-69/logical-reasoning-2/q10-car-efficiency/ - LSAT 69, Logical Reasoning II, Q10, Car efficiency. But the URL is already fairly long as is. Would that be a good idea?
On-Page Optimization | graemeblake
Errors in URLs
SEOmoz is showing quite a lot of URL errors like this: http://trampoliny.net.pl/akcesoria/pokrowiec-basic?frontend=1825cb1eea3af8ee6ee2d96617d32ff6 All these URLs use the parameter "?frontend=". In Webmaster Tools we told Google not to index this parameter. Unfortunately, at the moment we cannot set this parameter to "NOINDEX", and we also don't want to use a robots.txt file. How do we get rid of these URLs in SEOmoz?
On-Page Optimization | drgoodcat
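One common way to keep such parameterized duplicates out of crawl reports and indexes is to point a rel="canonical" at the parameter-free URL. Here is a small sketch of deriving that clean URL with Python's standard library; `strip_param` is a hypothetical helper, not part of any SEOmoz or crawler API:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_param(url, param):
    """Return `url` with one query parameter removed, keeping the rest."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunsplit(parts._replace(query=urlencode(kept)))

clean = strip_param(
    "http://trampoliny.net.pl/akcesoria/pokrowiec-basic"
    "?frontend=1825cb1eea3af8ee6ee2d96617d32ff6",
    "frontend")
print(clean)  # http://trampoliny.net.pl/akcesoria/pokrowiec-basic
```

Emitting that clean URL in a `<link rel="canonical">` tag on each parameterized page tells crawlers (and crawl tools) which version to count.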
How to Resolve Google Crawling Issues for My eCommerce Website?
I want to resolve Google crawling issues for my eCommerce website: http://www.vistastores.com/ Google has crawled only 97 webpages from my website, even though the site is quite old (more than 6 months). I created a campaign in the SEOmoz tool and found some errors there, so I assumed that was why Google had not crawled my website. But then I created another campaign for a competitor's website to check its actual status, and found that the competitor's site has more errors than mine, yet Google has crawled far more of its pages. What is the reason behind this? How can I improve my crawl rate and get the maximum number of webpages indexed by Google?
On-Page Optimization | CommercePundit
Absolute URLs
Hi, this is a very basic question, but I want to confirm: I remember it was considered good practice to use the absolute version of your links when linking to other pages of your site, not for any issue related to passing authority or PageRank, but because if someone scrapes your content they would take the links as well (assuming they don't remove them). Have the practices for internal linking with absolute or relative URLs changed in any way? Which is the best way, absolute or relative? Is there any harm in using the relative version? Relative: /myfolder/mypage.html Absolute: http://www.cheapdomain.com/myfolder/mypage.html Thanks!
On-Page Optimization | andresgmontero
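If it helps to see the two forms side by side, Python's urllib can resolve a relative href against the page it appears on, which is what browsers (and well-behaved scrapers) do. A quick sketch using the example domain from the question; the `index.html` page URL is an assumption for illustration:

```python
from urllib.parse import urljoin

# Hypothetical page on which the links appear.
page = "http://www.cheapdomain.com/myfolder/index.html"

# A relative href is resolved against the current page...
print(urljoin(page, "mypage.html"))
# http://www.cheapdomain.com/myfolder/mypage.html

# ...and a root-relative href is resolved against the site root.
print(urljoin(page, "/myfolder/mypage.html"))
# http://www.cheapdomain.com/myfolder/mypage.html
```

A scraper that copies the raw HTML verbatim keeps absolute hrefs pointing back at your domain, which is the behavior the question describes; relative hrefs would instead resolve against the scraper's domain.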