Is there a way to get a list of URLs on a website?
-
For example, www.laskeimages.com
Outside of Google Search Console, is there another way?
-
You can run a "site:yourwebsite.com" search on Google and see what is returned. Based on the results, you can run further "site:" searches with additional parameters to see more of the URLs in the index.
A plain "site:yourwebsite.com" search also gives you an idea of how many pages are in the index, but the reported count is not 100% accurate.
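If you would rather collect those results with a script than page through them by hand, one option is Google's Custom Search JSON API. This is a minimal sketch, assuming you have created a Programmable Search Engine and an API key (both are placeholders below); note that the API returns at most 100 results per query, so this samples the index rather than exhausting it:

```python
import requests

API_KEY = "YOUR_API_KEY"      # assumption: a Custom Search JSON API key
CX = "YOUR_ENGINE_ID"         # assumption: a Programmable Search Engine ID

def site_search_urls(domain, max_results=100):
    """Collect indexed URLs for a domain via a site: query.

    The API caps results at 100 per query, so this is a sample of the
    index, not an exhaustive list.
    """
    urls = []
    for start in range(1, max_results + 1, 10):  # the API pages 10 results at a time
        resp = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": CX, "q": f"site:{domain}", "start": start},
            timeout=10,
        )
        resp.raise_for_status()
        items = resp.json().get("items", [])
        if not items:
            break
        urls.extend(item["link"] for item in items)
    return urls

print("\n".join(site_search_urls("laskeimages.com")))
```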
-
slickplan.com allows you to pull the entire sitemap of your site and extract links, descriptions, etc.
-
If all of the pages you are interested in are linked internally from somewhere on your site and can be reached through navigation or page links, you can run a simulated crawl with a tool like Screaming Frog, which will discover all of the "discoverable" pages.
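For illustration, here is a minimal sketch of that idea in Python, standing in for what a crawler like Screaming Frog does: fetch a page, queue the same-host links on it, and repeat. It assumes the requests and beautifulsoup4 packages and skips things a real crawler handles (robots.txt, JavaScript-rendered links, crawl delays):

```python
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=500):
    """Breadth-first crawl of same-host links reachable from start_url.

    The max_pages cap is approximate: all links on the last fetched
    page are still recorded.
    """
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    while queue and len(seen) <= max_pages:
        url = queue.pop(0)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip unreachable pages
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue  # only parse HTML responses
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]  # resolve relative URLs, drop fragments
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return sorted(seen)

for page in crawl("https://www.laskeimages.com/"):
    print(page)
```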
The site you referenced is built with a platform called "Good Gallery", which generates a sitemap at www.laskeimages.com/sitemap.xml. I'm not sure what criteria it uses to include or exclude pages, but that is likely to be a good list. You will need to view the page source to see the data in a structured way and extract it.
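Extracting the URL list from a sitemap like that is easy to script. This sketch uses only the Python standard library and assumes a plain urlset sitemap; a sitemap index file that points to child sitemaps would need one extra level of recursion:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Standard namespace defined by the sitemap protocol
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    """Return all <loc> entries from a standard urlset sitemap."""
    with urllib.request.urlopen(sitemap_url, timeout=10) as resp:
        root = ET.fromstring(resp.read())
    return [loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS)]

for url in sitemap_urls("https://www.laskeimages.com/sitemap.xml"):
    print(url)
```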
Another method is to use Google Analytics. Assuming each page of your site has been viewed at least once in its history, you could extract the list of pages from Google Analytics, ideally from an unfiltered view that includes visits by bots.
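That extraction can also be scripted. The sketch below uses the GA4 Data API (google-analytics-data package) with a placeholder property ID and application-default credentials, purely as an illustration; an older Universal Analytics view, of the kind this answer refers to, would use the older Reporting API instead:

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

PROPERTY_ID = "123456789"  # placeholder: your GA4 property ID

def viewed_page_paths():
    """Return every page path recorded for the property over the date range."""
    client = BetaAnalyticsDataClient()  # assumes application-default credentials
    request = RunReportRequest(
        property=f"properties/{PROPERTY_ID}",
        dimensions=[Dimension(name="pagePath")],
        metrics=[Metric(name="screenPageViews")],
        date_ranges=[DateRange(start_date="2015-01-01", end_date="today")],
        limit=100000,
    )
    response = client.run_report(request)
    return [row.dimension_values[0].value for row in response.rows]

for path in viewed_page_paths():
    print(path)
```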
Related Questions
-
Rankings preferring English URL
We've recently had a redesign of our website, and we have both a Dutch and an English version. However, in Moz, for both NL and BE-NL, it seems to favor the English URLs. This never used to be the case, and I'm wondering why it's happening and whether it could actually be hurting our SEO, since search engines tend to favor local languages for search queries.
Local SEO | Billywig
-
How should I get maximum traffic to my website?
Hello dear Moz friends, I am new to this forum and just want to ask how I should get maximum traffic to my website. I have done everything correctly, but I am not getting good traffic: I have posted on forums and post regularly to Facebook and G+, but I am still not getting good visitors to my website. Please suggest some good traffic-generation sources. Thank you.
Local SEO | falguniwpi
-
302 redirection from .com to .in: Google is indexing both URLs
Hello fellow members, I am sharing a problem I am facing with a client that is another division of my company (I am treating them as a client). Please recommend a foolproof solution. My client runs a fashion e-commerce site in India on a .com domain, but after two years they decided that in India only the .in site would run, with INR prices, and the .com site would serve visitors outside India with "$" prices. Now, if someone in India reaches the .com domain, it is 302-redirected to the .in domain. Only the .in site is served in India and only the .com site outside, but Google is indexing pages of both sites: 5 lakh+ (500,000+) pages are indexed from the .com domain and only 2,600 from the .in domain. The content of the two sites is almost 95% the same. I have already recommended putting a rel=canonical tag on both sites, but that is not a permanent solution. They started the .in domain only to separate the INR and "$" prices. Can you recommend the best possible way to solve this issue?
Local SEO | sourabhrana
-
Google cache is showing the wrong URL with ccTLDs
Hi folks, at Lightspeed we decided to set up local websites with ccTLDs. At the moment we are having issues with the Google cache, and I'm not sure what's going wrong. For example, if I check the Google cache of www.lightspeedhq.be in the Belgian Google, it refers to www.lightspeedhq.nl. See link: https://webcache.googleusercontent.com/search?q=cache:fm0XIZ8sEe8J:https://www.lightspeedhq.be/+&cd=2&hl=nl&ct=clnk&gl=be We have the same problem with our www.lightspeedhq.co.uk website, which refers to www.lightspeedhq.com: https://webcache.googleusercontent.com/search?q=cache:OXdAIIFa7AYJ:https://www.lightspeedhq.co.uk/+&cd=1&hl=en&ct=clnk&gl=uk Does Google see it as duplicate content? Or do we have to use alternate hreflang annotations? A week ago we changed our canonical links, which were actually randomly pointing from .be to .nl and from .co.uk to .com. What can we do now to make sure everything is properly indexed? Best, Ruud
Local SEO | Ruudst
-
How can you add custom Structured Data to a website hosted on Squarespace?
I have a client with a simple one-page landing website hosted on Squarespace. Is there any method to easily apply structured data with this setup?
Local SEO | RosemaryB
-
How to find best local websites?
For example, I'd like to type in a zip code and get the highest-ranking websites by DA (or whatever metric the software uses) within a 25-mile radius. Does that type of service exist? I'm looking to build up our local links, but most of the websites I find have extremely low authority, and I'm trying to find some good ones without having to check each one manually. Thanks, Ruben
Local SEO | KempRugeLawGroup
-
Benefits of "Buffer Websites" Marketing for Real Estate Firm.
A local SEO firm has approached us and suggested that we incorporate something called buffer websites into our SEO. I work with a Houston-based real estate agency that sells single-family homes. In a nutshell, they suggested that we create 5-10 separate standalone websites that each have 40-50 pages of unique copy for specific targeted keywords of our choosing. The idea is that all of these websites and their copy would point back to our main website and help generate substantial traffic. This concept seems to have peaked a couple of years ago, and we're not sure if we should go this route. Would we be better off just building unique copy on the main site instead of maintaining the 5-10 extra websites we would need to deal with?
Local SEO | RETEX
-
Content Across International Websites
I am wondering if anyone could clear up some questions I have regarding international SEO and how to treat the content placed on international sites. I have recently launched several websites for a product internationally, each with the correct country domain name, and I have followed the internationalisation guidelines provided by Webmaster Tools. All the websites target English-speaking countries, and I have rewritten most of the content to suit the English style of the targeted country. That said, I am finding a mixed bag of information on how to treat large chunks of potentially duplicate content. For example, my main .com website, which has been running for several years (and is targeted at the UK), has a lot of well-written articles that are popular with visitors. I need to find out whether duplicating these articles onto the international versions of the websites, without rewriting them, would have a detrimental effect on SEO across all the sites. I have done a site: search for each domain name to see if they are cropping up in other local Google versions (e.g. the .ca site in Google.com.au, etc.) and they are not. Does this mean Google localises its results regarding duplicate content, or is it treated at the root level? Any information to point me in the right direction would be a big help.
Local SEO | Rj-Media