No structured sitemap
-
Hello
We keep running into the problem that a lot of sitemaps are not well structured. In this case we used a WP sitemap plugin to generate the website sitemap and Google XML Sitemaps to generate the sitemap for Google.
We also bought the Yoast Premium plugin, but the backend warns that the XML Sitemaps plugin may cause problems in combination with Yoast. Normally the Google XML Sitemaps generator improves SEO by providing sitemaps for the best indexation by search engines, but the structure is not what we want. Would Yoast be a better solution for generating structured sitemaps?
This is a section from the current sitemap of www.rovana.be.
Products
- Reepgordijn
- Plissé - Dupli gordijn
- Duo rolgordijn
- Paneelgordijn
- Jaloezie - Vlinderjaloezie
- Poorten
- Muggenramen
- Velux accessoires
- Rolgordijn
- Vouwgordijn
- Buitenjaloezie
- Voorzetrolluik
- Glasdak
- Glaswand
- Vouwdak
- Pergola
- Verlichting - Verwarming
- Automatisering
- Lamellendak
- Verandazonwering
- Screens
- Koepel zonwering
This is how we think the sitemap should look. We would like more structure in the different product categories.
Producten
- Zonwering
  - Zonnescherm
  - Screens
  - Verandazonwering
  - Koepel zonwering
  - Automatisering
  - Verwarming - verlichting
- Terrasoverkapping
  - Lamellendak
  - Pergola
  - Vouwdak
  - Glasdak
  - Glaswand
- Raamdecoratie
  - Rolgordijn
  - Paneelgordijn
  - Duo rolgordijn
  - Vouwgordijn
  - Plissé - dupli gordijn
  - Jaloezie - vlinderjaloezie
  - Reepgordijn
  - Velux accessoires
- Rolluiken
  - Voorzetrolluiken
  - Buitenjaloezie
  - Velux accessoires
- Muggenramen
  - Muggenraam
  - Velux accessoires
- Poorten
  - Sectionaal poort
Is it technically possible to create a sitemap like this in WordPress, and if so, how exactly do we proceed? What is the impact of these changes on SEO? How can we make this work?
Thanks!
-
As far as I know, the sitemaps Yoast creates are just a dump of all the articles you have in WordPress, so your use case wouldn't be solved by simply installing the plugin. You would probably have to either work with what you have or put some effort into rewriting how the sitemaps are generated.
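One thing worth knowing: the XML sitemap protocol has no nesting inside a single file, so the only grouping it supports is a sitemap index that points to separate child sitemaps. If you want your category structure reflected on the XML side, it would look something like this sketch (the file names below are hypothetical, not something Yoast or Google XML Sitemaps produces out of the box):

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index for www.rovana.be: one child sitemap per product category -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.rovana.be/sitemap-zonwering.xml</loc></sitemap>
  <sitemap><loc>https://www.rovana.be/sitemap-terrasoverkapping.xml</loc></sitemap>
  <sitemap><loc>https://www.rovana.be/sitemap-raamdecoratie.xml</loc></sitemap>
  <sitemap><loc>https://www.rovana.be/sitemap-rolluiken.xml</loc></sitemap>
</sitemapindex>

Each child sitemap would then list only the product URLs of that category. The indented, human-readable structure you sketched above is really an HTML sitemap or navigation concern; search engines mainly use the XML files for URL discovery, not for understanding your category hierarchy.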
-
Is your goal to create a standard HTML sitemap, an XML sitemap, or both?
Related Questions
-
How to implement multilingual sitemaps when not all pages have translations
We are trying to implement sitemaps for a site that has localized content for a few countries. We've concluded that we should use a sitemap index and then create one sitemap per country. Now to the problems we're facing. Not all URLs on the site have translations; how should these URLs be presented in the sitemap? Should they be stated simply, like so?

<url><loc>https://example.com/sdfsdf</loc></url>

So URLs with the hreflang attribute and without are mixed in the same sitemap, or is that a problem? (I have added empty rows to make it easier to read.)

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">

  <url>
    <loc>http://www.example.com/english/page.html</loc>
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/page.html" />
    <xhtml:link rel="alternate" hreflang="de-ch" href="http://www.example.com/schweiz-deutsch/page.html" />
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/page.html" />
  </url>

  <url><loc>http://www.example.com/page-with-no-translations</loc></url>
  <url><loc>http://www.example.com/page-with-no-translations2</loc></url>
  <url><loc>http://www.example.com/page-with-no-translations3</loc></url>

  <url>
    <loc>http://www.example.com/deutsch/page.html</loc>
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/page.html" />
    <xhtml:link rel="alternate" hreflang="de-ch" href="http://www.example.com/schweiz-deutsch/page.html" />
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/page.html" />
  </url>

</urlset>

Technical SEO | Telsenome
-
Indexed, not submitted in sitemap
I have this problem on the site's blog. There is no problem when I check the Yoast plugin settings, but some of my blog content is indexed even though it is not in the sitemap. Have you had this problem? What is the cause? My website name is missomister.
Technical SEO | seomozplan196
-
SEO + Structured Data for Metered Paywall
I have a site that will have 90% of the content behind a metered paywall, so all content is accessible in a metered way. All users who aren't logged in will have access to 3 articles (of any kind) in a 30-day period. If they try to access more in a 30-day period, they will hit a paywall. I was reading this article on how to handle structured data with Google for content behind a paywall: https://www.searchenginejournal.com/paywalls-seo-strategy/311359/ However, the content is not ALWAYS behind a paywall, since it is metered. So if a new user comes to the site, they can see the article (regardless of what it is). Is there a different way to handle content that will SOMETIMES be behind a paywall because of a metered strategy? Theoretically I want 100% of the content indexed and accessible in SERPs; it will just be accessible depending on the user's history (cookies) with the site. I hope that makes sense.
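For reference, the markup Google documents for paywalled content looks roughly like the sketch below; the Article type, headline, and the .paywall CSS selector are placeholders to adapt to the actual template, and this is not a metered-specific recipe:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Placeholder headline",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywall"
  }
}
</script>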
Technical SEO | triveraseo
-
Sitemap error
Hi, when I search for my blog post in Google I get sitemap results, and when I click on one I get an error. Here are the screenshots: http://screencast.com/t/lXOIiTnVZR1 and http://screencast.com/t/MPWkuc4Ocixy How can I fix that? It looks like if I just add www. it works just fine. Thanks
Technical SEO | tonyklu
-
Sitemap for dynamic website with over 10,000 pages
If I have a website with thousands of products, is it a good idea to create a sitemap for the search engines where you show maybe 250 products per page, so it is easy for the search engine to find each part and each part also sits closer to the home page? It seems like Google likes pages that are closest to the home page (the fewer clicks, the better).
Technical SEO | roundbrix
-
Getting a Video Sitemap Indexed
Hi, A client of mine submitted a video sitemap to Google Webmaster Tools a couple of months ago. As of yet the videos are still not indexing in Google. All of the videos sit on one page but have unique URLs in the sitemap. Does anybody know a reason why they are not being indexed? Thanks, David
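For reference, a minimal video sitemap entry looks roughly like the sketch below (the URLs and text are placeholders); each video needs at least a thumbnail, title, description, and a content or player location:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/video1.jpg</video:thumbnail_loc>
      <video:title>Placeholder title</video:title>
      <video:description>Placeholder description.</video:description>
      <video:content_loc>http://www.example.com/videos/video1.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>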
Technical SEO | RadicalMedia
-
Domain restructure, sitemaps and indexing
I've got a handcoded site with around 1,500 unique articles and a handcoded sitemap. Very old school. The URL structure is a bit of a mess, so to make things easier for a developer who'll be making the site database-driven, I thought I'd recategorise the content. Same content, but with a new URL structure (I thought I'd juice up the URLs for SEO purposes while I was at it). To this end, I took categories like:

/body/amazing-big-shoes/
/style/red-boots/
/technology/cyber-boots/

and rehoused all the content like so, doing it all manually with FTP:

/boots/amazing-boots/
/boots/red-boots/
/boots/cyber-boots/

I placed 301 redirects in the .htaccess file like so:

redirect 301 /body/amazing-boots/ http://www.site.co.uk/boots/amazing-boots/

(not doing redirects for each article, just for categories, which seemed to make the articles redirect nicely.) Then I went into sitemap.xml and manually overwrote all the entries to reflect the new URL structure, but keeping the old dates of the original entries, like so:

<url>
  <loc>http://www.site.co.uk/boots/amazing-boots/index.php</loc>
  <lastmod>2008-07-08</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.5</priority>
</url>

I then resubmitted the sitemap to Google Webmasters. This was done 4 days ago. Webmaster Tools said that the 1,400 of 1,500 articles indexed had dropped to 860, and today it's climbed to 939. Did I adopt the correct procedure? Am I going about things the right way? Given a little time, can I expect Google to re-index the new pages nicely? I appreciate I've made a lot of changes in one fell swoop, which could be a bit of a no-no...? PS: Apologies if this question appears twice on Q&A - hopefully I haven't double-posted.
Technical SEO | magdaknight
-
Is robots.txt a must-have for 150 page well-structured site?
By looking in my logs I see dozens of 404 errors each day from different bots trying to load robots.txt. I have a small site (150 pages) with clean navigation that allows the bots to index the whole site (which they are doing). There are no secret areas I don't want the bots to find (the secret areas are behind a Login so the bots won't see them). I have used rel=nofollow for internal links that point to my Login page. Is there any reason to include a generic robots.txt file that contains "user-agent: *"? I have a minor reason: to stop getting 404 errors and clean up my error logs so I can find other issues that may exist. But I'm wondering if not having a robots.txt file is the same as some default blank file (or 1-line file giving all bots all access)?
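For reference, an "allow everything" robots.txt is just two lines; the Sitemap line below is optional and uses a placeholder URL:

User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml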
Technical SEO | scanlin