Need advice for new site's structure
-
Hi everyone,
I need to update the structure of my site www.chedonna.it
Basically I have two main problems:
1. I have 61,000 indexed tag pages (many with no posts).
2. The categories of my site are noindexed.
I thought of fixing the problem by making the categories index and the tags noindex, but I'm not sure this is the best solution, because I have a great number of tags that have been indexed by Google for a long time.
Maybe it would be correct just to make the categories index, link to them from the posts, and leave the tags index.
Could you please let me know what's your opinion?
Regards.
-
Thank you so much Tommy!
-
Hi,
In an SEO sense, I don't think it matters whether it's tags or categories, as long as Google is able to crawl your content and you have removed or noindexed all your duplicate content/pages. With Yoast SEO, categories and tags are no different: you can noindex/nofollow both, and you can create custom SEO titles and descriptions for both.
However, for the purpose of restructuring your site and adding customized headers, categories may give you more to work with than tags.
Hope this helps!
-
Hi Tommy, sorry for my English; I'll do my best to explain why I would like to change the structure of my site.
I think that for SEO it is better to have the categories indexed and to push those rather than the tags. The main problem is that at the moment my site has 61,000 tags, many of them with zero posts: don't you think that looks like thin content to Google?
Also, don't you think that it leads to a high bounce rate?
Thanks for your advice and have a nice day!
-
Hi,
What is the reason for fixing the structure? What is the problem you mentioned in the question? Is it because the tags are causing duplication, or some other reason? If you are ranking well with the tags, you should just leave the structure as is.
However, if you really want to update the structure of your site, I would index the categories and noindex the tags to avoid duplicate content issues.
Hope this helps!
-
Hi Oleg, thanks for your answer.
So what's your final suggestion?
Have a nice day.
-
If it's ranking well, don't mess with it. If it's not, I would flip the two (index categories, noindex tags). The main problem with indexing so many tag pages is the duplicate content issues that arise: the same post blurbs are repeated on 5+ tag pages, and the tag pages don't have any unique content.
If you index just the categories, you can write a unique, keyword-targeted description for each category. This would consolidate your pages and give more authority to each, as well as reduce the instances of duplicate content.
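Once the flip is in place, it's worth spot-checking that tag archives actually carry the noindex directive and category archives don't. Here's a minimal sketch of such an audit check (the sample HTML and URLs are hypothetical, not taken from the site above): it only parses a robots meta tag out of page HTML, so a real audit would also fetch each URL live and check the X-Robots-Tag HTTP header.

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the page's robots meta tag includes 'noindex'."""
    # Scan each <meta ...> tag; attribute order may vary between themes.
    for tag in re.findall(r"<meta[^>]+>", html, re.IGNORECASE):
        if re.search(r'name=["\']robots["\']', tag, re.IGNORECASE) and \
           re.search(r'content=["\'][^"\']*noindex', tag, re.IGNORECASE):
            return True
    return False

# Hypothetical page snippets standing in for a tag archive and a category archive.
tag_page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
category_page = '<html><head><meta name="robots" content="index,follow"></head></html>'

print(has_noindex(tag_page))       # True  -> tag archive is correctly noindexed
print(has_noindex(category_page))  # False -> category archive stays indexable
```

With 61,000 tag pages, running a check like this over a sample of tag URLs (rather than trusting the plugin settings screen) catches themes or caching layers that drop the meta tag.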
Related Questions
-
Should I create a new site or keep company on parent company's subdomain?
I am working with a realty company that is hosted on a subdomain of the larger, parent realty company: [local realty company].[parent realty company].com How important is it to ride on the DA of the larger company (only about a 40)? I'm trying to weigh the value of creating an entirely separate domain for simplicity of the end user and Google bots: [local company].realtor They don't have any substantial links to their subdomain, so it wouldn't be a huge loss. I have a couple of options:
1. Create an entirely new site on their current subdomain, leveraging the DA of the larger parent company.
2. Create an entirely new site on a new URL, starting from scratch (which doesn't hurt you as much as it seems it once did).
3. Create two sites: a micro site that targets a sector of their audience that they really want to reach, plus option (1) or (2).
Love this community!
Technical SEO | Gabe_BlueGuru
-
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using Google Maps creator and I embedded it on our site. However, when I ran the fetch and render through Search Console, it said it was blocked by our robots.txt file. I read in the Search Console Help section that: 'For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot.' I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it), does Google automatically block their maps from their own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:
User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
Allow: /maps/api/js/ElevationService.GetElevationForLine
Allow: /maps/api/js/GeocodeService.Search
Allow: /maps/api/js/KmlOverlayService.GetFeature
Allow: /maps/api/js/KmlOverlayService.GetOverlays
Allow: /maps/api/js/LayersService.GetFeature
Disallow: /
Any assistance would be greatly appreciated. Thanks, Ruben
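The behaviour of a robots.txt file like that one can be reasoned about with Google's rule-precedence logic: the most specific (longest) matching prefix wins, and an Allow wins over a Disallow of equal length. Here's a minimal sketch of that rule, under simplifying assumptions (it ignores the * and $ wildcards that real robots.txt matching also supports, and the rules list is a reduced stand-in for the file above):

```python
def is_allowed(path: str, rules: list[tuple[str, str]]) -> bool:
    """Decide whether a path is crawlable under longest-match precedence.

    rules: (directive, prefix) pairs where directive is 'allow' or 'disallow'.
    The longest matching prefix wins; on a length tie, 'allow' wins.
    """
    best_len, allowed = -1, True  # no matching rule means the path is allowed
    for directive, prefix in rules:
        if path.startswith(prefix):
            if len(prefix) > best_len or (len(prefix) == best_len and directive == "allow"):
                best_len, allowed = len(prefix), directive == "allow"
    return allowed

# Simplified from the robots.txt quoted above: specific Allows plus "Disallow: /".
rules = [
    ("allow", "/maps/api/js"),
    ("allow", "/maps/api/js/GeocodeService.Search"),
    ("disallow", "/"),
]

print(is_allowed("/maps/api/js", rules))   # True: the longer Allow beats "Disallow: /"
print(is_allowed("/maps/embed", rules))    # False: only "Disallow: /" matches
```

This illustrates why fetch-and-render can still report a block: an embedded map may request resources outside the Allow-listed paths, and anything not covered by a longer Allow falls to the blanket Disallow.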
Technical SEO | KempRugeLawGroup
-
What to do with 404 errors when you don't have a similar new page to 301 to?
Hi, if you have 404 errors for pages that you don't have similar content pages to 301 them to, should you just leave them (the 404s are optimised/good quality with related links & branding etc.) so they will eventually be de-indexed since they no longer exist, or should you 'remove URL' in GWT? Cheers, Dan
Technical SEO | Dan-Lawrence
-
Migrating to a new site: keywords question
We are converting an old static HTML ecommerce site to a new platform. The old site has excellent rankings for some of the products. In order to maintain our ranking we will implement 301 redirects from old to new pages (as the URLs will change to SEF). I am using Google's Keyword Tool (in AdWords), entering each page URL of the old site (there are hundreds; I'm doing the top 50 in traffic), generating a set of keywords, then sorting each list by global searches. For each page, Google's Keyword Tool is giving me hundreds of keywords, but in meta tags there should be no more than 15, so I need a method to choose the keywords for the new page. Question: in the new meta tags, should we emphasize the most common keywords (as defined by most global searches) or the least common keywords? I would hate to lose the good ranking for the least common (long-tail) keywords.
Technical SEO | ssaltman
-
New project, old domain: should I 301 redirect while the new site is built?
I just took on a larger-scale e-commerce project and came across a tricky roadblock that I need some advice on. I'm building the site from scratch, and due to its complexity it may take 3-4 months before I have it designed and coded. The client has a domain name that has some decent page/domain authority, and I would hate to lose that while the site is being built. Currently I have nothing to display, as his previous site got hacked and was deleted by the previous web admin. Since a blog has already been approved as part of the project, I already installed WordPress to keep the domain fresh. However, here's the issue: I installed WordPress in a folder called blog, and I'm debating whether I should 301 redirect or 302 redirect his index there. The blog will always reside in the blog folder, even after launch. Will performing a 301 redirect pull all the juice away from my index page? I'm assuming yes. If so, what would occur once the project is complete and I make the ecommerce site live on the index page? Thanks in advance! Mike
Technical SEO | MikeDelaCruz77
-
Duplicate Content Issues - Should I build a new site?
I'm currently working on a site which is built using Zen Cart. The client also has another version which has the same products on it. The product descriptions and the vast majority of the text have been re-written. I've used the duplicate content tool and these are the results:
HTML fingerprint: 0000a7ee1f07a131 / 0000a7ec1f07a931 (92.31%)
Total HTML similarity: 76.33%
Standard text similarity: 66.72%
Smart text similarity: 45.81%
Total text similarity: 56.27%
I considered using a different eCommerce system like Magento or Volusion. So I had a look at a few templates, chose one, and then used the tool again and got the following:
HTML fingerprint: 0000a7e41b012111 / 0000a7ec1f07a931 (72.00%)
Total HTML similarity: 64.65%
Standard text similarity: 11.69%
Smart text similarity: 17.90%
Total text similarity: 14.80%
Do you think it's worth doing this? Thanks, Dan
Technical SEO | TheYeti
-
Is it impossible to keep track of rankings?
Hello, here's something interesting. I'm using Rank Tracker from SEOmoz and link-assistant's Rank Tracker as well... I need to track Google.com and Google.co.ve (Venezuela), so I did. I got my keyword, and here are my results:
1. Keyword A at google.com (United States): Rank Tracker SEOmoz = pos 6; Rank Tracker OTHER = pos 6; manual query on google.com = pos 9 (I used the exact URL SEOmoz tells me it's using)
2. Keyword A at google.co.ve: Rank Tracker SEOmoz = pos 8; Rank Tracker OTHER = pos 7; manual query on google.co.ve = pos 8
So... why is that? So far I think that google.com for me down here (it actually says "Español") is a different index, maybe for Latin America? Only Spanish pages? Maybe it's because there are a couple of minutes between looking with one tool and the other... any help would be great. Dan
Technical SEO | daniel.alvarez