How many tiers are best?
-
I am working on the SEO for my website and I don't know how many tiers I need. I have read a lot, but people always say different things.
What do you think?
Thanks for your help, friends!
Regards,
Carlos Zambrana
-
It is a simple question. I have some tiers, but I need to know how many more tiers I need. In addition, I have two levels of link building, but I don't know if that is enough.
-
Tiers as in ... ?
If you're talking about site depth, I prefer to keep things as close to the top level as possible.
If you're talking about link building, you're asking the wrong questions.
We can help more if you define the question better.
Thanks!
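If "tiers" does mean site depth, the advice above (keep pages close to the top level) can be checked mechanically. Below is a minimal sketch, assuming a hypothetical internal-link graph; the URLs and the `click_depth` helper are illustrative, not from any real crawl.

```python
from collections import deque

def click_depth(links, home="/"):
    """Compute each page's click depth (shortest number of clicks
    from the homepage) with a breadth-first search over internal links."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:  # first visit = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical internal-link graph; ideally every important page
# sits within two or three clicks of the homepage.
site = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": ["/products/widget"],
}
print(click_depth(site))
```

Pages that come back with a large depth are candidates for better internal linking.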
Related Questions
-
Two websites, one company, one physical address - how to make the best of it in terms of local visibility?
Hello! I have one company which will be operating in two markets: printing, and website design / development. I'm planning on building two websites, one for each market, but I'm a bit confused about how to optimize these websites locally. My thought is to use my physical address for one website (build citations, get listed in directories, etc.) and a PO box for the other. Do you think there is a better approach?
Technical SEO | VELV1 -
Best Practices For Angular Single Page Applications & Progressive Web Apps
Hi Moz Community, Is there a proper way to do a SPA (client-side rendered) and a PWA without a negative impact on SEO? Our dev team is currently trying to convert most of our pages to an Angular client-side-rendered single page application. I told them we should use a prerendering service (for clients that don't execute JS) or server-side rendering instead, since this would ensure that most web crawlers can render and index all the content on our pages even with all the heavy JS use. Is there an even better way to do this, or some best practices? As for the PWA they want to add along with the SPA change, I told them it is pretty much separate from the SPA work because the two are not dependent: adding a manifest and a service worker to our site would just be an enhancement. Also, if we do a complete PWA with JS populating the content/data within the shell, meaning not just the header and footer but making the body a dynamic JS template as well, would that affect our SEO in any way? Any best practices here as well? Thanks!
Technical SEO | znotes0 -
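The prerendering idea in this question is often called "dynamic rendering": serve a prerendered snapshot to known crawlers and the client-side bundle to everyone else. A minimal sketch of the routing decision follows; the bot list and the two return strings are stand-ins, not a real implementation or an exhaustive crawler list.

```python
# Illustrative crawler signatures; a production list would be longer
# and kept up to date.
BOT_SIGNATURES = ("googlebot", "bingbot", "duckduckbot", "baiduspider")

def wants_prerender(user_agent: str) -> bool:
    """True when the request looks like a known crawler."""
    ua = user_agent.lower()
    return any(bot in ua for bot in BOT_SIGNATURES)

def choose_response(user_agent: str) -> str:
    # A real app would fetch the snapshot from a prerender service or
    # a server-side rendering layer; these strings are placeholders.
    if wants_prerender(user_agent):
        return "prerendered-html"
    return "spa-shell"
```

Server-side rendering avoids this branching entirely by giving every client the same rendered HTML, which is why it is often preferred when feasible.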
YouTube SEO Best Practices
Does anyone know where to find a list of SEO best practices for YouTube? Specifically, does anyone have thoughts on the SEO benefits of an @domain.com login vs. an @gmail.com login? Or is adding my URL to the "Associated website" channel setting sufficient for SEO purposes?
Technical SEO | brianvest0 -
Best Practices for Image Optimisation
Hi Guys, I would love some recommendations from you all. A potential client of mine currently hosts all their website image galleries (of which there are many) on a Flickr account, and they realise they could gain more leverage in Google Images; at the moment none of their images cover the basics of optimisation (e.g. filename, alt text, etc.). I said these basics would at least need to be covered, and that where images are hosted is supposedly an important factor when it comes to driving traffic from Google Image Search (images hosted on the same domain as the text are potentially given more value than images hosted on another domain, such as Flickr). The client has now come back saying they have done some reading which suggests a subdomain could be the way to go, e.g. images.mydomain.com. I would love feedback on this before I go back to them, as it would be a huge undertaking for them. Cheers
Technical SEO | musthavemarketing0 -
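One of the "basics" mentioned in the question, descriptive filenames, can be automated when migrating many images off Flickr. A small sketch follows; the `image_filename` helper and its inputs are hypothetical, for illustration only.

```python
import re

def image_filename(description: str, ext: str = "jpg") -> str:
    """Turn a descriptive caption into a crawlable, hyphenated
    filename: lowercase, alphanumerics only, words joined by hyphens."""
    slug = re.sub(r"[^a-z0-9]+", "-", description.lower()).strip("-")
    return f"{slug}.{ext}"

print(image_filename("Red Widget - Front View"))  # red-widget-front-view.jpg
```

The same caption text can usually double as the starting point for the image's alt attribute.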
Is 307 the best way to handle temporarily disabled items?
I was wondering what would be the best way to handle temporarily disabled items. There are 302 and 307, and from what I understand the 307 redirect is the HTTP 1.1 successor of the 302 redirect. Does anyone have any experience with how Google handles 307? I am thinking of 307-redirecting the temporarily disabled pages to a generic page like site.com/we-are-adding-some-final-touches-to-this.html, where we will explain to users why an item is disabled and give them an option to be notified when it goes back up. Finally, when it goes back up, I remove the 307 redirect and make the page live again.
Technical SEO | Saijo.George0 -
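The flow described in the question can be sketched as a simple status decision, independent of any web framework. The `respond` helper and the item dict are illustrative; the redirect target is the asker's own placeholder URL.

```python
def respond(item: dict):
    """Return (status, location) for a catalogue item: a temporary
    307 to the generic notice page while the item is disabled, and
    the normal page with a 200 once it is live again."""
    if item.get("disabled"):
        return 307, "/we-are-adding-some-final-touches-to-this.html"
    return 200, item["url"]
```

A 307 (unlike a 301) signals that the move is temporary, so the original URL should stay in the index while the item is offline.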
Too many links on my site
Hi there everybody, I am a total SEO newbie and I am burning with questions. I had my site crawled and found out that it contains too many links. The reason is that it is a site where I constantly write news and articles, and each one of them is a new Joomla item, thus a new link. I actually thought lots of content was good for SEO. How am I supposed to reduce the number of links?
Technical SEO | polyniki0 -
How best to redirect URLs from expired classified ads?
We have a problem because our content is classified ads. Every ad expires after one or two months and then becomes inactive; we keep its page for one more month as the same page, but we add a notice that the ad is inactive. After that we delete the ad and its page, but we need to redirect that URL to a search results page containing similar ads, because we don't want to lose the traffic from those pages. What is the best way to redirect the ad URL? Our thinking was to redirect internally without a 301 redirect, because the .htaccess file would get very big after a while, and we are also considering canonicalization because we don't want the engine to think we have too much duplicate content.
Technical SEO | Donaab0 -
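The .htaccess-growth worry in this question can be sidestepped with one pattern-based rule in application code instead of one rewrite line per deleted ad. A minimal sketch follows, assuming a hypothetical URL scheme (`/ads/<category>/<id>`) and an in-memory set of expired URLs; the 301 reflects the common recommendation for permanently removed pages, not a verdict on the asker's canonicalization idea.

```python
import re

# One regex covers every ad URL; no per-URL rewrite rules needed.
EXPIRED_AD = re.compile(r"^/ads/(?P<category>[\w-]+)/\d+$")

def redirect_target(path: str, expired_ids):
    """Return (status, location) mapping an expired ad onto the
    search results for its category, or None for live/unknown paths."""
    m = EXPIRED_AD.match(path)
    if m and path in expired_ids:
        return 301, f"/search?category={m.group('category')}"
    return None
```

In production the `expired_ids` lookup would be a database query rather than a set, but the routing logic stays a single rule.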
What is considered best practice today for blocking admin pages from getting indexed
What is considered best practice today for blocking pages, for instance xyz.com/admin pages, from getting indexed by the search engines or easily found? Do you recommend still disallowing it in the robots.txt file, or is robots.txt not the best place to note your /admin location, because of hackers and such? Is it better to hide /admin behind an obscure name, use the noindex tag on the page, and leave it out of the robots.txt file?
Technical SEO | david-2179970
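The noindex-without-robots.txt approach from the question can also be done with a response header, so the directive applies without advertising the path in a public robots.txt file. A minimal sketch, assuming a hypothetical `/admin` prefix and an `extra_headers` helper of my own naming:

```python
# Attach an X-Robots-Tag header to admin responses instead of listing
# the path in robots.txt, which anyone (including attackers) can read.
ADMIN_PREFIX = "/admin"

def extra_headers(path: str) -> dict:
    """Headers to merge into the response for a given request path."""
    if path.startswith(ADMIN_PREFIX):
        return {"X-Robots-Tag": "noindex, nofollow"}
    return {}
```

Note that a robots.txt Disallow would actually prevent crawlers from ever fetching the page and seeing the noindex directive, which is one reason the header/meta-tag route is often preferred for pages that must stay out of the index.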