Post indexing Issue
-
I have been facing a growing issue for the past 9 months: posts on the domain are hardly being indexed. I used to speed up indexing with Google Search Console's URL Inspection tool, before it was announced that Google had temporarily disabled the Request Indexing feature.
Posts like https://xclusivesongs.com/meet-nigerian-most-awarded-music-artists/ take up to 6 days to be indexed.
Another issue: it has been a year now, and my domain still doesn't rank for its own brand name. Something seems wrong, and I don't get it. Each time I search the name "xclusivesongs", Google responds with "Did you mean...?"
-
I have the same question. I am facing a post indexing issue on my website, and I don't know the proper solution to this problem. Please guide me.
Thanks!
Related Questions
-
How does Google treat a backlink on a no-index site?
I have a doubt: when I create a backlink on a website as part of SEO and recheck it a couple of days later, it hasn't been indexed. I checked the site's robots.txt file, which shows:
User-agent: Mediapartners-Google
Disallow:
User-Agent: *
Disallow:
Does this create any backlink value, or is it only there to keep pages out of Google's index? To make it simple: does this kind of backlink, built on a no-index website, support my SEO activity or not?
Local Website Optimization | | LayaPaul0 -
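The robots.txt quoted in the question can be checked mechanically rather than by eye. A minimal sketch using Python's standard `urllib.robotparser`, feeding in the file verbatim; note that an empty `Disallow:` line permits crawling of everything, so this file blocks nothing:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt quoted in the question: both groups have an
# empty Disallow value, which allows crawling of every URL.
ROBOTS_TXT = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Both the AdSense crawler and any generic crawler may fetch any page.
print(parser.can_fetch("Mediapartners-Google", "/some-page"))  # True
print(parser.can_fetch("*", "/some-page"))                     # True
```

Also worth noting: robots.txt governs crawling, not indexing. An actual noindex would appear as a robots meta tag or an X-Robots-Tag header, neither of which this file sets.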
Data Structure, Indexing and Canonicals
I was wondering if anyone could share some best practices on data structure, indexing, and canonicals. We have a site with pages designed to display national/state/city-level data. All pages have slight variations in data and descriptions, but we're seeing Google index some of the city-level pages for national-level keywords. The URL structure is www.mysite.com/Country/State/City/Topic.html. For example, for the query "what is the price of beans?", Google picks up localized versions such as mysite.com/US/CA/San_Francisco/price-of-beans.html when it should be picking up mysite.com/US/price-of-beans.html. I've toyed with the idea of using the national-level page as the canonical for the state/city pages, but I don't want to hurt state/city-level keywords. Because some of the pages have only slight variances, we are also seeing a lot of soft 404 errors; we assume Google is treating the pages as duplicates even though the content differs. Any insight/suggestions are appreciated.
Local Website Optimization | | Nobody16081562591621 -
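The canonical idea floated above can be made concrete by writing the mapping down. A minimal sketch (hypothetical helper name, assuming exactly the /Country/State/City/Topic.html structure described in the question) that derives the national-level canonical for any localized URL:

```python
from urllib.parse import urlparse

def national_canonical(url: str) -> str:
    """Map a state- or city-level URL to its national-level equivalent.

    Assumes paths of the form /Country[/State[/City]]/Topic.html,
    matching the question's example structure.
    """
    parts = urlparse(url)
    segments = [s for s in parts.path.split("/") if s]
    country, topic = segments[0], segments[-1]
    return f"{parts.scheme}://{parts.netloc}/{country}/{topic}"

print(national_canonical(
    "https://www.mysite.com/US/CA/San_Francisco/price-of-beans.html"
))
# https://www.mysite.com/US/price-of-beans.html
```

Whether to emit the result as a rel=canonical is exactly the trade-off the poster worries about: it consolidates signals on the national page but can suppress state/city rankings, so it fits near-duplicate localized pages better than genuinely differentiated ones.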
Site Audit: Indexed Pages Issue
Over the last couple of months I've been working through some issues with a client. One of my starting points was a site audit, following a post written by Geoff Kenyon: https://moz.com/blog/technical-site-audit-for-2015. One of the main findings is that when I run a "site:domain.com" query in Google, my homepage isn't listed first; in fact, it isn't listed at all when I go through every result. I understand that the homepage isn't required to appear first in this type of query, but I would prefer it. Here are some things I've done:
- I ran another query, "info:homepage.com", and the homepage is indexed by Google.
- When I run a branded search for the company name, the homepage does come up first.
- The page currently showing first in the "site:domain.com" listing is my blog index page.
- Several months back I redirected the index.php page to the root of the domain; I'm not sure if this is helping or hurting.
- In the sitemap I removed index.php and left only the root domain as the page to index.
- All internal links point to the root; index.php has been eliminated from every internal link.
- The main site navigation does not include a "Home" link; instead, my logo links to the homepage.
Should I noindex my blog/index.php page? It is only a compilation of posts, has no original content, and actually triggers duplicate content warnings. Any help would be much appreciated. I apologize if this is a silly question, but I'm getting frustrated and annoyed at the whole situation.
Local Website Optimization | | SEO_Matt0 -
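The index.php consolidation described above is mechanical enough to script when sweeping a site's internal links. A minimal sketch (hypothetical function name, assuming the poster's goal of collapsing .../index.php URLs to their directory root, matching the 301 and sitemap cleanup they describe):

```python
def canonicalize_internal_link(url: str) -> str:
    """Collapse .../index.php (with or without a query string) to the
    directory root, mirroring the redirect and sitemap cleanup above."""
    base, _, query = url.partition("?")
    if base.endswith("/index.php"):
        base = base[: -len("index.php")]
    return base + (("?" + query) if query else "")

print(canonicalize_internal_link("https://domain.com/index.php"))
# https://domain.com/
print(canonicalize_internal_link("https://domain.com/blog/index.php"))
# https://domain.com/blog/
```

Running every internal href through a normalizer like this, before publishing, keeps stray index.php references from reappearing after the one-time cleanup.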
Is there an SEO benefit to using tags in WordPress for my blog posts?
We have locations across the US and are trying to develop content so that we rank well for specific keywords on a local level, for instance "long tail keyword search in state" or "long tail keyword search near 76244". The goal is to develop content pages via blog posts that rank for those keywords. We are using Yoast and will optimize each post with that tool. My questions are:
1. Are there any benefits to adding a long list of tags to each post?
2. If yes, do I need to limit the number of tags?
3. Do we need to block indexing of the Yoast-generated tag and category archives to avoid duplicate content issues?
Any insight on the best way to optimize these blog posts with tags or other avenues would be greatly appreciated.
Local Website Optimization | | Smart_Start0 -
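For question 3, the usual mechanism is a robots meta tag on tag and category archive pages (in Yoast this is a search-appearance toggle rather than hand-written code). A minimal illustrative sketch, with a hypothetical helper that is not Yoast's actual implementation, showing the common "noindex, follow" pattern for archives:

```python
def robots_meta(page_type: str) -> str:
    """Return a robots meta tag for a page type. Archives that exist only
    to group posts (tag, category) are noindexed but left followable, so
    crawlers still pass through their links; ordinary posts stay indexable."""
    noindex_types = {"tag", "category"}
    directive = "noindex, follow" if page_type in noindex_types else "index, follow"
    return f'<meta name="robots" content="{directive}">'

print(robots_meta("tag"))   # <meta name="robots" content="noindex, follow">
print(robots_meta("post"))  # <meta name="robots" content="index, follow">
```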
Repairing SEO issues on Different Platforms
I work for a car dealership in Southern California and have been tasked with a seemingly impossible job. They would like me to fix duplicate title tags, descriptions, and content, and get all other SEO issues in order. My concerns, in order: 1. Remove duplicate metadata: when the platform spits out new pages, it uses template title/description/keywords, and we are not always informed of their addition. There are also somewhere near 1K vehicles in the inventory being flagged for duplicate content/metadata. The fix I have been spitballing is adding a canonical/nofollow to these pages; I am not sure this is the best way forward, but would appreciate feedback. 2. Duplicate content: most of the information is supplied by the manufacturer, so we have been sourcing it back to the manufacturer's site. These pages show up on various "SEO tool" reports as harmful to the site. Although we use the dealer's name and local area, the only fix I can think of to take the heat off, and possibly undo any negative ramifications, is once again a canonical/nofollow on these pages. 3. Cleanup: most of the other issues appear when the website platform dumps new pages onto the site without notice, creating more than 1K pages with duplicate everything. Please provide any assistance you can.
Local Website Optimization | | BBsmyth0 -
Is Having Broken Outbound Links on Old Blog Posts an Issue?
Please note that these old posts hardly get any traffic. I've heard both sides on this. Thanks, Chris
Local Website Optimization | | Sundance_Kidd0 -
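Whatever the ranking impact, broken outbound links are cheap to inventory before deciding whether to fix them. A minimal stdlib-only sketch (hypothetical helper names; a real crawl should throttle requests and respect robots.txt), treating 4xx/5xx responses and connection failures as broken:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def is_broken_status(status: int) -> bool:
    """Treat 4xx and 5xx responses as broken links."""
    return status >= 400

def check_link(url: str, timeout: float = 10.0) -> bool:
    """Return True if the outbound link looks broken."""
    req = Request(url, method="HEAD", headers={"User-Agent": "link-audit/0.1"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return is_broken_status(resp.status)
    except HTTPError as err:
        return is_broken_status(err.code)
    except URLError:
        return True  # DNS failure, refused connection, timeout, etc.
```

Feeding this the outbound URLs extracted from old posts yields a simple broken/ok report to work from.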
Ecommerce Site with Unique Location Pages - Issue with unique content and thin content?
Hello all, I have an ecommerce site specializing in hire, and we have individual location pages in each of our categories for each of our depots. All these pages show the NAP of the specific branch. Given the size of our website (roughly 10K pages), it's physically impossible to write unique content for each location in each category. So, for example, we write unique content for the top 10 locations in a category, while the remaining 20-odd locations in that category share the same content with only the location name and the branch's NAP swapped in; in effect, I think this is thin content. My question: I am fairly sure we are getting some form of algorithmic penalty from this thin/duplicate content. Using the example above, should we 301 redirect all 20-odd thin-content locations, or only 10 of them, so we end up with a more 50/50 split between unique and thin content in a category? Alternatively, we could 301 all the thin-content pages so only 10 locations remain in the category, leaving 100% unique content. I am trying to work out which would help most with local rankings for my location pages. Also, does anyone know whether a thin/duplicate content penalty is site-wide, or whether it can affect just specific parts of a website? Any advice greatly appreciated. Thanks, Pete
Local Website Optimization | | PeteC120 -
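The "same content with the location name swapped in" pattern described above can be measured before deciding which pages to 301. A minimal sketch using difflib with invented example copy; the idea is that near-identical boilerplate scores close to 1.0, flagging thin-content candidates (any cutoff you pick is an arbitrary illustration, not a known Google threshold):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1]: 1.0 means identical text."""
    return SequenceMatcher(None, a, b).ratio()

# Invented example copy for two location pages that differ only in
# location name and NAP details.
london = "Tool hire in London. Call our London depot on 020 7946 0000."
leeds = "Tool hire in Leeds. Call our Leeds depot on 0113 496 0000."

score = similarity(london, leeds)
print(f"{score:.2f}")
```

Ranking each location page's body text against the category's template copy this way gives an ordered list of the thinnest pages, which maps directly onto the "redirect all vs. redirect half" decision in the question.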
Site Verification Issues
We have a group of automotive dealerships with a website provider that causes issues when we try to verify our sites. Because they use Analytics for their data program, they install their own code into our websites, stopping us from doing so properly in our back end. We also cannot verify ourselves in Webmaster Tools or AdWords. We can't actually "own" any of our sites, since the provider runs a JavaScript snippet from within the website. They also do not allow the use of iframes or scripts, so we can't even use the container method to verify these sites. Any help or insight would be greatly appreciated, as I am sure there is some way to break through this, get our data, and be verified.
Local Website Optimization | | spentland0