After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Hi Patrick, thanks for your prompt reply. I think that's the best option.
Thanks for your advice!
Hi Moz users and friends.
I have a website with a PHP template we developed ourselves, and a WordPress blog in the /blog/ subdirectory. Currently we have a sitemap.xml file in the root of the domain containing all the subsections and blog posts. We update the sitemap manually, once a month, adding the new posts created in the blog.
I want to automate this process, so I created a sitemap index with two sitemaps inside it. One is the old sitemap without the blog posts, and the other is a new one created with the "Google XML Sitemaps" WordPress plugin, inside the /blog/ subdirectory.
That is, in the sitemap_index.xml file I have:
Domain.com/sitemap.xml (the old sitemap after removing the blog post URLs)
Domain.com/blog/sitemap.xml (the auto-updating sitemap created with the Google XML Sitemaps plugin)
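For reference, a sitemap index that lists those two files would look roughly like this, following the sitemaps.org protocol (the domain is a placeholder from the post above):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Static sitemap for the main site (blog post URLs removed) -->
  <sitemap>
    <loc>https://domain.com/sitemap.xml</loc>
  </sitemap>
  <!-- Auto-updating blog sitemap generated by the WordPress plugin -->
  <sitemap>
    <loc>https://domain.com/blog/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```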
Now I have to submit this sitemap index to Google Search Console, but I want to be completely sure about how to do it.
I think all I have to do is delete the old sitemap in Search Console and submit the new sitemap index. Is that right?
Hi Mark,
Yours is a really interesting point. Have a look at this post from Ana Kravitz, which I suggest you read; I believe you will find some of the answers you are looking for.
http://www.akravitz.com/tag-track-social-media-traffic-for-google-analytics/#comment-106132
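In case the link goes stale: the usual technique for tracking social media traffic in Google Analytics is to append UTM parameters to the links you share. A hypothetical example (the domain and values are illustrative, not from the post):

```text
https://example.com/landing-page?utm_source=twitter&utm_medium=social&utm_campaign=spring_launch
```

Analytics then reports that visit under the source, medium, and campaign you set, instead of lumping it into generic referral traffic.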
Cheers from an almost sunny day in southern Spain.
Hi again Brad. The tool is definitely accurate; as I told you, basic, but accurate. I have used it for years and double-checked it. Maybe your manual search is not clean enough? Did you clear your cache, etc.? Sorry if I am asking very basic questions, but you never know...
Give it another try; I am sure of its accuracy.
If you want to send me a couple dozen keywords privately, I would be happy to double-check them so you can be sure you are using the correct setup.
Cheers
Hi Brad
Did you try CuteRank?
The free version allows just one domain, but the paid version may be worth it. Very, VERY basic, but if you are just looking for rankings, it's OK.
Hope it's useful for your purposes.
Cheers
Unfortunately, we usually seem to learn more from huge problems than from nice, successful experiences...
That's the great benefit of sharing: we can all profit from each other's experience!
Cheers
http://imgur.com/yhmjs1P
Hi Michael / Chris
In this case, unfortunately, I have had a harsh experience that makes me differ from Chris's. I would definitely go ahead with disavowing if you really want to make clear to Google that you do not value those links, even if they point to non-existent URLs.
A couple of years back we had a client (and still have them) whose site suffered a huge hack that produced thousands of links pointing to pages of the hacked domain. We even suspected a "black hat" maneuver intended to damage our client's SEO.
We had a hell of a problem getting Google to understand that those links were empty and pointed to 404s, even after we deleted all the content and the URLs no longer existed... strange behavior still occurred from time to time until the disavow tool appeared. Unexpectedly, Google re-indexed those links even though they had pointed to nonexistent 404 URLs for YEARS.
I would take the time to disavow if you really want Google not to count those links.
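For anyone following along: the disavow file is a plain .txt you upload through Google's disavow links tool, with one entry per line. A minimal sketch (the domains are made up for illustration):

```text
# Spammy links created by the hack, pointing at deleted URLs
domain:spammy-site.example
domain:another-bad-site.example
# A single URL can also be disavowed
http://one-off-source.example/page-with-link.html
```

Using `domain:` is usually safer than listing individual URLs when a whole site is linking to you in bulk.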
I hope you can profit from this experience.
Cheers to you both, from sunny southern Spain!