Index Problem
-
Hi guys
I have a critical problem with the Google crawler.
This is my website: https://1stquest.com
I can't create a sitemap with online sitemap creator tools such as XML-sitemaps.org.
The Fetch as Google tool usually marks pages as "Partial".
The MOZ crawler test found both HTTP and HTTPS versions of the site!
And Google can't index several pages on the site.
Is the problem related to "unsafe URLs", or something else?
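For reference, when both HTTP and HTTPS versions of a site get crawled, a site-wide 301 redirect to the HTTPS version is the usual fix for that duplication. A minimal sketch, assuming the site runs on Apache with mod_rewrite (other servers need their own equivalent):

    RewriteEngine On
    # Permanently redirect every HTTP request to its HTTPS equivalent
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]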
-
Hi Peter,
Just curious: did you have to do anything specific on the SEO side of things? I am working with a developer who assures me that Angular will not impact SEO, but looking at their past efforts I am not so sure.
-
Hello all,
Thought I'd drop in my $0.02 about Google and AngularJS. We switched over to it two weeks ago at FixedPriceCarService.com.au. Existing pages were fine, with no transition issues, and the same goes for all the new pages we added. All were indexed quite quickly on Google.
Google Search Console has no issue finding things like the page title and meta description, which now live in the DOM.
I'm curious when MOZ might also be able to crawl the DOM and stop reporting page titles as missing when they aren't truly missing, just relocated.
Scott - Fixed Price Car Service
-
Hi Hamid!
Did Peter or Martijn answer your question? If so, please mark one or both as a Good Answer.
If not, what are you still looking for?
-
Peter is indeed correct; it doesn't seem you have to worry about unsafe URLs, but rather about the technical way your site is built. AngularJS is relatively new, and as it's JavaScript-based, it's rather hard for Google to retrieve all the content on these pages in order to 'get' the whole page. His suggestion is also spot on and will make it easier for search engines to crawl, read, and index your site.
-
If you can't create an XML sitemap with online tools, you can build a plugin for your site with your devs to generate one. Google can't render your website because of the technology it's built with: I've seen that your site uses AngularJS, so the initial HTML is tiny.
However, there is a solution for you described here:
https://www.distilled.net/resources/prerender-and-you-a-case-study-in-ajax-crawlability/
If you provide fully rendered HTML to search engine bots, your site can finally be indexed properly.
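As a rough sketch of that prerendering approach, assuming an Apache server with mod_rewrite and mod_proxy and a hosted prerendering service (the service URL and bot list are illustrative, and most services also require an authentication token):

    RewriteEngine On
    # Detect known search engine bots by user agent
    RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|yandex|baiduspider) [NC,OR]
    # Also catch requests using the old AJAX crawling scheme
    RewriteCond %{QUERY_STRING} _escaped_fragment_
    # Proxy bot requests to the prerendering service, which returns fully rendered HTML
    RewriteRule ^(.*)$ https://service.prerender.io/https://1stquest.com/$1 [P,L]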
Related Questions
-
Getting Authority Social Blogs to Index
We have a few authority blogs that I manage to help increase our brand awareness and build power to our website. We have Blogspot, WordPress, Tumblr & Typepad. Our content gets a summary syndicated to our authority blogs with an attribution link back to the original post. I also manually check them once a month to make sure everything looks good and the content syndicated correctly. I even add unique content to these blogs once in a while. I recently realized that the majority of the pages are not indexing. I added the blogs to our GSC & Bing Webmaster Tools and submitted the sitemaps. This was done on December 11th; as of now, some pages have indexed in Google, and Bing says the sitemaps are still pending...
Blogspot: 32/100 pages indexed
WordPress: 34/81 pages indexed
Tumblr: 4/223 pages indexed
Typepad: 3/63 pages indexed
Can anyone help me figure out why I can't get Google to index more pages or Bing to process the sitemaps in a timely manner?
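One thing worth checking on syndicated copies is a cross-domain canonical tag pointing back to the original post; without it, search engines often skip near-duplicate syndicated pages. A sketch, with an illustrative URL:

    <!-- In the <head> of each syndicated copy, pointing at the original article -->
    <link rel="canonical" href="https://www.example.com/original-post/" />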
Intermediate & Advanced SEO | LindsayE
-
Moving an HTML site to WordPress and 301 redirecting from index.htm to index.php or just www.example.com
I found duplicate page content when using the Moz crawl tool, see below.
http://www.example.com
Page Authority: 40
Linking Root Domains: 31
External Link Count: 138
Internal Link Count: 18
Status Code: 200
1 duplicate: http://www.example.com/index.htm
Page Authority: 19
Linking Root Domains: 1
External Link Count: 0
Internal Link Count: 15
Status Code: 200
1 duplicate: http://www.example.com
I have recently transferred my old HTML site to WordPress. To keep the URLs the same, I am using a plugin which appends .htm to the end of each page. My old site's home page was index.htm. I have created index.htm in WordPress as well, but now there is a duplicate content conflict. I am using the latest posts as my home page, which is index.php.
Question 1: Should I also use a 301 redirect in the .htaccess file to transfer index.htm's page authority (19) to www.example.com? If yes, do I use
Redirect 301 /index.htm http://www.example.com/index.php
or
Redirect 301 /index.htm http://www.example.com
Question 2: Should I change my "Home" menu link to http://www.example.com instead of http://www.example.com/index.htm? That would fix the duplicate content, as index.htm does not exist anymore. Is there a better option? Thanks
Intermediate & Advanced SEO | gozmoz
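For what it's worth, the usual pattern here is to canonicalize the home page to the bare domain rather than to index.php. A minimal sketch for .htaccess, assuming an Apache server:

    # Point the old static home page at the canonical root URL
    Redirect 301 /index.htm http://www.example.com/

-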
Website does not index on any page?
I created the website www.astrologersktantrik.com 4 days ago and fetched it with Google, but my website still does not get indexed. The keywords I target have low competition, but my website still does not appear for any of them.
Intermediate & Advanced SEO | ramansaab
-
How to do country-specific indexing?
We are a business that operates in South East Asian countries and has medical professionals listed in Thailand, the Philippines, and Indonesia. When I go to Google Philippines and check, I can see pages indexed from all countries and no Philippines pages. The Philippines is where we launched recently. How can I tell Google Philippines to give more priority to pages from the Philippines and not from other countries? Can someone help?
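One common approach here is hreflang annotations, which tell Google which country each page variant targets. A minimal sketch, with illustrative URLs:

    <!-- On each listing page, declare every country variant of that page -->
    <link rel="alternate" hreflang="en-ph" href="https://www.example.com/ph/doctors/" />
    <link rel="alternate" hreflang="en-th" href="https://www.example.com/th/doctors/" />
    <link rel="alternate" hreflang="en-id" href="https://www.example.com/id/doctors/" />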
Intermediate & Advanced SEO | ozil
-
Google Indexed Old Backups. Help!
I have a bad habit of renaming an HTML page sitting on my server before uploading a new version. I usually do this after a major change, so after the upload my server holds "product.html" as well as "product050714.html". I just stumbled on the fact that Google has been indexing these backups. Can I just delete them and produce a 404?
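Deleting them so they 404 works; serving a 410 Gone tends to get them dropped from the index a little faster. A sketch for Apache, assuming the backups always carry a six-digit date suffix (the pattern is illustrative):

    RewriteEngine On
    # Return 410 Gone for date-suffixed backup copies such as product050714.html
    RewriteRule \d{6}\.html?$ - [G]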
Intermediate & Advanced SEO | alrockn
-
Indexing a new website with several million pages
Hello everyone, I am currently working for a huge classifieds website that will be launched in France in September 2013. The website will have up to 10 million pages. I know the indexing of a website of such size should be done step by step, and not all in one go, to avoid a long sandbox risk and to keep more control over it. Do you guys have any recommendations or good practices for such a task? Maybe some personal experience you might have had? The website will cover about 300 jobs:
in all regions (= 300 * 22 pages)
in all departments (= 300 * 101 pages)
in all cities (= 300 * 37,000 pages)
Do you think it would be wiser to index a couple of jobs at a time (for instance 10 jobs every week), or to index level by level (for example, first the jobs by region, then the jobs by department, etc.)? More generally speaking, how would you proceed in order to avoid penalties from Google and to index the whole site as fast as possible? One more detail: we'll rely on a (big?) press follow-up and on a link-building effort that still has to be determined. Thanks for your help! Best regards, Raphael
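One way to stage a rollout like this is a sitemap index that grows as each batch of pages goes live, so search engines discover the levels in the order you choose. A minimal sketch, with illustrative file names:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Batch 1: region-level job pages, submitted at launch -->
      <sitemap>
        <loc>https://www.example.com/sitemaps/jobs-regions.xml</loc>
      </sitemap>
      <!-- Batch 2: department-level pages, added once batch 1 is largely indexed -->
      <sitemap>
        <loc>https://www.example.com/sitemaps/jobs-departments.xml</loc>
      </sitemap>
    </sitemapindex>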
Intermediate & Advanced SEO | Pureshore
-
Technical Automated Content - Indexing & Value
One of my clients provides financial analysis tools which generate automated content on a daily basis for a set of financial derivatives. Basically, they try to estimate through technical means whether a particular share price is going up or down during the day, as well as its support and resistance levels. These tools are fairly popular with visitors; however, I'm not sure about the 'quality' of the content from a Google perspective. They keep an archive of these tools which tallies up to nearly 100 thousand pages. What bothers me particularly is that the content of each of these varies only slightly: textually there are maybe 10-20 different phrases which describe the move for the day, and the page structure is otherwise similar, except for the values thought to be reached on a given day. They believe it could be useful for users to access back-dated information to see what happened in the past. The main issue, however, is that there are currently no backlinks at all to any of these pages, and I assume Google could deem them 'shallow' pages that provide little content and become irrelevant as time passes. I'm also not sure if this could cause a duplicate content issue; they already add a date in the title tags and in the content to differentiate. I am not sure how I should handle these pages. Is it possible to have Google prioritize the 'daily' published one? Say I published one today: if I searched "Derivative Analysis" I would want to see the one dated today rather than the 'list view' or any other older analysis.
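One common pattern for near-duplicate dated archives is to leave the current day's page indexable and mark the back-dated copies with a robots noindex tag, keeping them available to users without competing in search. A sketch, assuming the archive template can be edited separately from the daily page:

    <!-- Added to archived analysis pages only; the current day's page omits it -->
    <meta name="robots" content="noindex, follow">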
Intermediate & Advanced SEO | jonmifsud
-
How long does a Google penalty last if you have fixed the problem?
Hi, I stupidly thought that it would be a good idea to set up a reciprocal links page on my website named 'links'. I did this because my competitors were linking to these pages, so I thought it would be a good idea, and I genuinely didn't know that you could be punished for this. Within about 3 weeks my rank dropped about 3 pages. I have since removed the links, and the page was cached last Friday, but the site still appears to have a penalty. I assumed that when Google cached the page and saw the links were not there anymore, the penalty would be lifted. Anyone got any ideas? P.S. The competitor websites had broken their links pages into various categories relating to the website, i.e. related directories etc., so this might be why they weren't penalized.
Intermediate & Advanced SEO | BelfastSEO