Struggling to get my lyrics website fully indexed
-
Hey guys, I've been a longtime SEOmoz user but I'm only just getting heavily into SEO now, and this is my first query. Apologies if it's simple to answer, but I have been doing my research!
My website is http://www.lyricstatus.com - basically it's a lyrics website.
Rightly or wrongly, I'm using Google Custom Search Engine on my website for search, as well as jQuery auto-suggest - please ignore the latter for now.
My problem is that when I launched the site I had a complex AJAX Browse page, so Google couldn't see static links to all my pages and only indexed the pages that did have static links. That left my site search via the Google CSE pretty useless, since very few pages were indexed.
I've since dropped the complex AJAX links and replaced it with easy static links. However, this was a few weeks ago now and still Google won't fully index my site. Try doing a search for "Justin Timberlake" (don't use the auto-suggest, just click the "Search" button) and it's clear that the site still hasn't been fully indexed!
I'm really not too sure what else to do, other than wait and hope, which doesn't seem like a very proactive thing to do! My only other suspicion is that Google sees my site as just more duplicate content, but surely it must be OK with indexing multiple lyrics sites, since there are plenty of different ones ranking in Google.
Any help or advice greatly appreciated guys!
-
You need more unique content. Your site is great, I like it much better than the other lyric sites, but I can't see any content at all that you have written yourself.
-
I agree with Stephen. Tons of lyrics websites out there.
If you want to make your site more visible, write a couple hundred to a few hundred words about each song and post it on the page, above or beside the lyrics. Then you will have something unique.
Try that on a couple dozen pages to see what happens. Give it a few months.
-
You have exactly the same content as a million other lyrics websites, so why should Google be interested in your PR0, PA18, DA2 website?
I think you're doing pretty well, with 15,000 pages indexed via site:http://lyricstatus.com
I think what you need is a USP, not technical SEO responses.
-
Do you have any organization to your site? I can see where some visitors would desire to find lyrics by year, singer, music style (jazz, rock, etc), music type (love songs, happy songs, etc) and so forth.
Even if users find songs by searching, crawlers move through your site via links. Unless your site is extremely well linked and has a great navigation system, you are only going to see a relatively small percentage of your site indexed.
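As a rough sketch of what that looks like in practice, a crawler needs a chain of plain HTML links running from the homepage to an artist index, to an artist page, to each song page, so it can reach every song without executing any JavaScript. The paths below are made-up examples rather than the site's actual URLs:

    <!-- e.g. an artist index page at /browse/artists/j; one page per letter keeps the link count manageable -->
    <ul>
      <li><a href="/artists/jay-z">Jay-Z</a></li>
      <li><a href="/artists/justin-timberlake">Justin Timberlake</a></li>
      <!-- ...every artist filed under "J"... -->
    </ul>
    <!-- each artist page then links to every one of that artist's song pages -->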
-
Wow, that was a quick response, thanks so much Ryan!
With regards to Google WMT, yep, I did that as soon as I went live, and I did try to make a sitemap using xml-sitemaps.org's tool, but because I have 700,000+ songs the XML sitemap generator kept stalling due to lack of RAM. I did upload a partial sitemap, but to date the "URLs in web index" count is stuck at 363... out of 700,000+!!
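One workaround for the generator stalling: the sitemap protocol caps each file at 50,000 URLs anyway, so a 700,000-URL site needs several sitemap files tied together by a sitemap index, and those files can be streamed to disk one URL at a time rather than built in memory. A minimal Python sketch; iter_song_urls is a hypothetical helper that would really pull every song URL from the site's database:

    # Minimal sketch: write sitemap files of up to 50,000 URLs each,
    # then a sitemap index that points at them. Not production code.

    CHUNK_SIZE = 50000  # per-file limit in the sitemap protocol
    BASE = "http://www.lyricstatus.com"
    XMLNS = "http://www.sitemaps.org/schemas/sitemap/0.9"

    def iter_song_urls():
        # Hypothetical: replace with a real query over the songs table.
        yield BASE + "/lyrics/justin-timberlake/cry-me-a-river"

    def write_chunk(index, urls):
        name = "sitemap-%d.xml" % index
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="%s">\n' % XMLNS)
            for url in urls:
                f.write("  <url><loc>%s</loc></url>\n" % url)
            f.write("</urlset>\n")
        return name

    chunk, files = [], []
    for url in iter_song_urls():
        chunk.append(url)
        if len(chunk) == CHUNK_SIZE:
            files.append(write_chunk(len(files) + 1, chunk))
            chunk = []
    if chunk:
        files.append(write_chunk(len(files) + 1, chunk))

    # The index file is the single sitemap to submit in Webmaster Tools.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="%s">\n' % XMLNS)
        for name in files:
            f.write("  <sitemap><loc>%s/%s</loc></sitemap>\n" % (BASE, name))
        f.write("</sitemapindex>\n")

With that in place, only sitemap-index.xml needs to be submitted in Webmaster Tools, and the submitted/indexed counts will cover all of the chunk files.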
You're right, I don't have a nav as I believe users will just use the search, but there is a "Browse" link in the footer which appears on every page, and this is effectively my Site Map: http://www.lyricstatus.com/browse
So as far as I'm concerned there is a static link path to every page in my website, correct me if I'm wrong?
Good point in your last para about a unique couple hundred words on each page - tall order for 700k pages, but could definitely do that for key songs that I want to get ranked for. Thanks again Ryan!
-
Hi Ed.
A few things you can do to help get your pages indexed:
1. If you have not done so already, register with Google and go to the Google Webmaster Tools page http://www.google.com/webmasters
2. If you have not already done so, create an XML sitemap. Conventionally it lives at the site root, e.g. http://www.lyricstatus.com/sitemap.xml
3. If you want to locate the sitemap anywhere else, you will need to create a robots.txt file and place the sitemap URL in it (there is an example robots.txt just after this list). I noticed you don't have a robots.txt file. You can learn more about them at robotstxt.org.
4. In Google WMT, go ahead and upload your sitemap (Site Configuration > Sitemap). Then check back a day later. What you want to look at is two fields: URLs submitted and URLs in index. Your goal would be to have all your URLs in the index, but that isn't realistic without a lot of work.
5. Another thing you can do is create an HTML sitemap and place a link to it in the footer of your home page. You don't offer site navigation, so an HTML sitemap can help visitors navigate your site.
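For step 3, the robots.txt at the site root only needs a couple of lines; the sitemap filename here is just an example, not a file that already exists on the site:

    # Allow all crawling and point crawlers at the sitemap (example filename)
    User-agent: *
    Disallow:

    Sitemap: http://www.lyricstatus.com/sitemap-index.xml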
Take these steps for now and you will have a much better idea of where your site stands. You can then match up the URLs in your sitemap with the URLs in Google's index. The URLs without a match are the pages you still need to get into the index.
You can try link building or even placing links to these buried pages on your home page to help get them indexed.
One last note concerning duplicate content: you really should consider adding original content to your pages so they aren't treated as duplicates. Keep in mind that each page is viewed as a whole, so for a song page you probably need to write at least a couple hundred words to differentiate it from all the other similar pages on the web.
Good luck.