Pages not ranking - Linkbuilding Question
-
It has been about 3 months since we made some new pages with new, unique copy, but a lot of the pages (even though they have been indexed) are not ranking in the SERPs. I tested this by taking a long snippet of the unique copy from the page and searching for it on Google. I also checked the ranking using http://arizonawebdevelopment.com/google-page-rank
Which may not be accurate, I know, but would give some indication. The interesting thing was that for the unique copy snippets, sometimes a different page of our site, many times the home page, shows up in the SERPs. So my questions are:
- Is there some issue / penalty / sandbox deal with the pages that are not ranking? How can we check that?
- Or has it just not been enough time?
- Could there be any duplicate copy issue going on? Shouldn't be, as they are all well written, completely unique copy. How can we check that?
- Flickr image details - Some of the pages display the same set of images from Flickr. The details (filenames, alt info, titles) are getting pulled from Flickr and can be seen in the source code. It's a pretty large block of words, which is the same on multiple pages and uses a lot of keywords. Could this be considered duplication or keyword stuffing, and could it be causing this? If you think so, we will remove it right away. And what do we do then to get the pages re-indexed?
The reason I started this was because we have a few good opportunities right now for links, and I was wondering which pages we should link to and try to build rankings for. I was thinking about pointing one to /cast-bronze-plaques, but the page is not ranking. The home page, obviously, is the oldest page and ranks the best. The cast bronze plaques page is very new.
- Would linking to pages that are not ranking well be a good idea?
- Would it help them to get indexed / ranking?
- Or would it be better to link to the pages that are already indexed / ranking?
- If you link to a page that does not seem to be indexed, will it help the domain's link profile? Will the link juice still flow through the site?
-
These two pages are similar but definitely not duplicates. I wouldn't worry about that being the issue. The first two answers in this thread have it right: you need to build links, internally and externally, to these new pages to help them out. You are indexed just fine; you just need some link love.
Kate
-
Do I need to re-ask the question, or repost it? Is having SEOmoz review it like an escalation of the question?
Thanks
-
You are a pro member, so you get two questions per month. Make sure you provide a link to this thread for reference.
-
Thanks Richard - How do I get it to go up to someone at SEOmoz to confirm?
-
Page looks great by the way!
Yes, there is lots of duplicate content here. However, with the other page copy, I would think you would not get penalized.
I must admit, this should go up to someone at SEOmoz to confirm.
-
Hi Richard
I wanted to see if you could look at the links, and whether you feel the Flickr code on those pages is a good idea or not.
Thanks
-
Thanks Richard
www.impactsigns.com/cast-bronze-plaques
compared to
-
Yes, please post a link.
I am going with this: as long as there is other content on the page, and not simply a redundant pulling of Flickr code, you will be fine.
If you are pulling the Flickr code and on that page is just a recompilation of images in a different order, then yes, I would say a duplicate content issue will arise.
I think that answers your question?
-
Richard
Thanks for your reply. It all makes sense. I was wondering if you could give me some detail about #4 (the Flickr code showing), as I want to be sure I was clear, and that what we are doing is not harming us.
- So even though the images are used on multiple pages, and the code is pulling the alt tags, captions, and titles (this has actually helped us rank for some longer-tail keywords and get a lot of images to show up in image search, which is good for us), and there are a lot of keywords in the code block, it would not penalize us in any way?
- I know there is not really a duplicate content penalty per se, more of a filter, so for each query Google can choose which of our "duplicated" pages is most relevant. Would this be the same here?
- We have very well written, persuasive, keyword-balanced on-page copy, but if you look at the source code, the percentage of words taken up by the Flickr image info is very large compared to our sales copy. Would this be drowning out the keywords in the sales copy?
- Could I post a couple of URLs?
Thanks
Shabbir
-
Wow, I think you used up all your Q&A points on this post alone : )
-
No
-
Could be, but probably not
-
Doubt it.
-
No
1a) Yes, very much so. Link to it, blog it, tweet it, and post on Facebook and other social sites
2a) Yes
- Yes
- Make sure the page is listed in the XML sitemap and the new sitemap is submitted to Google Webmaster Tools.
- Be sure to link to this page from strong pages on your site, or blog.
- Get outside pages linking to this page.
- Blog it, tweet it, Facebook it, etc.
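For reference, a minimal sketch of what the XML sitemap entry for that new page might look like (the URL is the page discussed in this thread; the date and frequency values are just placeholder assumptions):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; add the new page alongside existing entries -->
  <url>
    <loc>http://www.impactsigns.com/cast-bronze-plaques</loc>
    <lastmod>2011-06-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Once the entry is in place, resubmit the sitemap in Google Webmaster Tools so Google picks up the new page.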
I hope that helps.
-
-
I guess it really comes down to what key phrase you are trying to rank for. Adding new pages with unique copy doesn't mean they will rank automatically; apart from on-site factors, you need to look into external factors as well, including building links to the new pages and taking advantage of social signals (if that applies to your industry).
To see whether there is any duplicate copy issue, I recommend using http://www.copyscape.com; it lets you check whether any duplicate copies are floating around on the net.
In regards to the Flickr images, it really depends on the alt tags and how they describe the images. I don't think there is a problem using the same images on different pages with the same alt tags, but if the alt tags are all keywords, that might be a problem.
In regards to link building, my recommendation is to link to the page that would benefit users the most. Apart from bringing traffic in, I look into getting visitors to the most useful page so they convert, and I believe Google likes this more than just ranking the homepage. So if you find the "not-ranking" page beneficial to users, I would link to it, and that will help get it indexed and ranking. One other thing to look into is the quality of the link: make sure it's relevant to your industry, because if they are just random links, Google might not pass any value at all.
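If you want a rough in-house check alongside Copyscape, one sketch (the page texts and the idea of what counts as "too similar" are just illustrative assumptions) is to compare overlapping word "shingles" between two pages:

```python
def shingles(text, n=5):
    """Break text into overlapping n-word chunks (shingles)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(text_a, text_b, n=5):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    a, b = shingles(text_a, n), shingles(text_b, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical page copy, for illustration only
page_a = "We cast bronze plaques with custom lettering and borders for awards"
page_b = "We cast bronze plaques with custom lettering and borders for memorials"
score = overlap(page_a, page_b)
print(round(score, 2))  # prints 0.75; a high score suggests near-duplicate copy
```

A score near 1.0 means the two pages share most of their phrasing, which is the kind of overlap Google's duplicate filter can react to.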
Hope this helps
-
That is a whole lot of questions so let me do my best to sum it up for you.
Your new pages are not ranking because new pages don't just rank. The quality of your content helps Google know what phrases to rank your pages for. The links to that page determine its relevance and authority, or how high it will rank for those phrases.
Putting up new content just because does not guarantee any rankings. Are there internal links to these pages? Are they in your sitemap? Do they have any external inbound links coming to them?
Make sure you have internal links to these pages as well as external links to them. Make sure the content is more than just original and well written: it has to be optimized. Make sure your title tags are all unique and keyword-rich. These basic SEO practices should be followed first and foremost. Then, if nothing is ranking the way you think it should after 3 months, you can look at other things.
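To act on the "unique title tags" point above, here is a quick sketch that flags duplicate titles across a crawl export (the URLs and titles below are made up for illustration):

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by title tag and return titles used on more than one page."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical (url, title) pairs, e.g. exported from a site crawl
pages = [
    ("/cast-bronze-plaques", "Cast Bronze Plaques | Impact Signs"),
    ("/bronze-plaques", "Cast Bronze Plaques | Impact Signs"),  # duplicate title
    ("/aluminum-plaques", "Aluminum Plaques | Impact Signs"),
]
for title, urls in find_duplicate_titles(pages).items():
    print(title, "->", urls)
```

Any title that comes back with more than one URL needs a rewrite before you start chasing other ranking factors.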
I would imagine that if they have been indexed but aren't ranking that they just need some optimizing and some link juice. That tends to get pages ranked pretty well.