Pages not ranking - Linkbuilding Question
-
It has been about 3 months since we made some new pages with new, unique copy, but a lot of the pages (even though they have been indexed) are not ranking in the SERPs. I tested this by taking a long snippet of the unique copy from the page and searching for it on Google. I also checked the ranking using http://arizonawebdevelopment.com/google-page-rank
which may not be accurate, I know, but it would give some indication. The interesting thing was that for the unique copy snippets, sometimes a different page of our site, often the home page, shows up in the SERPs. So my questions are:
- Is there some issue / penalty / sandbox deal with the pages that are not indexed? How can we check that?
- Or has it just not been enough time?
- Could there be any duplicate copy issue going on? Shouldn't be, as they are all well written, completely unique copy. How can we check that?
- Flickr image details - Some of the pages display the same set of images from Flickr. The details (filenames, alt info, titles) are pulled from Flickr and can be seen in the source code. It's a pretty large block of words, which is the same on multiple pages and uses a lot of keywords. Could this be considered duplication or keyword stuffing, and be causing this? If you think so, we will remove it right away. And then what do we do to improve re-indexing?
The reason I started this was because we have a few good opportunities right now for links, and I was wondering what pages we should link to and try to build rankings for. I was thinking about pointing one to /cast-bronze-plaques, but the page is not ranking. The home page, obviously is the oldest page, and ranked the best. The cast bronze plaques page is very new.
- Would linking to pages that are not ranking well be a good idea?
- Would it help them to get indexed / ranking?
- Or would it be better to link to the pages that are already indexed / ranking?
- If you link to a page that does not seem to be indexed, will it help the domain's link profile? Will the link juice still flow through the site?
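To put a rough number on the Flickr concern above, one quick local check is to measure what fraction of a page's words comes from the repeated Flickr metadata block versus the unique sales copy. This is only a sketch; the sample strings below are hypothetical stand-ins for the real page text:

```python
# Rough sketch: estimate how much of a page's text comes from a repeated
# block (e.g. Flickr image titles/alt text shared across multiple pages).

def boilerplate_share(page_text: str, repeated_block: str) -> float:
    """Return the approximate fraction of words contributed by the block."""
    page_words = page_text.split()
    block_words = repeated_block.split()
    if not page_words:
        return 0.0
    return min(len(block_words), len(page_words)) / len(page_words)

# Hypothetical stand-ins for real page copy and the repeated Flickr block.
unique_copy = "Cast bronze plaques built to order with custom finishes. " * 5
flickr_block = "bronze plaque cast plaque memorial plaque dedication plaque " * 20

share = boilerplate_share(unique_copy + flickr_block, flickr_block)
print(f"{share:.0%} of the words come from the repeated Flickr block")
```

If that share is very high, the repeated block is dominating the page's word count, which is the "drowning out" worry raised later in this thread.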
-
These two pages are similar but definitely not duplicates. I wouldn't worry about that being the issue. The first two answers in this thread have it right, you need to build links internally and externally to these new pages to help them out. You are indexed just fine, just need some link love.
Kate
-
Do I need to re-ask the question, or repost it? Is having SEO MOZ review it like an escalation of the question?
Thanks
-
You are a pro member, so you get two questions per month. Make sure you provide a link to this thread for reference.
-
Thanks Richard - How do I get it to go up to someone at SEOmoz to confirm?
-
Page looks great by the way!
Yes, there is lots of duplicate content here. However, with the other page copy, I would think you would not get penalized.
I must admit, this should go up to someone at SEOmoz to confirm.
-
Hi Richard
Wanted to see if you could check the links, and whether you feel the Flickr code on those pages is a good idea or not?
Thanks
-
Thanks Richard
www.impactsigns.com/cast-bronze-plaques
compared to
-
Yes, please post a link.
I am going to say that as long as there is other content on the page and it is not simply a redundant pulling of Flickr code, you will be fine.
If you are pulling the Flickr code and the page is just a recompilation of images in a different order, then yes, I would say a duplicate content issue will arise.
I think that answered your question?
-
Richard
Thanks for your reply. It all makes sense. I was wondering if you could give me some detail about #4 (the Flickr code showing), as I want to be sure I was clear and that what we are doing is not harming us.
- So even though the images are used on multiple pages, and the code is pulling the alt tags, captions, and titles (this has actually helped us rank for some longer-tail keywords and get a lot of images into image search, which is good for us), and there are a lot of keywords in the code block, it would not penalize us in any way?
- I know there is not really a duplicate content penalty per se, more of a filter, so for each query Google can choose which of our "duplicated" pages is most relevant. Would this be the same here?
- We have very well-written, persuasive, and keyword-balanced on-page copy, but if you look at the source code, the percentage of words taken up by the Flickr image info is large compared to our sales copy. Would this be drowning out the keywords in the sales copy?
- Could I post a couple of URLs?
Thanks
Shabbir
-
Wow, I think you used up all your Q&A points on this post alone : )
-
No
-
Could be, but probably not
-
Doubt it.
-
No
1a) Yes, very much so. Link to it, blog it, tweet it, and post on Facebook and other social sites
2a) Yes
- Yes
- Make sure the page is listed in the XML sitemap and the new sitemap is submitted in Google Webmaster Tools.
- Be sure to link to this page from strong pages on your site, or blog.
- Get outside pages linking to this page.
- Blog it, tweet it, Facebook it, etc.
I hope that helps.
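For reference, a minimal sitemap entry for the new page might look like the fragment below. The URL is the one mentioned in this thread; the lastmod date is purely illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.impactsigns.com/cast-bronze-plaques</loc>
    <lastmod>2011-06-15</lastmod>
  </url>
</urlset>
```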
-
-
I guess it really comes down to what key phrase you are trying to rank for. Adding new pages with unique copy doesn't mean they will rank automatically; apart from on-site factors, you need to look into external factors as well. This includes building links to the new pages or taking advantage of social signals (if this applies to your industry).
To see whether there is any duplicate copy issue, I recommend using http://www.copyscape.com; you can check whether there is any duplicate copy floating around on the net.
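If you'd rather do a quick local comparison before running pages through Copyscape, a crude similarity ratio between two blocks of copy is easy to compute with Python's standard library. This is only a rough sketch with made-up sample copy, not a substitute for a proper duplicate-content tool:

```python
from difflib import SequenceMatcher

def copy_similarity(text_a: str, text_b: str) -> float:
    """Return a 0..1 similarity ratio between two blocks of page copy."""
    return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()

# Hypothetical sample copy from two product pages.
page_a = "Our cast bronze plaques are made to order with custom finishes."
page_b = "Our cast aluminum plaques are made to order with custom finishes."

print(f"Similarity: {copy_similarity(page_a, page_b):.2f}")
```

A ratio near 1.0 means the two blocks are near-duplicates; well-differentiated copy should score considerably lower.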
In regards to the Flickr images, it really depends on the alt tags and how they describe the images. I don't think there is a problem using the same images on different pages with the same alt tags, but if the alt tags are all keywords, that might be a problem.
In regards to link building, my recommendation is to link to the page that would benefit users the most. Apart from getting traffic in, I look into getting visitors to the most useful page to get them to convert, and I believe Google likes this more than just ranking the homepage. So if you find the "not-ranking" page beneficial to users, I would link to it, and that will help get it indexed/ranking. One other thing you need to look into is the quality of the link; make sure it's relevant to your industry, because if they are just random links, Google might not pass value at all.
Hope this helps
-
That is a whole lot of questions so let me do my best to sum it up for you.
Your new pages are not ranking because new pages don't just rank. The quality of your content helps Google know what phrases to rank your pages for. The links to that page determine its relevance and authority, or how high it will rank for those phrases.
Putting up new content just because does not guarantee any rankings. Are there internal links to these pages? Are they in your sitemap? Do they have any external inbound links coming to them?
Make sure you have internal links to these pages as well as external links to them. Make sure the content is more than just original and well written: it has to be optimized. Make sure your title tags are all unique and keyword rich. These types of basic SEO practices should be followed first and foremost. Then, if nothing is ranking like you think it should after 3 months, you can look at other things.
I would imagine that if they have been indexed but aren't ranking that they just need some optimizing and some link juice. That tends to get pages ranked pretty well.
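The "unique title tags" check from the answer above can be sketched in a few lines of Python: given the title text of each page, flag any titles shared by more than one URL. The sample URLs and titles below are made up for illustration:

```python
from collections import Counter

def find_duplicate_titles(titles_by_url: dict) -> dict:
    """Map each duplicated title to the list of URLs that share it."""
    counts = Counter(titles_by_url.values())
    return {
        title: [url for url, t in titles_by_url.items() if t == title]
        for title, count in counts.items()
        if count > 1
    }

# Hypothetical pages and their <title> text.
pages = {
    "/cast-bronze-plaques": "Cast Bronze Plaques | Impact Signs",
    "/cast-aluminum-plaques": "Cast Aluminum Plaques | Impact Signs",
    "/": "Cast Bronze Plaques | Impact Signs",  # duplicates the first page
}

for title, urls in find_duplicate_titles(pages).items():
    print(f"Duplicate title {title!r} on: {', '.join(urls)}")
```

In practice you would feed this from a crawl export rather than a hand-typed dict, but the duplicate-detection logic is the same.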