Input on Experiment with Google
-
As I do more research into Google's devaluing of links, I can't help but wonder whether we will be penalized for previous (bad) links.
Here is the situation: Our company was ranking very well for this particular keyword (within the top 3 positions on Google). However, in the last 6 months, we have seen rankings drop significantly (now to the point Google doesn't even recognize the existence of the page). With Google not recognizing us, we decided to do an experiment.
The experiment: Make another page with a different URL and delete the existing page that is not ranking in Google.
Our Experience: We have noticed that our pages get indexed and ranked within weeks of making a new page.
Our Goal: To get ranked on Google
Will our new page get penalized because of the old page if it's an entirely new URL?
Will the fact that Google is devaluing our links affect the new page that we are trying to get ranked?
Any insight would be of great value.
Thanks in advance
-
I think it's better to try identifying the bad links and disavow them with Google instead of changing pages.
Creating new pages might solve the page problem at first, but what if the entire domain gets kicked out down the road?
Also, you might have good links pointing to your current pages; creating new replacement pages will make you lose the PageRank you might get from valid sources.
I would identify and fix the issues once and for all before they spread.
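For reference, if you go the disavow route, Google's tool accepts a plain-text file with one entry per line: a `domain:` line disavows every link from that referring domain, a bare URL disavows a single link, and `#` starts a comment. A minimal sketch (all domains and URLs below are placeholders, not real bad links):

```text
# Hypothetical disavow file (e.g. disavow.txt) to upload in
# Google's Disavow Links tool. Placeholder entries only.

# Disavow every link from an entire spammy domain:
domain:spamdomain1.example
domain:spamdomain2.example

# Disavow individual bad links:
https://example.net/bad-directory/page.html
```

One file covers the whole property, so it's worth keeping a single maintained copy rather than uploading piecemeal lists.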
Related Questions
-
Google Search Console Not Indexing Pages
Hi there! I have a problem that I was hoping someone could help me with. In Google Search Console, my website does not seem to be indexed well. In fact, even after rectifying the problems that Moz's on-demand crawl has pointed out, it still does not become "valid". There are some excluded pages that Google has pointed out; I have rectified some of the issues, but it doesn't seem to be helping. However, when I submitted the sitemap, it says the URLs were discoverable, so I am not sure why they can be discovered but are not deemed "valid". I would sincerely appreciate any suggestions or insights as to how I can go about solving this issue. Thanks!
Algorithm Updates | Chowsey
-
Tens of duplicate homepages indexed and blocked later: How to remove from Google cache?
Hi community, Due to some WP plugin issue, many duplicates of our homepage were indexed in Google under anonymous URLs. We blocked them later, but they are still in the SERPs. I wonder whether these are causing trouble for our website, especially since exact copies of our homepage were indexed. How do we remove these pages from the Google cache? Is that the right approach? Thanks
Algorithm Updates | vtmoz
-
Does Google ignore the page title suffix?
Hi all, It's common practice to append the "brand name" or "brand name & primary keyword" as a suffix on EVERY page title. In that case we are repeating the "primary keyword" across all pages while expecting the homepage to rank best for it. Does Google still rank the pages accordingly? How does Google handle this? Will the default suffix with the primary keyword across all pages be ignored or devalued by Google when ranking certain pages? Or does the website's ranking for the "primary keyword" improve just because it has been added to all page titles?
Algorithm Updates | vtmoz
-
Google Trends Graph and KW Planner Monthly Searches?
I'm trying to show people the trends of certain keywords/topics over a period of years. Keyword Planner gives some actual numbers, but only for 12 months. Trends shows: "Numbers represent search interest relative to the highest point on the chart. If at most 10% of searches for the given region and time frame were for "pizza," we'd consider this 100. This doesn't convey absolute search volume." I don't really understand this, other than that if the graph goes up it means more interest, but it also depends on the number of people searching, the location, etc., which can get tricky. I'd like to put together a short report explaining certain topics and how interest in them has increased over the last 5+ years. I'm hoping someone here has experience with this and has some advice or links with more information?
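For what it's worth, the scaling Trends describes is just peak normalization: every point in the chart is divided by the highest point in the selected window and multiplied by 100, so the values only compare points within that one chart, never absolute volumes across charts. A minimal sketch with made-up monthly volumes:

```python
# Hypothetical monthly search volumes for one keyword (Google Trends
# never exposes absolute numbers; these are invented for illustration).
raw = [120, 300, 450, 600, 300]

# Trends-style normalization: scale so the peak month equals 100.
peak = max(raw)
normalized = [round(v / peak * 100) for v in raw]

print(normalized)  # [20, 50, 75, 100, 50]
```

This is also why the same keyword can show a "falling" Trends line while its absolute volume grows: only its share relative to the window's peak is plotted.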
Algorithm Updates | JoshBowers2012
-
Page details in Google Search
I noticed this morning a drop in the SERPs for a couple of my main keywords. And even though this is a little annoying, the more pressing matter is that Google is not displaying the meta title I have specified for the majority of my site's pages, despite one being specified and knowing my site has them in place. Could this sudden change to not using my specified titles be the cause of the drop, and why would Google replace them in the first place when they are there to be used? The title currently being displayed in the SERPs is not anything that has been specified in the past or picked up from the latest crawl. Any insight would be appreciated. Tim
Algorithm Updates | TimHolmes
-
Google Panda Update - google.com.br (Brazil)
Hello folks, Does anyone know if Google ran their Panda update in Brazil (www.google.com.br) this week? I ask because I can see an interesting boost in my Google traffic sources. Thank you.
Algorithm Updates | augustos
-
Removing secure subdomain from Google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as:
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
However, Google is crawling these secure pages and duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content): https://secure.domain.com/etc. Our webmaster recently implemented a specific robots.txt file for the secure subdomain to disallow everything:
User-agent: *
Disallow: /
Even so, these duplicated secure pages remain in the index. My question is: should I request that Google remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL Removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages in the index, and all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you.
Algorithm Updates | marketing_zoovy.com
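A side note on the robots.txt question above: a robots.txt block only stops crawling, it does not by itself remove URLs that are already indexed. Still, you can sanity-check that a "disallow all" file really blocks every path before relying on it, using Python's standard urllib.robotparser (secure.domain.com is the placeholder host from the question):

```python
from urllib.robotparser import RobotFileParser

# The "disallow all" robots.txt described for the secure subdomain,
# fed to the parser as a list of lines instead of fetched over HTTP.
rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# With "Disallow: /" every path on the host is blocked for all crawlers.
print(parser.can_fetch("*", "https://secure.domain.com/login.cgis"))  # False
print(parser.can_fetch("*", "https://secure.domain.com/"))            # False
```

If the goal is actually removing the pages from the index, serving a noindex signal (or a removal request) is the usual complement, since a crawl block alone can leave previously indexed URLs in place.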