Is this good or not?
-
Hello,
When I type site:majordroid.com into Google search, the result is what is shown in the photo, please see.
My question is: is this good or bad? And if it is bad, how can I remove it from Google search?
The CMS is WordPress and I use Yoast SEO.
my site is: http://www.majordroid.com/
Thank you
-
Hi Ivan, do you have an update on this?
-
Thank you
I will try this and let you know what happens next.
-
After some research, I found the following code, which can be added to the theme's functions.php:

function rel_next_prev() {
	global $paged;
	if ( get_previous_posts_link() ) { ?>
		<link rel="prev" href="<?php echo get_pagenum_link( $paged - 1 ); ?>" />
	<?php }
	if ( get_next_posts_link() ) { ?>
		<link rel="next" href="<?php echo get_pagenum_link( $paged + 1 ); ?>" />
	<?php }
}
add_action( 'wp_head', 'rel_next_prev' );

Try it out, and let me know when it is live. We can then check if it works well or not.
Gr. Keszi
-
OK, and what code do I need to add?
thank you
-
Ivan, correct: the homepage is a "static page", which has a custom template in your theme directory.
That template has a loop which shows your latest articles on the homepage. That is the part of the file you need to edit.
Gr., Keszi
-
OK, and what code does this template need, if you can tell me?
And I can confirm that it is a static page, if this helps.
thank you
-
So the question really is what kind of page template the homepage uses to pull in the data. We would need to modify that template in order to add the pagination tags.
P.S. Sorry for the late reply.
-
OK, can you tell me what code I need to have?
thank you
-
I am not 100% sure. For that I would need to have a look at the PHP code.
What is certain is that it is not implemented right now.
-
So for the homepage I need to manually edit the code? The Yoast plugin can't do that?
thank you
-
Usually WordPress SEO by Yoast implements this automatically for category/tag/date/etc. archive pages.
In the case of the homepage, I personally do not know where to implement it - I'd need to see the code for it.
-
Thank you all
I was thinking about page 1, page 2, page 3, etc.
So, what do I need to do to remove this?
Do I need to edit my WP theme?
-
I believe what you are missing is the pagination markup link rel="prev" / rel="next". This is why all the paginated content is indexed, even though you have a canonical link set up.
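For reference, on a paginated page such as page 2 the tags would sit in the page's head and look roughly like this (the URLs are illustrative, assuming WordPress's default /page/N/ permalink pattern):

```html
<link rel="prev" href="http://www.majordroid.com/" />
<link rel="next" href="http://www.majordroid.com/page/3/" />
```

On the first page only rel="next" appears, and on the last page only rel="prev".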
Gr., Keszi
-
What do you mean by 'is this bad' - in what respect?
It's not great that your meta description is the same on each page. It should describe the page, not be generic to the whole site.