Hi Ali,
Yeah, I would agree that it is a spammy site. When I tried visiting it, I went through a lot of redirects and ended up on some random page. I would disavow the domain and still implement 301 redirects.
Hi,
A spam score of 20%-40% doesn't necessarily mean that the backlink is bad or spammy - it's only an indication. You can read more about Moz's spam score calculation here.
I would strongly advise implementing 301 redirects, because the old URL may have valuable backlinks that you can't yet see. As good as Moz's crawler is, it won't pick up everything Google does.
If you're worried about a certain link being spammy, I would visit it and see what kind of website/page it's sitting on. If it does turn out to be pure spam, you can disavow that link via Google Search Console. Otherwise, I'd keep the link.
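Once the redirects are in place, you can spot-check them yourself. Here's a minimal sketch using Python's standard library - the old/new URL pairs are placeholders for your own pages, not anything Moz provides:

```
# Minimal sketch: confirm that old URLs return a 301 pointing at the new location.
# The URLs below are placeholders - swap in your own old/new pairs.
import urllib.error
import urllib.request

OLD_TO_NEW = {
    "https://old-domain.example/page-a": "https://new-domain.example/page-a",
    "https://old-domain.example/page-b": "https://new-domain.example/page-b",
}

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Don't follow redirects, so we can inspect the first response.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

for old_url, expected in OLD_TO_NEW.items():
    try:
        resp = opener.open(old_url)
        status, location = resp.status, "(no redirect)"
    except urllib.error.HTTPError as err:
        # With redirects suppressed, a 301/302 surfaces as an HTTPError.
        status, location = err.code, err.headers.get("Location", "")
    print(f"{old_url} -> {status} {location} (expected: {expected})")
```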
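If you do end up disavowing, the file Google Search Console expects is plain text: one full URL or one "domain:" entry per line, with "#" for comments. A minimal sketch that assembles such a file - the domains and URLs here are placeholders for the links you've judged to be spam:

```
# Minimal sketch: build a disavow.txt in the format Google Search Console expects
# (one "domain:" entry or full URL per line, "#" lines are comments).
# The domains/URLs below are placeholders.
spammy_domains = ["spam-site-1.example", "spam-site-2.example"]
spammy_urls = ["https://some-site.example/spammy-page"]

lines = ["# Disavow file based on backlink profile review"]
lines += [f"domain:{d}" for d in spammy_domains]
lines += spammy_urls

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```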
Hope that helps!
Hi Daniele,
I think your homepage should have some text that describes what the website is about and what services/products you offer. A great way to get ideas is to look at the top-ranking websites for the keywords you want to rank for. Take a look at their homepages - what kind of content they've put up and how they've structured it. That should give you a few solid ideas to build on.
Hope that helps!
I'd recommend you install it and try it out. We've been using it for a few sites and it's great.
Doing a "site:" search for your website shows that there are over 90k pages sitting in the index. I'm not sure how accurate that is, but at least the site is properly indexed.
I'm not sure what you mean by "how can i change my content" - this is a WordPress site, so I'm assuming adding/changing content isn't a difficult task since you have what appears to be a well-built site.
There are a few very good ones. I can share the ones I know and have used:
RankMath is relatively new and offers a lot of features for free that are found in the premium version of the others.
There could be many reasons for this happening.
Did the website go through a recent update? Have you changed the site's content? Did something change on the robots.txt file? I'd also check if the pages suddenly changed to "noindex" or "nofollow".
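If you want to spot-check a few of the dropped pages before a full audit, here's a minimal sketch using Python's standard library - the URLs are placeholders for your own pages:

```
# Minimal sketch: spot-check a dropped page for "noindex" signals in the
# X-Robots-Tag header or the HTML, and confirm robots.txt isn't blocking it.
# The URLs below are placeholders.
import urllib.request
import urllib.robotparser

page_url = "https://www.example.com/page-that-dropped-out"

with urllib.request.urlopen(page_url) as resp:
    header = resp.headers.get("X-Robots-Tag", "")
    html = resp.read().decode("utf-8", errors="replace").lower()

rp = urllib.robotparser.RobotFileParser("https://www.example.com/robots.txt")
rp.read()

print("X-Robots-Tag header:", header or "(none)")
print("'noindex' found in HTML source:", "noindex" in html)
print("robots.txt allows Googlebot:", rp.can_fetch("Googlebot", page_url))
```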
I'd recommend you run an audit and check all technical issues first.
Is this a brand new domain, or did you buy an existing domain name and build a new website on top of it? If the domain previously existed, look into its backlink profile and what kind of sites are pointing to your website.
Another possibility is the theme you've used for the website. Have you scanned/checked the theme before implementation?
That's highly dependent on the query itself. For example, for research-based searches, you'll see authoritative websites like Wikipedia dominate the search results as those kinds of searches are not relevant to any location. Think of a medical term, or a name of a movie, or a type of flower for example.
However, for navigation-based and local-service searches, you'll see very localized results. If you search for "plumber in CityName", you'll see different results every time you use a different city name.
It comes down to the search intent as Google always aims to serve that intent to the best of its ability & understanding.
I would recommend having 2 sitemaps - one for the US store and the other for the EU store. However, technically one sitemap will be fine. The reason I would recommend 2 different ones is for you to be able to see how many URLs from each sitemap are being indexed vs how many are being ignored - this will allow you to further investigate any possible issues.
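To make that comparison, you could count what each sitemap submits and set it against what Search Console reports as indexed. A minimal sketch using Python's standard library - the sitemap URLs are placeholders for your US and EU sitemaps:

```
# Minimal sketch: count the URLs listed in each store's sitemap so you can
# compare "submitted" against what Search Console reports as indexed.
# The sitemap URLs below are placeholders.
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemaps = {
    "US": "https://www.example.com/sitemap-us.xml",
    "EU": "https://www.example.com/sitemap-eu.xml",
}

for store, url in sitemaps.items():
    with urllib.request.urlopen(url) as resp:
        root = ET.fromstring(resp.read())
    locs = [loc.text for loc in root.findall(".//sm:url/sm:loc", NS)]
    print(f"{store}: {len(locs)} URLs listed in {url}")
```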
That being said, it sounds like you have a different issue, since Google is choosing different canonicals than the ones you are declaring. This is probably because the US and EU pages are almost identical and Google doesn't see a need to index both. Have you implemented hreflang across your website to target the different regions? I would also look into the on-page elements and content of each page to make them more relevant to their targeted regions - titles, headers, descriptive text, etc. should differ between the two sets of pages depending on the target region of each page.
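If you want to confirm which hreflang alternates a given page actually declares, here's a minimal sketch using Python's standard library - the URL is a placeholder for one of your product pages:

```
# Minimal sketch: list the hreflang alternates declared on a page, to confirm
# the US and EU versions reference each other. The URL below is a placeholder.
import urllib.request
from html.parser import HTMLParser

class HreflangParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.alternates = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "alternate":
            if attrs.get("hreflang"):
                self.alternates.append((attrs["hreflang"], attrs.get("href", "")))

url = "https://www.example.com/us/product-page"
with urllib.request.urlopen(url) as resp:
    html = resp.read().decode("utf-8", errors="replace")

parser = HreflangParser()
parser.feed(html)

for hreflang, href in parser.alternates or [("(none found)", "")]:
    print(hreflang, href)
```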
Hope that helps.
I agree with what Gaston explained, and I would also add that you should check exactly which element of your website GSC is reporting on. For example, an image from your site could be ranking for that keyword in position 15 on the image SERPs while the page itself is actually ranking in position 50.
You can run a "site:yourwebsite.com" search on Google and see what is returned. Based on the results, you can run further "site:" searches with more parameters to see more of the URLs in the index.
A "site:yourwebsite.com" search gives a general idea of how many pages are in the index, but it's not 100% accurate.
Where are you starting your crawl or check from?
If you are starting from http://www.example-link and you have SSL configured, then naturally you will be redirected to https://www.example-link. If that's the case, reconfigure your tools to start from the https version of your website and see what happens then.
You should also check all your internal links and see if any of them are pointing to the http version and then update them all to the https version. This should sort out your http to https chains.
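To find those internal http links on a given page, here's a minimal single-page sketch using Python's standard library - the start URL is a placeholder, and your usual crawler would of course cover the whole site:

```
# Minimal sketch: fetch one page and list any internal links still pointing at
# http:// so they can be updated to https. The start URL below is a placeholder.
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urlparse

START_URL = "https://www.example.com/"
SITE_HOST = urlparse(START_URL).hostname

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.http_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        # Only flag links on this site that still use the http scheme.
        if tag == "a" and href.startswith("http://"):
            if urlparse(href).hostname == SITE_HOST:
                self.http_links.append(href)

with urllib.request.urlopen(START_URL) as resp:
    html = resp.read().decode("utf-8", errors="replace")

parser = LinkParser()
parser.feed(html)

print("Internal http:// links found:", len(parser.http_links))
for link in parser.http_links:
    print(" ", link)
```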