Set it manually based on where it lives best, or let it be picked programmatically based on which one is most relevant. I would not implement any logic based on where users are coming from; you're likely better off spending your engineering resources on something other than that for SEO.
Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

Posts made by Martijn_Scheijbeler
-
RE: Best way to handle Breadcrumbs for Blog Posts in multiple categories?
-
RE: Brand name in title?
It could; I would say it depends on how large a brand you are in the space you want to rank for. For bigger brands it often makes more sense to add the brand name to the page title in the SERP: people who are already familiar with the brand will recognize it, which is why it can increase CTR. In most cases, I would say it's better to test this assumption at a larger scale to see what the actual impact of removing/adding the brand name is.
-
RE: How is Moz DA affected by spam links? Disavow file?
Moz has its own ways of determining whether a link/site might be spam. Since they don't know websites' disavow data, that isn't taken into account when calculating Domain Authority. I do believe they take their own measures and factor those into the Domain Authority/Page Authority metrics.
Also, at some point you need a higher quantity of links; building 10 links over a period of months usually isn't a strong enough signal to search engines that your site deserves a boost. If it's just for a specific page, then ignore my comment. But usually, I notice that sites need more attention than just link building.
-
RE: Trying to get Google to stop indexing an old site!
No worries, let us know if it changes anything.
-
RE: Trying to get Google to stop indexing an old site!
Hi Kirk,
Try pinging a sitemap with the URLs of these old pages to Google (http://www.google.com/ping?sitemap=URL/of/file); if you have a list of the pages on the old site, that's something I would try. What could be causing this is that these old pages were barely visited by crawlers, and because of that they haven't been picked up as redirected yet. Basically, by pinging them to Google (a bit of an old-school technique) you can trigger a crawl of them and, hopefully, that will help.
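As a sketch of the ping technique described above, using only Python's standard library (the sitemap URL is hypothetical; note that Google has since retired this ping endpoint, so this is shown as the historical approach the answer refers to):

```python
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url: str) -> str:
    # Build the old-style Google sitemap-ping URL for a given sitemap.
    # Issuing a GET to this URL asked Google to recrawl the sitemap.
    return "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

# Hypothetical sitemap listing the redirected old pages:
print(sitemap_ping_url("https://www.example.com/old-pages-sitemap.xml"))
# → http://www.google.com/ping?sitemap=https%3A%2F%2Fwww.example.com%2Fold-pages-sitemap.xml
```

You would then fetch that URL once (e.g. with `urllib.request.urlopen`) to trigger the crawl.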
Martijn.
-
RE: Block session id URLs with robots.txt
Uhh, that's not what the requester is looking for, and it could actually cause tons of problems if you applied it to a site you're unaware of. I would always go with the most limiting robots.txt rule you can, and in this case I would go with: /?filter=
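As a minimal robots.txt along those lines (the `filter` parameter name is taken from the question; the `*` wildcard is a Google/Bing extension that also catches the parameter on deeper paths):

```text
User-agent: *
Disallow: /*?filter=
```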
-
RE: Where are these "phantom visitors" and are they dangerous?
In Real-time reports, Google Analytics will not always be able to pinpoint the location (and in this case it could be that the users aren't domestic, but that's hard to guess from just the screenshot). I would go into your Google Analytics account, Audience > Geo, and look at the country/region reports there to see where your traffic is coming from.
I wouldn't block any of these users from your site; in the end, the only thing they could hurt at massive volume is site speed, and I doubt that's an issue for you if you're at 19 visitors in real-time. Overall, there shouldn't be any danger, and even if this data isn't correct, you can always apply an Advanced Segment in Google Analytics to exclude these users if needed.
-
RE: Do bulk 301 redirects hurt seo value?
No, doing this in bulk shouldn't necessarily hurt, as long as you make sure you're doing it the right way with the proper redirects in place. In the end, you see tons of sites go out of business or change domain names and move everything over to a new domain, so a bulk action like yours isn't a big problem.
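As a sketch of a bulk domain-level 301 in Apache mod_rewrite (the domain names are hypothetical, and this assumes mod_rewrite is enabled):

```apache
# 301-redirect every path on the old domain to the same path on the new one
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-example.com/$1 [R=301,L]
```

The point is that each old URL maps 1:1 to its new equivalent, rather than everything dumping onto the new homepage.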
-
RE: Spotify XML Sitemap
Hi,
Can you clarify what you mean by: "none of the links are actually linked within the sitemaps"? Do you mean that some pages aren't part of the sitemaps?
Martijn.
-
RE: Thoughts on Botify?
Based on your response and the fact that you only have 50 pages you actively care about, it's likely not the right tool for you. You're probably better off with a tool that is far more focused on on-page optimization for the keywords that are important to your business. I would only start worrying about crawl behavior on a site at scale, with over 10,000 pages that change regularly.
-
RE: Thoughts on Botify?
I've used Botify at my previous job (Postmates) and I also brought it on at my current job at RVshare. I think it's a great tool that brings a lot of new things to the table that GA/GSC can't (and won't) offer. If you're interested in learning how changes to your site directly influence crawl behavior by bots, and how that can help you drive additional traffic, it's a great plus. On the technical/product side it will open up a lot of new opportunities for testing and continuous optimization.
As for whether it's useful for you: I would be hesitant to buy an expensive tool (you're likely north of $1,500 monthly) for a smaller site, as there are fewer ways to optimize at that scale. But if those 5K pages generate millions of users from Organic Search on a monthly basis and bring in a lot of money, then yes, it might be worth it. If you're comparing it to Moz/SEMrush, etc.: don't. They're not comparable and have totally different price points. You use Botify in addition to your current toolset, not as your toolset.
-
RE: How does changing sitemaps affect SEO
Hi Jason,
I wouldn't worry about changing this at all; in the end, the 50K limit that has been put on sitemaps is an arbitrary one, so as long as you keep your sitemaps well under that, it doesn't really change anything. The files themselves are not a ranking factor; they're used to make search engines aware of URLs they haven't otherwise discovered on the site, or to notify them of URLs that have been updated (through the lastmod attribute). So changing it to 15K shouldn't harm you.
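For reference, a minimal sitemap file with the lastmod attribute looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/some-page/</loc>
    <lastmod>2019-06-01</lastmod>
  </url>
</urlset>
```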
Martijn.
-
RE: Fetch as Google temporarily lifting a penalty?
OK, that still doesn't mean they're not personalized, but I'll skip that part for now.
In the end, the changes you're seeing aren't triggered by what you're doing with Fetch as Google. I'll leave it to others to see if they can shine a light on the situation.
-
RE: Fetch as Google temporarily lifting a penalty?
Hi,
I'm afraid I have to wake you from this dream: there is no connection whatsoever between rankings and the Fetch as Google feature within Google Search Console. What is likely happening is that you're already getting personalized results; within a certain timeframe the ads won't be shown and the results will look different, because Google thinks you've already seen the first results on the page the first time you Googled this.
Fetch as Google doesn't send any signal to the regular ranking engines saying: "Hey, we've fetched something new and now it's going to make an impact." Definitely not at the speed you're describing (within seconds).
Martijn.
-
RE: Using 410 To Remove URLs Starting With Same Word
Hi,
Have you also excluded these pages from the robots.txt file so you can make sure that they're also not being crawled?
The rule for the 410 looks something like this (note the [G] flag serves "410 Gone" rather than a redirect, and in an .htaccess context the leading slash is stripped, hence the optional /?):
RewriteEngine on
RewriteRule ^/?mono - [G,NC]
Martijn.
-
RE: Is using REACT SEO friendly?
Hi Martin,
It can be; that's the honest answer. React uses JavaScript to load its pages and, in most cases, its content. Google and other search engines are able to read that content, but in these cases you always need to check what the actual rendered result is. I've worked with many sites using React, and it depends on whether they're using server-side or client-side rendering. Start there to figure out what you can use for your client/company. Some teams are really drawn to client-side rendering, which is a bit riskier, as Google can't always see the actual content. With server-side rendering, I've seen it go well in most cases.
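As a quick sanity check along those lines (a sketch using Python's standard library; the URL, user-agent string, and marker text are hypothetical), you can fetch the raw HTML without executing any JavaScript and see whether a known piece of page content is already there:

```python
from urllib.request import Request, urlopen

def raw_html(url: str) -> str:
    # Fetch the page WITHOUT executing JavaScript, the way a naive crawler sees it.
    req = Request(url, headers={"User-Agent": "ssr-check/0.1"})
    with urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def is_server_rendered(html: str, marker: str) -> bool:
    # If a known piece of visible content already appears in the raw HTML,
    # that content is server-side rendered. If the HTML is just an empty
    # React mount point plus a JS bundle, it is client-side rendered.
    return marker.lower() in html.lower()

# A client-side-rendered shell: the content is nowhere in the raw HTML.
print(is_server_rendered('<div id="root"></div><script src="bundle.js"></script>', "rental listings"))
# → False
```

It's only a first-pass check; Google can render some client-side JavaScript, but content missing from the raw HTML is the riskier setup the answer describes.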
Let me know if you have any specific questions, happy to answer them!
Martijn.
-
RE: Remove Product & Category from URLS in Wordpress
Hi,
You should be able to just change this in the permalink settings. It's good to be reminded of the things you need to take care of: redirects, and making sure that all old links point to the new URLs. But most of all, I want to make sure you're aware that this changes the internal URL structure and depth; that's probably why most people would advise you not to go this way.
Martijn.
-
RE: Is it best practice to have a canonical tags on all pages
Correct. I would usually advise adding a self-referencing canonical tag to make it easier for audits and for search engines to understand what the actual content of the page is.
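A self-referencing canonical tag goes in the page's head and simply points at the page's own preferred URL (the URL here is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/current-page/" />
```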
-
RE: Robots.txt Tester - syntax not understood
Hi James,
The right syntax is:
Sitemap: https://www.pkeducation.co.uk/post-sitemap.xml
Sitemap: https://www.pkeducation.co.uk/sitemap_index.xml
When you retry, it should show up as working.
Martijn.