Latest posts made by Andrew-SEO
-
RE: Readd/Reindex a page that was 410'd
It depends how many there are - if there are only a few, manually fetch each one in GSC and request a crawl one by one.
If there are many, you could create a page with them all listed as hyperlinks and perform a fetch using "Crawl this URL and its direct links".
They should be back in the index soon. It's difficult to gauge the impact, but if all internal linking and backlinks are still in place, there may be minimal damage, if any.
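If you need to build that links page, a minimal sketch is below. The URLs and filename are placeholders - swap in your own de-indexed pages.

```python
# Build a simple HTML page listing every de-indexed URL as a hyperlink,
# so one "Crawl this URL and its direct links" fetch can reach them all.
from html import escape

# Placeholder URLs - replace with the pages that were 410'd.
urls = [
    "https://www.example.com/page-one",
    "https://www.example.com/page-two",
]

links = "\n".join(
    f'<li><a href="{escape(u)}">{escape(u)}</a></li>' for u in urls
)
page = f"<!DOCTYPE html><html><body><ul>\n{links}\n</ul></body></html>"

with open("reindex-links.html", "w") as f:
    f.write(page)
```

Upload the generated file somewhere crawlable, then fetch it in GSC with the "and its direct links" option.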
-
RE: Topic Cluster: URL Best Practices
Have a page for the pillar topic and one for each sub-topic if you have the word count to justify it.
Otherwise, sub-topics can be H2s on the pillar page.
If you have two separate pages, avoid keyword cannibalization where possible: keep each keyword on the page that targets it.
However, it may be necessary to mention the pillar topic in sub-topic content. If you do mention the pillar keyword, make it a text link to the pillar page. That way Google will attribute the use of that word to the pillar page when ranking - and it will in fact strengthen the pillar page.
Does that make sense?
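As a rough illustration of that internal-linking rule, here's a sketch that wraps bare mentions of the pillar keyword in a link to the pillar page. The keyword and URL are made up for the example, and the anchor-detection is deliberately naive.

```python
import re

# Hypothetical pillar keyword and pillar page URL, for illustration only.
PILLAR_KEYWORD = "content marketing"
PILLAR_URL = "/content-marketing/"

def link_pillar_mentions(html: str) -> str:
    """Wrap unlinked mentions of the pillar keyword in a link to the pillar page.

    Naive sketch: splits the HTML on existing <a>...</a> elements and only
    rewrites the plain-text parts, so mentions already inside a link are kept.
    """
    parts = re.split(r"(<a\b.*?</a>)", html, flags=re.S | re.I)
    out = []
    for part in parts:
        if part.lower().startswith("<a"):
            out.append(part)  # already a link - leave it alone
        else:
            out.append(
                re.sub(
                    re.escape(PILLAR_KEYWORD),
                    lambda m: f'<a href="{PILLAR_URL}">{m.group(0)}</a>',
                    part,
                    flags=re.I,
                )
            )
    return "".join(out)
```

In practice you'd do this editorially rather than with a script, but the rule is the same: every sub-topic mention of the pillar keyword should point at the pillar page.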
-
RE: SEO effect of URL with subfolder versus parameters?
Thanks Miriam, this is very helpful and makes a lot of sense. What do you think of towns and villages, or boroughs of a large city? Do you think the close proximity is dangerous territory re: keyword permutations?
I take your point about unique content tailored to the people of the city - it makes a lot of sense. But what about locations that are closer to each other?
I know it's a tricky question but any insight would be most welcome.
-
RE: SEO effect of URL with subfolder versus parameters?
Hi
Thanks for your response - I'm interested in this too. I've been targeting cities with their own pages, but I heard recently that Google is going to be clamping down on multiple keyword permutations. Do you think city pages will be counted in this?
-
RE: Any risks involved in removing a sub-domain from search index or completely taking down? Ranking impact?
If the subdomain targets keywords not targeted on the rest of the website, then rankings will slip.
I would 301 all of its pages to relevant pages on your main site. Monitor any important keywords, and create pages on the main site with content from the subdomain to maintain those rankings.
If traffic is non-existent, just 301 them.
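One common pattern for those 301s is to fold the subdomain into a subfolder on the main domain. A sketch of that URL mapping is below (hostnames are placeholders - your server config does the actual redirecting):

```python
from urllib.parse import urlsplit

# Placeholder hostnames for illustration.
OLD_HOST = "blog.example.com"
NEW_BASE = "https://www.example.com/blog"

def redirect_target(old_url: str) -> str:
    """Map a subdomain URL to its new home under a main-site subfolder.

    e.g. https://blog.example.com/seo-tips -> https://www.example.com/blog/seo-tips
    """
    parts = urlsplit(old_url)
    if parts.hostname != OLD_HOST:
        raise ValueError(f"unexpected host: {parts.hostname}")
    return NEW_BASE + (parts.path or "/")
```

Keeping the mapping one-to-one like this (rather than redirecting everything to the homepage) is what preserves the keyword relevance of each page.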
-
RE: Difference between urls and referring urls?
No. The referring URL is a page on your site that has a broken link on it. These broken links can damage your rankings, so fix them ASAP: go to each referring page and fix or remove the links pointing at that URL.
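To check whether a referring page still links to the broken URL, a rough sketch using the standard-library HTML parser (the URLs in the test are made up):

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect the href of every <a> tag in a page's HTML."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def page_links_to(html: str, broken_url: str) -> bool:
    """Return True if the page's HTML contains a link to the broken URL."""
    finder = LinkFinder()
    finder.feed(html)
    return broken_url in finder.hrefs
```

Run it over each referring page your crawler reported; once it returns False everywhere, the broken reference is gone.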
-
RE: Competitor getting External Links from search.aol.com
Yes, this does not seem right. Did you check out the URL the link is coming from? Is it in-text with anchor text, or is it on a profile page?
-
RE: More pages on website better for SEO?
In my opinion, less is more. If you have lots of pages with low PageRank, you will dilute your overall domain authority. Keep your content rich: combine content into cornerstone pages where possible, then amplify through social media.
Extensive experience in SEO, web design, Schema, AMP, PPC, GTM. Yes, I did just say all the latest shizzle. I cook a mean cottage pie.