Wikipedia is one of those sites that doesn't do much SEO optimization.
Another example is the underscores in their URLs.
Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

Posts made by GastonRiera
-
RE: Why doesn't the Wikipedia homepage have meta tags?
-
RE: Ways to fetch search analytics - historical search query data from Google Search Console
Hi there,
-
The only way to do that is to save the current data yourself, every time, and build your own archive. There are ways to automate this programmatically, but they aren't official.
-
No, it doesn't.
If what you're trying to recover is information you don't already have and it's older than 90 days, there's no way to recover it now.
My advice is to start saving the reports you're interested in. Hope it helps.
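If you do automate the saving, the usual route is the Search Analytics endpoint of the Search Console API. This is only a sketch of the request body that endpoint expects; it assumes you handle authentication separately (for example with google-api-python-client), and the dates shown are placeholders:

```python
def search_analytics_request(start_date, end_date,
                             dimensions=("query",), row_limit=25000):
    """Build the JSON body for a Search Console searchanalytics.query call.

    Dates are ISO strings (YYYY-MM-DD). Saved on a schedule, the responses
    become your own archive beyond the 90-day window.
    """
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": list(dimensions),
        "rowLimit": row_limit,
    }

# Placeholder date range for one month of query data:
body = search_analytics_request("2017-01-01", "2017-01-31")
print(body)
```

You would pass this body to the authenticated service's `searchanalytics().query(...)` call and write the returned rows somewhere durable (CSV, a database) each time the job runs.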
GR.
-
-
RE: 301 Redirect or landing page
We all agree that you should get rid of those 302s.
Then I'm starting to get the idea... So you're saying a possible solution is: turn the category pages into landing pages.
It's logical to me too. Of course, making that move will need good organization and a bit of preparation so nothing gets lost.
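If a category page does move to a new URL as part of that reorganization, the 301 itself is a single line of server configuration. A minimal Apache .htaccess sketch, with placeholder paths:

```apache
# Permanent (301) redirect from the old category URL to the new landing page
Redirect 301 /category/old-category/ https://example.com/landing/new-page/
```

The equivalent exists for nginx and most other servers; the key point is that it's `301` (permanent), not `302`.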
-
RE: 301 Redirect or landing page
Hi Lee,
Do both pages show the exact same information?
If yes, then just add a canonical tag declaring the correct page to index, and keep the sitemap.xml updated with the correct URLs.
Here's some info about both:
Canonicalization - Moz
XML Sitemaps: Guidelines on Their Use - Moz
If not, then I don't fully understand your problem.
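For reference, the canonical declaration is one line in the `<head>` of the duplicate page, pointing at the version you want indexed (the URL here is a placeholder):

```html
<link rel="canonical" href="https://example.com/correct-page/" />
```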
Hope it helps.
GR.
-
RE: 301 redirect to URL plus anchor tag???
Hi there,
On one hand, if the content on both landing pages is the same, you **must** set up the 301, because otherwise you have duplicate content.
On the other hand, combining all your landing pages into one very long landing page... I don't see that as a good idea.
Even though you differentiate them with #keyword, that is not treated as a different URL. Let me explain a little more:
site.com/product/car1
site.com/product/car2
These two are different pages. Google treats them as separate pages, and they are well suited to rank for different keywords.
site.com/product#car1
site.com/product#car2
These two are the same page. The capacity to rank for both keywords is diminished. Is it clear?
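This behavior is easy to verify with Python's standard library: the fragment (`#...`) is split off on the client side and never reaches the server, so a crawler sees one URL.

```python
from urllib.parse import urldefrag

# urldefrag splits a URL into (url-without-fragment, fragment)
url1, frag1 = urldefrag("https://site.com/product#car1")
url2, frag2 = urldefrag("https://site.com/product#car2")

print(url1 == url2)   # True: the same page as far as a crawler is concerned
print(frag1, frag2)   # car1 car2: only the browser uses these, to scroll
```

That's why `/product/car1` and `/product/car2` can rank independently, but `#car1` and `#car2` cannot.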
-
RE: How long to re-index a page after being blocked
Hi Andy,
In my experience, it took from a day to roughly 5-6 weeks. All of them simply re-indexed naturally; I didn't use any paid indexing service or the Search Console submit-to-index tool.
Hope it helps.
GR.
-
RE: Heading Tags (Specifically H2) being used within images
Hi there.
Why do you want to add h2 tags? If it's for SEO purposes, add them as a REAL HEADING, not as images.
My way of analyzing these kinds of questions is to ask how Google would interpret the action. I can't find any logical reason to put an image inside a heading tag.
Personally, I would not do that. But someone may have done it and seen a positive effect.
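To make the "real heading" point concrete, here's the difference in markup (filenames and text are made up for illustration):

```html
<!-- Avoid: a heading tag wrapping only an image -->
<h2><img src="summer-sale.png" alt="Summer sale"></h2>

<!-- Prefer: a real text heading; keep the image separate -->
<h2>Summer sale</h2>
<img src="summer-sale.png" alt="Summer sale banner">
```

The text heading gives Google something to read; the image-only version leans entirely on the alt attribute.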
Best luck.
GR.
-
RE: How can I make a list of all URLs indexed by Google?
I'm sorry to confirm that Google does not want everyone to know what it has in its index. We as SEOs complain about that.
It's hard to believe that you couldn't get all your pages with a scraper (it just runs the searches and collects the SERPs).
-
RE: How can I make a list of all URLs indexed by Google?
Well, there are some scrapers that might do that job.
To do it the right way you will need proxies and a scraper.
My recommendation is Gscraper or Scrapebox and a list of at least 10 proxies. Then just run a scrape with the query "site:mydomain.com" and see what you get.
(Before buying proxies or any scraper, check whether the free options get you what you want.)
-
RE: How can I make a list of all URLs indexed by Google?
Hi Sverre,
Have you tried Screaming Frog SEO Spider? Here's a link: https://www.screamingfrog.co.uk/seo-spider/
It's really helpful for crawling all the pages you have accessible to spiders. You might need the paid version to crawl more than 500 pages.
Also, have you checked for the common duplicate-page issues? Here's a Moz tutorial: https://moz.com/learn/seo/duplicate-content
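If you'd rather script a crawl along the same lines, the core of it is just extracting same-host links from each fetched page. A minimal sketch with Python's standard library; the helper handles one page's HTML, and the page content and domain below are made up:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_links(html, base_url):
    """Return absolute URLs that stay on the same host as base_url."""
    parser = LinkExtractor()
    parser.feed(html)
    host = urlparse(base_url).netloc
    result = []
    for href in parser.links:
        absolute = urljoin(base_url, href)   # resolve relative links
        if urlparse(absolute).netloc == host:
            result.append(absolute)
    return result

# Hypothetical page: one internal link, one external link.
page = '<a href="/about">About</a> <a href="https://other.com/x">Ext</a>'
print(internal_links(page, "https://example.com/"))
# ['https://example.com/about']
```

A full crawler would fetch each discovered URL in turn (breadth-first, with a visited set), which is essentially what Screaming Frog does for you.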
Hope it helps.
GR.