Questions created by akin67
URL Best Practices (for site with millions of records)
We have a company information database where we also list the companies located in a specific building/property. We plan to add more information to these pages, essentially creating a property/building database. From an SEO standpoint, what is the best URL structure? Please note we will have information on millions of properties across the US. This is an example of what we currently display:

http://www.buzzfile.com/Lists/Companies-located-at-45-Broadway,-New-York,-NY,-10006/6959468

When we change this, we plan to migrate to HTTPS as well. For the above page, we are considering changing to something like the following:

1 - https://www.buzzfile.com/Property/45-Broadway,-New-York,-NY,-10006/6959468
2 - https://www.buzzfile.com/Property/6959468

Most users will search for these pages by address. What do you recommend we change our URLs to in order to get the maximum SEO lift? Thank you,
Local Listings | akin67
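A minimal sketch of how the legacy paths could map to the proposed option-1 structure, assuming every old URL follows the /Lists/Companies-located-at-<address>/<id> pattern in the example above. The function name and regex are hypothetical; in practice each legacy URL would 301-redirect to its mapped counterpart:

```python
import re

# Hypothetical mapper from the legacy /Lists/ path to the proposed /Property/ path.
LEGACY = re.compile(r"^/Lists/Companies-located-at-(?P<address>[^/]+)/(?P<id>\d+)$")

def property_url(old_path: str) -> str:
    """Return the option-1 /Property/ path for a legacy listing path."""
    match = LEGACY.match(old_path)
    if not match:
        raise ValueError(f"unrecognized legacy path: {old_path}")
    return f"/Property/{match.group('address')}/{match.group('id')}"

old = "/Lists/Companies-located-at-45-Broadway,-New-York,-NY,-10006/6959468"
print(property_url(old))  # /Property/45-Broadway,-New-York,-NY,-10006/6959468
```

Whichever option is chosen, keeping the numeric ID in the path (as both options do) keeps the redirect deterministic even if an address is later reformatted.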
Was Google Analytics and Adsense Down Today?
For the last 4 hours or so we were registering zero users, and AdSense reporting had not changed. We checked the site and there were no problems; it seems there was simply no reporting for some reason. Just now it came back up and we are showing live traffic. We are trying to figure out whether this was a problem specific to us or something on Google's end. Thanks,
Reporting & Analytics | akin67
Can you confirm legitimate Google Bot traffic?
We use Cloudflare as a firewall, and I noticed a significant number of bot-traffic blocks. One of the things Cloudflare does is try to block bad bot traffic, but it seems to be mistakenly blocking Googlebot traffic as well. If you use Cloudflare, you may want to look into this too. Also, can you confirm whether the following IPs belong to legitimate Googlebots?

66.249.79.88
66.249.79.65
66.249.79.80
66.249.79.76

Thanks,
Technical SEO | akin67
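Google's documented way to verify a crawler is a reverse DNS lookup on the IP, a check that the resulting hostname ends in googlebot.com or google.com, and a forward lookup confirming that hostname resolves back to the same IP. A minimal Python sketch of that check against the IPs above (a sketch, not a definitive verifier):

```python
import socket

def is_googlebot(ip: str) -> bool:
    """Reverse/forward DNS check per Google's crawler-verification guidance."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse lookup: IP -> hostname
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward lookup must map the hostname back to the original IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False

for ip in ["66.249.79.88", "66.249.79.65", "66.249.79.80", "66.249.79.76"]:
    print(ip, is_googlebot(ip))
```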
Help recover lost traffic (70%) from robots.txt error.
Our site is a company information site with 15 million indexed pages (mostly company profiles). Recently we had an issue with a server that we replaced, and in the process we mistakenly copied the robots.txt block from the staging server to a live server. By the time we realized the error, we had lost 2/3 of our indexed pages and a comparable amount of traffic. The error apparently took place on 4/7/19 and was corrected two weeks later. We have submitted new sitemaps to Google and asked them to validate the fix approximately a week ago. Given the close to 10 million pages that need to be revalidated, we have not yet seen any meaningful change. Will we ever get this traffic back? How long will it take? Any assistance would be greatly appreciated.

On another note, these indexed pages were never migrated to SSL for fear of losing traffic. If we have already lost the traffic, and/or if recovery is going to take a long time, should we migrate these pages to SSL now? Thanks,
On-Page Optimization | akin67
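One quick sanity check before waiting on Google's validation is to confirm the live robots.txt no longer blocks the affected pages. A minimal sketch using Python's standard-library parser, run against the profile URL from the first question (substitute any real indexed page):

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt.
rp = RobotFileParser("http://www.buzzfile.com/robots.txt")
rp.read()

# A profile URL from the site; True means Googlebot is allowed to crawl it again.
page = "http://www.buzzfile.com/Lists/Companies-located-at-45-Broadway,-New-York,-NY,-10006/6959468"
print(rp.can_fetch("Googlebot", page))
```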