Unsolved WP YoastSEO Plugin
-
We just updated our site's WP Yoast SEO plugin one day prior to the Moz weekly crawl. I'm not sure whether our change interrupted the weekly Moz crawl or whether it was an issue on Moz's side. The weekly crawl did not run, the weekly email notifications were not received, and none of the Moz campaign reports were updated. I would like to run a crawl immediately. How can I do so without waiting for the next Moz crawl?
-
@dctomten Hi,
Thanks for reaching out!
My apologies for the delayed response. Can you pop us an email at help@moz.com or message us via chat in Moz Pro and we can look into this for you?
Have a lovely day,
Kerry.
Related Questions
-
Setting Up Ecommerce Functionality for the First Time
Morning Mozers!
Technical SEO | | CheapyPP
We are running up against a technical URL structure issue with the addition of eCommerce pages, and we are hoping you can point us in the right direction. We operate a printing company, so all our current product info pages are structured like:
website/printing/business-cards
website/printing/rackcards
website/printing/etc
The eCommerce functionality needs to go into a subfolder, but the question is what to name it. This is how the URLs would look for the main category and product pages:
/business-cards
/business-cards/full-uv-coaing-both-sides
We were thinking of going with /order:
website/order/business-cards
website/order/business-cards/full-uv-coaing-both-sides
Or maybe /shop or /print-order, etc. Any ideas or suggestions?
-
Unsolved Capturing Source Dynamically for UTM Parameters
Does anyone have a tutorial on how to dynamically capture the referring source to be populated in UTM parameters for Google Analytics? We want to syndicate content and be able to see all of the websites that provided referral traffic for this specific objective. We want to set a specific utm_medium and utm_campaign but have the utm_source be dynamic and capture the referring website. If we set a permanent utm_source, it would appear the same for all incoming traffic. Thanks in advance!
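One way to do this without a tagging plugin is to derive the source from the referrer when building or rewriting the tagged link. The sketch below is our own, not an official Analytics feature: the function name, the `"direct"` fallback, and the `www.` stripping are all assumptions. In the browser, `referrerUrl` would come from `document.referrer`; it is a plain parameter here so the logic is easy to test.

```javascript
// Build a tagged URL whose utm_source is derived from the referring page.
function buildUtmUrl(targetUrl, referrerUrl, medium, campaign) {
  const url = new URL(targetUrl);
  let source = "direct"; // fallback when there is no referrer
  if (referrerUrl) {
    try {
      source = new URL(referrerUrl).hostname.replace(/^www\./, "");
    } catch (e) {
      // malformed referrer: keep the "direct" fallback
    }
  }
  url.searchParams.set("utm_source", source);
  url.searchParams.set("utm_medium", medium);
  url.searchParams.set("utm_campaign", campaign);
  return url.toString();
}

// Example: a visit referred from a syndication partner
const tagged = buildUtmUrl(
  "https://example.com/article",
  "https://www.partner-site.com/news/story",
  "syndication",
  "content-q3"
);
// tagged → https://example.com/article?utm_source=partner-site.com&utm_medium=syndication&utm_campaign=content-q3
```

Note that Analytics already derives a referral source automatically when no `utm_source` is present, so a sketch like this mainly helps when you need the source folded into your own `utm_medium`/`utm_campaign` taxonomy.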
Technical SEO | | peteboyd
-
Unsolved Orphaned Unwanted URLs from the CMS
Hi
Technical SEO | | MattHopkins
I am working on quite an old CMS, and there are a bunch of URLs that don't make any sense.
https://www.trentfurniture.co.uk/products/all-outdoor-furniture/all-outdoor-furniture/1
https://www.trentfurniture.co.uk/products/all-chairs/all-chairs/1
https://www.trentfurniture.co.uk/products/all-industries/all-chairs/1
https://www.trentfurniture.co.uk/products/all-chairs/all-industries/1
https://www.trentfurniture.co.uk/products/all-chairs/banqueting-furniture/1
https://www.trentfurniture.co.uk/products/all-chairs/bar-furniture/1
https://www.trentfurniture.co.uk/products/all-chairs/bentwood-furniture/1
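Several of the URLs above repeat the same path segment twice in a row (e.g. `/all-chairs/all-chairs/`). As a hedged sketch while waiting on the devs, a script like this could flag that variant and propose a collapsed canonical path for a 301 map. The collapsed target is only a guess and would need confirming against the real category structure; the mixed-category URLs (e.g. `/all-chairs/bar-furniture/`) are deliberately left alone because their correct target is not obvious.

```javascript
// Flag CMS-generated URLs whose path repeats the same segment twice in a row
// (e.g. /products/all-chairs/all-chairs/1) as candidates for a 301 map.
// The proposed target (the path with the repeat collapsed) is an assumption.
function findDuplicateSegmentUrls(urls) {
  const redirects = {};
  for (const raw of urls) {
    const segments = new URL(raw).pathname.split("/").filter(Boolean);
    for (let i = 0; i + 1 < segments.length; i++) {
      if (segments[i] === segments[i + 1]) {
        // Collapse the repeated segment to form a guessed canonical path.
        const canonical = [...segments.slice(0, i + 1), ...segments.slice(i + 2)];
        redirects[raw] = "/" + canonical.join("/");
        break;
      }
    }
  }
  return redirects;
}

const map = findDuplicateSegmentUrls([
  "https://www.trentfurniture.co.uk/products/all-chairs/all-chairs/1",
  "https://www.trentfurniture.co.uk/products/all-chairs/bar-furniture/1",
]);
// map → { "https://www.trentfurniture.co.uk/products/all-chairs/all-chairs/1": "/products/all-chairs/1" }
```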
For example, there are no internal links to them, and fortunately not much traffic at all. But I can't see in the CMS why they are being generated. I've checked the HTML code for a reason, but all I can think of is the structure...? Something odd the CMS writes?
Does anyone have any ideas, please? And should I redirect all of these? I'm just thinking there could be a better solution or fix than redirects, since there are no links or traffic... like the devs solving why they are being generated. Unfortunately I get very slow responses from the devs as a 3rd-party company, hence asking on here ;0). (Some of those are indexed too)... :0) Thanks in advance....
-
Unsolved Duplicate LocalBusiness Schema Markup
Hello! I've been having a hard time finding an answer to this specific question so I figured I'd drop it here. I always add custom LocalBusiness markup to clients' homepages, but sometimes the client's website provider will include their own automated LocalBusiness markup. The codes I create often include more information. Assuming the website provider is unwilling to remove their markup, is it a bad idea to include my code as well? It seems like it could potentially be read as spammy by Google. Do the pros of having more detailed markup outweigh that potential negative impact?
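Whether Google actually treats two LocalBusiness blocks as spammy is hard to confirm either way, but one common mitigation, if the provider's block cannot be removed, is to give the more detailed block a stable `@id`, so parsers that honour JSON-LD identifiers can reconcile both blocks as the same entity. Below is a minimal sketch of serializing such a block; all business details are placeholders, and the `#business` fragment convention is our own choice, not a Google requirement.

```javascript
// A sketch of serializing a detailed LocalBusiness JSON-LD block.
// The @id is set to a stable URL so a second, auto-generated block that
// reuses the same @id can be read as describing the same entity.
function localBusinessJsonLd(biz) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "@id": biz.url + "#business",
    name: biz.name,
    url: biz.url,
    telephone: biz.telephone,
    address: {
      "@type": "PostalAddress",
      streetAddress: biz.street,
      addressLocality: biz.city,
      postalCode: biz.postalCode,
    },
  });
}

const jsonLd = localBusinessJsonLd({
  name: "Example Dental Clinic",
  url: "https://example.com",
  telephone: "+1-555-0100",
  street: "123 Main St",
  city: "Springfield",
  postalCode: "00000",
});
// Embed on the page as: <script type="application/ld+json">…</script>
```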
Local Website Optimization | | GoogleAlgoServant
-
How to boost the number of visitors to a specific page
Hello! Payday loan consolidation, debt settlement, credit card settlement, and debt consolidation are just a few of the pages I have. While looking through the Search Console dashboard, I noticed that although the position is improving, impressions and traffic are decreasing, which is a significant disparity. Please advise on how to resolve the issue.
Link Building | | OVLG
-
Solved How to solve orphan pages on a job board
Working on a website that has a job board with over 4,000 active job ads. All of these ads are listed on a single "job board" page, and obviously they don't all load at the same time. They are not linked to from anywhere else, so all tools list these job ad pages as orphans. How much of a red flag are these orphan pages? Do sites like Indeed have this same issue? Their job ads are completely dynamic, so how are those pages indexed? We use Google's Search API to handle any expired jobs, so they are not the issue; it's the active but orphaned pages we are looking to solve. The site is hosted on WordPress. What is the best way to solve this issue? Just create a job category page and link to each individual job ad from there? Any simpler and perhaps more obvious solutions? What does the website structure need to be like for the problem to be solved? Would appreciate any advice you can share!
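The category-page idea from the question can be sketched simply: split the board into paginated listing pages so every ad receives a crawlable internal link, and link those pages from the main job board. The `/jobs/page/N` URL pattern and the 50-ads-per-page size below are illustrative assumptions, not WordPress defaults.

```javascript
// Sketch: split a large set of job-ad URLs into paginated listing pages so
// every ad gets at least one crawlable internal link.
function paginateJobLinks(jobUrls, pageSize = 50, basePath = "/jobs/page/") {
  const pages = [];
  for (let i = 0; i < jobUrls.length; i += pageSize) {
    pages.push({
      url: basePath + (pages.length + 1), // /jobs/page/1, /jobs/page/2, …
      links: jobUrls.slice(i, i + pageSize),
    });
  }
  return pages;
}

// 4,000 ads at 50 links per page → 80 listing pages
const ads = Array.from({ length: 4000 }, (_, i) => `/job/${i + 1}`);
const listing = paginateJobLinks(ads);
// listing.length → 80; listing[0].url → "/jobs/page/1"
```

Keeping each listing page to a modest number of links, and listing all ad URLs in the XML sitemap as well, are the usual complements to this structure.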
Reporting & Analytics | | Michael_M
-
Unsolved Almost every new page becomes "Discovered - currently not indexed"
Almost every new page that I create ends up as "Discovered - currently not indexed". It started a couple of months ago; before that, all pages were indexed within a couple of weeks. Now there are pages that have not been indexed since the beginning of September. From a technical point of view, the pages are fine and accessible to Googlebot. The pages are in the sitemap and have content; basically, these are texts of 1,000+ or 2,000+ words. I've tried adding new content to pages and even transferring content to a new page with a different URL, but that way I managed to get only a couple of pages indexed. Could it be that until September of this year, I hadn't added new content to the site for several months? Has anyone encountered a similar problem? Please help, I am already losing heart.
Product Support | | roadlexx