Configure parameter effect in Google WMT to reduce overly dynamic URLs
-
We are looking at a weather-forecast site with real-time information that is updated every five minutes. Many of this website's URLs have six parameters.
The SEOmoz campaign found duplicate content and overly dynamic URLs, so we went to the URL Parameters section in Google WMT and configured parameters such as day, month, and year (effect: none).
The next weekly SEOmoz crawl showed a big reduction in duplicates and a small reduction in overly dynamic URLs.
How can we reduce these 'errors' further?
-
Thanks for the great advice. We have a lot to do.
-
Good question! Unfortunately, the Moz web app isn't as sophisticated as Google at ignoring superfluous parameters, but there are some ways to deal with these errors.
In your situation I would consider implementing canonical tags on the URLs with different parameters. This only makes sense if the pages really are canonical: near-duplicates of each other where the only thing that changes is a small amount of updating weather information.
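For example, here is a minimal sketch of such a canonical tag (the URL and parameter names are hypothetical, not taken from your site):

```html
<!-- Served in the <head> of every parameterized variant, e.g.
     https://example.com/forecast?city=amsterdam&day=14&month=5&year=2013&unit=c&lang=en -->
<link rel="canonical" href="https://example.com/forecast?city=amsterdam" />
```

Google then consolidates the parameterized variants onto the canonical URL, which works alongside the URL Parameters settings you already configured.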
This might actually improve your search engine rankings, as well.
For the Moz crawl report, duplicate title and duplicate content warnings will be ignored for pages that have correct canonical tags. (You will see an increase in the number of canonicals in your report, but this is just a notice.)
Hope this helps. Best of luck with your SEO!
Related Questions
-
Solved How to reduce the spam score for my domain?
My domain longfeifei.com is for a regular company website, and someone created a lot of external links to it on various low-quality websites. Now the spam score is very high, about 75%. If I disavow the unusual links in Google Search Console, is it possible to reduce the spam score? Is Moz's data associated with Google's? Thanks.
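For reference, the disavow file Google Search Console accepts is a plain-text list, one entry per line; a minimal sketch (the referring domains below are hypothetical):

```text
# Low-quality links pointing at longfeifei.com
# Disavow an entire referring domain:
domain:spammy-directory.example
# Or disavow a single URL:
http://link-farm.example/page-with-bad-link.html
```

Note that disavow files are submitted privately to Google and are not shared with third-party tools, so a metric like Moz's Spam Score, which is computed from Moz's own link index, will not read them directly.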
-
Links > Top Pages: mostly image URLs. Good for SEO? Redirect obsolete ones?
Looking at my "Links > Top Pages" report shows the top fifteen entries are image URLs (.jpg files). Weird or normal? Clicking the "View Links" icon for a few of the image URLs shows that the best links in the list have DA in the 30s and 40s, and some PA in the 20s, but the linking sites have names that sound weird or spammy.
QUESTION 1: Do links to image URLs (A) help the overall SEO of the site's HTML pages, (B) only help the specific page the image appears on, or (C) have no value for the regular pages?
QUESTION 2: What should we do with the obsolete images (images still hosted on our site but not appearing on any pages)? If I remove the images, the links will be lost, which is good if the links are hurting us but bad for our SEO if they are helping (unless we 301 the obsolete images to current ones). Thanks!
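On the redirect option: if the site runs on Apache, a 301 from an obsolete image to its current replacement is a one-line rule; a minimal sketch (the file paths are hypothetical):

```apache
# .htaccess (mod_alias): send an obsolete image's traffic and links
# to its current equivalent
Redirect 301 /images/old-product-photo.jpg /images/current-product-photo.jpg
```

A 301 passes most of the accumulated link equity to the target instead of letting it die with the deleted file.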
-
Why might Google be crawling via the old sitemap when the new one has been submitted and verified?
We have recently relaunched Scoutzie.com and re-submitted our new sitemap to Google. When I look in Webmaster Tools, our new sitemap has been submitted just fine, but at the same time Google is finding a lot of 404s when crawling the site. My understanding is that it is still crawling the old links, which no longer exist. How can I tell Google to refresh its index and stop looking at all the old links?
-
How can I see the URLs affected in the SEOmoz crawl when Notices increase?
Hi, when SEOmoz crawled my site, my notices increased by 255. How can I see only these affected URLs? Thanks, Sarah
-
Exporting Google and Bing Search Results
Is there a way to get a spreadsheet of the pages indexed for a certain domain in Google and Bing? I.e., I search Google for site:www.domain.com and want to export a .csv file of all those pages. Cheers
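One programmatic route on the Google side is the Custom Search JSON API rather than scraping result pages, which the terms of service prohibit. Below is a minimal Python sketch; the API key, engine ID, and domain are placeholders, and it assumes a Custom Search Engine has already been created:

```python
# Export Google-indexed pages for a domain to CSV via the Custom Search JSON API.
import csv
import requests

API_KEY = "YOUR_API_KEY"          # placeholder
CX = "YOUR_SEARCH_ENGINE_ID"      # placeholder

def fetch_indexed_pages(domain, max_results=100):
    """Page through site: results ten at a time (the API caps out around 100)."""
    rows = []
    for start in range(1, max_results + 1, 10):
        resp = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": CX,
                    "q": f"site:{domain}", "start": start},
        )
        resp.raise_for_status()
        items = resp.json().get("items", [])
        if not items:
            break
        rows.extend((item["title"], item["link"]) for item in items)
    return rows

with open("indexed_pages.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["title", "url"])
    writer.writerows(fetch_indexed_pages("www.domain.com"))
```

Bing has offered comparable search APIs; the same CSV-writing pattern applies there.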
-
Tool for tracking actions taken on problem URLs
I am looking for tool suggestions to help keep track of problem URLs, the actions taken on them, and the tracking and testing of a large number of errors gathered from many sources.

What I want is to be able to export lists of URLs and their problems from my current set of tools (SEOmoz campaigns, Google WM, Bing WM, Screaming Frog) and import them into a centralized DB that shows all of the actions that need to be taken on each URL, while removing duplicates, since each tool finds a significant amount of the same issues.

Example case: SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 & example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (it terminates in a 404). When I import the three reports into the tool, I would like to see that example.com/url1 has two issues pending, a duplicated title and a broken link, without duplicating the entry that both SEOmoz and Google found.

I would also like to see historical information on each URL: whether I have written redirects to it (to fix a previous problem), or whether it used to be a broken page (i.e. a 4XX or 5XX error) and is now fixed.

Finally, I would like not to be bothered with the same issue twice. As Google is incredibly slow at updating its issues summary, the tool should recognize when a URL is already in the DB with a resolved issue and not import the duplicate.

Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me. Bonus bonus for any tool that is smart enough to check issues as they come in and mark them as resolved (for instance, if a URL is reported with a 403 error, it would check on import whether it still resolves as a 403; if it did, it would be added to the issue queue, and if not, it would be marked as fixed).

Does anything like this exist? How do you deal with tracking and fixing thousands of URLs and their problems, and the duplicates created from using multiple tools? Thanks!
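The deduplicating store described here is small enough to sketch. A minimal Python/SQLite version follows; the table layout, the url,issue CSV columns, and the 'http_403'-style issue labels are all assumptions for illustration:

```python
# Central issue DB: imports per-tool CSV exports, dedupes on (url, issue),
# and re-tests HTTP-status issues so fixed ones are closed automatically.
import csv
import sqlite3
import requests

conn = sqlite3.connect("url_issues.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS issues (
        url    TEXT NOT NULL,
        issue  TEXT NOT NULL,
        source TEXT,
        status TEXT NOT NULL DEFAULT 'open',
        PRIMARY KEY (url, issue)  -- same issue from a second tool is a no-op
    )
""")

def import_report(csv_path, source):
    """Load one tool's export; duplicates across tools collapse on (url, issue)."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            conn.execute(
                "INSERT OR IGNORE INTO issues (url, issue, source) VALUES (?, ?, ?)",
                (row["url"], row["issue"], source),
            )
    conn.commit()

def recheck_http_issues():
    """Re-test reported status-code issues; close any that no longer reproduce."""
    rows = conn.execute(
        "SELECT url, issue FROM issues WHERE status = 'open' AND issue LIKE 'http_%'"
    ).fetchall()
    for url, issue in rows:
        reported = int(issue.split("_")[1])  # e.g. 'http_403' -> 403
        current = requests.head(url, allow_redirects=False).status_code
        if current != reported:
            conn.execute(
                "UPDATE issues SET status = 'fixed' WHERE url = ? AND issue = ?",
                (url, issue),
            )
    conn.commit()

import_report("seomoz_crawl.csv", "seomoz")
import_report("screaming_frog.csv", "screaming_frog")
recheck_http_issues()
```

The composite primary key is what collapses the same issue reported by multiple tools, and the recheck pass implements the 'mark as resolved on import' idea from the question.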
-
Need assistance with tool development using the SEOmoz & Google APIs
I don't know where the right place to ask this is, but I work for an SEO agency, and we are looking for someone to help us develop some tools utilizing the SEOmoz API and probably the Google API as well. Does anyone know where I could find a person with previous experience developing against these APIs? We've had poor luck trying to use developers without specific experience in this area, or at least some SEO knowledge. If you're a developer and would like to talk with us, you can contact me directly if you prefer, rather than posting your contact info publicly, but I welcome any helpful thoughts or ideas regarding the development of SEO tools. Thanks,
John
jmaher [at] mcdia.com
-
Duplicate page titles: the same URL listed twice
The system says I have two duplicate page titles. The page titles are exactly the same because the two URLs are exactly the same. These same two identical URLs also show up under Duplicate Page Content, because they are the same. We also have a blog, and two tag pages were showing identical content. I have now blocked the blog in robots.txt, because the blog is only for writers; I suppose I could have just blocked the tag pages.
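For what it's worth, blocking only the tag pages rather than the whole blog would be a short robots.txt rule; a sketch assuming the tag pages live under /blog/tag/ (the path is a guess at the site's structure):

```text
User-agent: *
Disallow: /blog/tag/
```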