Meta Data Question
-
Hi There,
I am working with the Umbraco CMS, and we have a Menu page that sits under a single node in the CMS.
When accessing this page on the front end and navigating between the food menu and the drinks menu, the URL changes depending on which content you are viewing. However, I only have one place to input a meta title and description, so both the food menu URL and the drinks menu URL show the same meta data and are being flagged as duplicate content.
Hopefully this makes sense. Has anyone dealt with anything similar, where the URL changes as the content within the page changes?
-
-
Can you send me an example (to my inbox) so I can understand better?
-
Hi Roman,
Thanks for getting back to me.
I'm not sure I was clear in my first post.
The CMS page itself is static and does not change, but the menu module within it serves virtual content.
So depending on which menu is clicked in the module, the URL changes, yet the back end of the website still sees it as the same CMS page.
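To make the problem concrete, here is a minimal, hypothetical sketch of what a fix would need to do: map each virtual URL segment to its own meta title and description instead of sharing one. This is not Umbraco's actual API; the segment names and meta copy are made up for illustration.

```python
# Hypothetical sketch: derive unique meta data from the virtual URL
# segment, so /menu/food and /menu/drinks stop sharing one title and
# description. Segment names and meta copy are illustrative only.
META_BY_SEGMENT = {
    "food": ("Food Menu | Example Restaurant",
             "Browse our full food menu, from starters to desserts."),
    "drinks": ("Drinks Menu | Example Restaurant",
               "Browse our drinks menu, from juices to wines."),
}
DEFAULT_META = ("Menu | Example Restaurant", "Our full menu.")

def meta_tags(path: str) -> str:
    """Return the <title> and meta description for a virtual URL."""
    segment = path.rstrip("/").rsplit("/", 1)[-1]
    title, description = META_BY_SEGMENT.get(segment, DEFAULT_META)
    return (f"<title>{title}</title>\n"
            f'<meta name="description" content="{description}">')

print(meta_tags("/menu/food"))    # unique title for the food menu
print(meta_tags("/menu/drinks"))  # different title for the drinks menu
```

Whatever mechanism the CMS exposes, the goal is the same: each distinct URL emits distinct meta data, so the two menu URLs are no longer duplicates.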
-
Basically, you are talking about one page (one CMS node) with two different sets of content: the food menu and the drinks menu. The answer to your question depends on what you have and what you want, but mostly on how your users land on your site.
In an ideal scenario you would have two different pages: one for the food menu and one for the drinks menu. But that is not the main point. From an SEO perspective, what you want (or what I can tell you) matters less than what your audience actually does.
What is your audience's behavior?
No matter whether you are a big company or a small restaurant, or whether you use WordPress, Joomla, or Drupal, the first question you need to answer is what your users do to reach your website or your competitors' websites (the behavior is probably the same). If your research shows that ten users a month ask Google for your menu, your discounts, or your offers, then you have your answer. Let's take an example: you have a restaurant called Tacos Top and your website is tacostop.com.
Let's take the first scenario: you use a separate page for every menu and sub-menu.
tacostop.com
--tacostop.com/menu
----tacostop.com/menu/food
----tacostop.com/menu/drinks
----tacostop.com/menu/wines
----tacostop.com/menu/coffees

The tacostop.com/menu page will act like a category page and should be one of your flagship pages. On it you can feature your food, drinks, desserts, coffees, and so on, with a link to every specific sub-menu page. Each of these third-level pages should in turn link back to its parent page. This way you are telling Google: this page is important to me; if someone searches for my menu, show this one.
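The linking rule above can be sketched as a tiny check. The URLs are the example ones from this thread; nothing here is a real crawler, just a model of which page links to which.

```python
# Sketch of the internal-linking rule: the parent /menu page links to
# every sub-menu page, and every sub-menu page links back to its parent.
PARENT = "tacostop.com/menu"
CHILDREN = [
    "tacostop.com/menu/food",
    "tacostop.com/menu/drinks",
    "tacostop.com/menu/wines",
    "tacostop.com/menu/coffees",
]

# links[page] = pages that `page` links out to
links = {PARENT: list(CHILDREN)}
for child in CHILDREN:
    links[child] = [PARENT]  # each sub-menu points back to its parent

# Every child both receives a link from and sends a link to the parent,
# which concentrates the "this is my menu" signal on /menu.
assert all(child in links[PARENT] for child in CHILDREN)
assert all(PARENT in links[c] for c in CHILDREN)
```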
This page needs to be optimized for a keyword like "taco stop menu", and the secondary pages need to be optimized for long-tail keywords. All of them should be optimized around a local strategy and your audience. If you don't have enough data in Google Analytics or Search Console, research your competition or your local audience.

Now let's talk about the second scenario: you use a single page for all of the content. In this case you need to optimize one page for multiple keywords, and this will be your structure:
tacostop.com
--tacostop.com/menu
----tacostop.com/menu#food
----tacostop.com/menu#drinks
----tacostop.com/menu#wines
----tacostop.com/menu#coffees

As you can see, all the categories sit on one page as sections; Wikipedia is a good example of this pattern.
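One caveat worth sketching: in this second scenario the #fragment is never sent to the server, so to a search engine every fragment URL is the same document with the same meta data. This is standard URL behavior, shown here with Python's urllib:

```python
# All fragment URLs resolve to one and the same page: stripping the
# fragment from each example URL leaves a single distinct document.
from urllib.parse import urldefrag

urls = [
    "https://tacostop.com/menu#food",
    "https://tacostop.com/menu#drinks",
    "https://tacostop.com/menu#wines",
]
pages = {urldefrag(u).url for u in urls}
print(pages)  # → {'https://tacostop.com/menu'}
```

That is exactly why, in this scenario, all of the optimization has to happen on the one page, through its headings and sections.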
So your headers and their structure play a relevant role in this scenario:

H1 ---> Your main keyword: "Tacos Top Menu"
H2 ---> Tacos Top Foods
H3 ---> Pastas
H3 ---> Meats
H3 ---> Seafood
H2 ---> Tacos Top Drinks
H3 ---> Orange Juice
H3 ---> Lemon Juice
H3 ---> Water

So in this way, every category becomes a section on the page, and every section needs to be optimized for its own main keywords. If you use Moz, tracking the individual performance of those keywords is very easy.
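As an illustration (the copy is just the example outline above), the heading structure could be rendered like this:

```python
# Render the example heading outline as HTML: one H1 for the main
# keyword, an H2 per category section, and H3s for the items inside.
outline = ("Tacos Top Menu", [
    ("Tacos Top Foods", ["Pastas", "Meats", "Seafood"]),
    ("Tacos Top Drinks", ["Orange Juice", "Lemon Juice", "Water"]),
])

def render(outline):
    h1, sections = outline
    html = [f"<h1>{h1}</h1>"]
    for h2, items in sections:
        html.append(f"<h2>{h2}</h2>")
        html.extend(f"<h3>{item}</h3>" for item in items)
    return "\n".join(html)

print(render(outline))
```

Each H2 section is then the natural anchor target (#food, #drinks) for the fragment URLs in the structure above.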
I hope this explanation helps.
If my answer was useful, don't forget to mark it as a good answer.
Cheers