Updated site with new URL structure - what should I expect to happen? Also, it's showing PR 1 for my URLs in Open Site Explorer
-
Hi All,
We updated our website with a new URL structure. Apart from the root domain, every other page is showing up in Open Site Explorer with a page rank of 1. Although we only went live with this yesterday, I would have thought that the 301s etc. from the old URLs would be coming through and the PR would show?
I am not familiar with what to expect or what alarm bells I need to watch out for when doing this type of thing, although I would probably expect a small drop in traffic. I don't know what the norm is, though, so any advice is greatly appreciated.
Thanks,
Pete
-
When you say page rank, do you mean Google's PageRank, or Moz's Page Authority from Open Site Explorer? If you mean Moz's Page Authority, that measure should go up, though I don't know how long it will take. You might need to wait until the next crawl.
-
Many thanks, Gazzerman1.
Pete
-
Your 301s, if set up correctly, will take effect fairly quickly in Google, and you should see results for your main pages within days (sometimes hours), depending on the site's popularity and crawl rate.
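If you'd rather not wait for tools to re-crawl before knowing whether the redirects are sound, a quick script can confirm that each old URL answers with a single 301 pointing at its new counterpart. Below is a minimal sketch in Python; the `requests` library and the example URL pairs are assumptions, so swap in your own old-to-new mapping.

```python
# Minimal spot-check of 301 redirects after a URL restructure.
# Assumes the `requests` library is installed (pip install requests).
# The URL mapping below is purely hypothetical -- replace it with your
# own old-URL -> new-URL pairs.
import requests

REDIRECT_MAP = {
    "https://www.example.com/old-category/old-page/": "https://www.example.com/new-category/new-page/",
    "https://www.example.com/products.php?id=42": "https://www.example.com/products/blue-widget/",
}


def check_redirect(old_url: str, expected_url: str) -> bool:
    """Return True if old_url answers with a 301 pointing at expected_url."""
    # Don't follow the redirect; we want to inspect the first response only.
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "(no Location header)")
    print(f"{old_url} -> {response.status_code} {location}")
    return response.status_code == 301 and location == expected_url


if __name__ == "__main__":
    failures = [old for old, new in REDIRECT_MAP.items() if not check_redirect(old, new)]
    if failures:
        print(f"{len(failures)} redirect(s) are not clean 301s -- worth fixing before crawlers revisit.")
    else:
        print("All sampled old URLs return a single 301 to their new location.")
```

The key detail is `allow_redirects=False`: it lets you see the first status code returned, so you can catch 302s or redirect chains that would otherwise be hidden by an automatic follow.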
PageRank has not been updated since last November, according to John Mueller at Google, and will never be updated again. It is still an internal metric used by Google, but the latest data is no longer available to the public, so your pages will likely never show a visible PageRank again.