Generating a signature and expires value in Java
-
Hello,
I am developing a tool for my company to get stats from SeoMoz using your API. During development, I have been using the example signature and expires values that are auto-generated for me. Now that testing is complete, my code will need to generate these values itself. I have been searching for a resource demonstrating how to do this in Java, but I have not found a good example. I was hoping that someone at SeoMoz would have a resource or an example they could share.
The email associated with this account belongs to a non-developer, so if a response is provided via email in addition to the forum, sending it to my email would be much appreciated.
Thank you,
Anthony
-
Never mind, I have come up with a solution:
package com.yourpackage.signature;

import java.io.IOException;
import java.security.InvalidKeyException;
import java.security.NoSuchAlgorithmException;
import java.util.Date;

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

import org.apache.geronimo.mail.util.Base64; // can be whichever flavor of Base64 encoder you'd like

public class SignatureGenerator {

    public static final String ACCESS_ID = "member-XXXXXXX";
    public static final String SECRET_KEY = "XXXXXXXXXXXXXXXXXXXXXXXXXXx";

    // expireTime should be in seconds since Jan 1 1970: (new Date().getTime() / 1000) + X
    public static String generateSignature(String data, String key, String expireTime, String algorithm)
            throws InvalidKeyException, NoSuchAlgorithmException, IOException {
        data += expireTime;
        SecretKeySpec secretKey = new SecretKeySpec(key.getBytes("UTF-8"), algorithm);
        Mac mac = Mac.getInstance(algorithm);
        mac.init(secretKey);
        byte[] hmacData = mac.doFinal(data.getBytes("UTF-8"));
        return new String(Base64.encode(hmacData));
    }

    public static void main(String[] args) {
        try {
            long longTime = new Date().getTime() / 1000 + 60;
            System.out.println(longTime);
            String data = ACCESS_ID + "\n";
            System.out.println(generateSignature(data, SECRET_KEY, String.valueOf(longTime), "HMACSHA1"));
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
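Follow-up, in case it helps anyone else: below is a rough sketch of how I attach the generated values to a request URL. The endpoint (lsapi.seomoz.com/linkscape/url-metrics/) and the AccessID / Expires / Signature parameter names are my assumptions based on the API documentation, so double-check them against the current reference. The main gotcha is that the Base64 signature can contain '+' and '/' characters, so it has to be URL-encoded before it goes into the query string.

package com.yourpackage.signature;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLEncoder;
import java.util.Date;

public class SignatureUsageExample {

    public static void main(String[] args) throws Exception {
        // Expires a few minutes in the future, in seconds since the epoch.
        long expires = new Date().getTime() / 1000 + 300;
        String data = SignatureGenerator.ACCESS_ID + "\n";
        String signature = SignatureGenerator.generateSignature(
                data, SignatureGenerator.SECRET_KEY, String.valueOf(expires), "HMACSHA1");

        // Assumed endpoint and parameter names -- verify against the current API docs.
        // The signature is URL-encoded because Base64 output may contain '+' and '/'.
        String requestUrl = "http://lsapi.seomoz.com/linkscape/url-metrics/"
                + URLEncoder.encode("www.example.com", "UTF-8")
                + "?AccessID=" + URLEncoder.encode(SignatureGenerator.ACCESS_ID, "UTF-8")
                + "&Expires=" + expires
                + "&Signature=" + URLEncoder.encode(signature, "UTF-8");

        // Print the raw response body.
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(new URL(requestUrl).openStream(), "UTF-8"));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
        reader.close();
    }
}

Note that the Expires value in the query string has to be the exact value that was signed; if the two differ, the server-side signature check will fail.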
-
There has been no response from SeoMoz on this forum or to my email.
Please provide some feedback. I am afraid that if I cannot solve this issue, I will be forced to cancel our account, as it is not practical for me to manually load the sample signature and expires values on a daily basis.
Related Questions
-
Sitemap generator partially finding list of website URLs
Hi everyone, When creating my XML sitemap here, it is only able to detect a portion of the website. I am missing at least 20 URLs (blog pages + newly created resource pages). I have checked those missing URLs: all of them are indexed and they're not blocked by the robots.txt. Any idea why this is happening? I need to make sure all the URLs I want are included in the generated XML sitemap. Thanks!
Technical SEO | | Taysir0 -
Text hidden by Java
We are developing a new website and intend to use JavaScript to make the page more manageable on screen with a show more / show less toggle. For example, http://www.tanzaniaodyssey1.com/tanzania/serengeti, where a large amount of the content of the page is "hidden". From what I can see, Google won't penalise us, but am I right in saying that this content will not be indexed and won't count for Google? I need to make sure that this text is indexed, but I also don't want horrendously long, un-user-friendly pages. Does anyone have a way round this? (Could I put the words in a <noscript> tag?)
Technical SEO | | EdD-DigitalPotion0 -
Generating an XML sitemap?
Hi, what is everyone's preferred method of generating an XML sitemap? Just wondering if one piece of software is better than the others.
Technical SEO | | TheZenAgency1 -
Looking at creating some auto-generated pages - duplicate content?
Hi Everyone! We just launched a new version of our research site, and the main CTA on the page sends users to a subdomain that's blocked by robots.txt. The subdomain link is one of our PPC landing pages, and those would be duplicate content for every model (cars). We're also looking at a new content stream of deals pages on the main domain. The thought process was that we could rank these pages for things like "Volkswagen golf deals" and also use them as canonical URLs from the PPC pages so that Panda doesn't get mad at us for sending hundreds of links to a subdomain that's blocked. It's going to take us a lot of time to write the copy for the deals pages, so if we auto-generate it by pulling a paragraph of copy from the car review, plus numerical stats about that model, will it be classed as duplicate and/or is there any downside to doing it? Review Page: http://www.carwow.co.uk/car-reviews/Ford/Fiesta Deals Page: http://www.carwow.co.uk/deals/Ford/Fiesta PPC Landing Page: http://quotes.carwow.co.uk/buy/Ford/Fiesta I can't help but feel that this may all be a bit overkill and perhaps it makes more sense to build one central deals page per model with unique content that we can also send the PPC traffic to, then lift any block from the quotes. subdomain. But that will take time and we'd also like a quick solution. I'd also question whether it's even an issue to link to a blocked subdomain: Google adds the quote URL into the index but can't crawl it, which I've been told is bad - but is it bad enough to do something about? Thanks, JP
Technical SEO | | Matt.Carwow0 -
Google Impressions Drop Due to Expired SSL
Recently I noticed a huge drop in our client's Google impressions via GWMT, from 900 impressions to 70 overnight on October 30, 2012, and it has remained this way for the entire month of November 2012. The SSL cert had expired in mid-October because the renewal notification went to the SPAM folder and was missed. Is it possible for an SSL expiry to be related to this massive drop in daily impressions, which in turn has also affected traffic? I also can't see any evidence of duplicate pages (i.e. https and http) being indexed, but to be honest I'm not the one doing the SEO and therefore haven't been tracking this. Thanks for your help! Chris
Technical SEO | | MeMediaSEO0 -
Expired Domain - http:// or www
I have an old domain. When I use the link explorer, I get way more juice out of the www version of my domain. I will be using WordPress to set up a new domain with the same name. My question is: how do I make it proper for SEO? Do I just change the http:// to www in WordPress and be done with it? Does it even matter? (Thinking it does.)
Technical SEO | | imagatto20 -
Grabbing Expired Domains
How hard is it to grab expired domains? I have my eye on a domain that is expiring in 3 days, but I don't think it's quite that simple. Doesn't it go through months of waiting before it becomes available? Is there an easy way to grab domains that are set to expire? Are the services that offer this any good? And who do you guys recommend?
Technical SEO | | applesofgold0 -
Expired traffic and 301 value
Hi Folks, Here is our situation: we have an old brand domain, www.asia-hotels.com, that was redirecting to www.asiahotels.com. By mistake, we let that domain expire and only noticed the drop a month later. We lost all our pages for several weeks. I'm not sure of the exact date, but it was approximately around the 24th of December - what a merry Xmas! 😞 Since then we have repurchased the domain, put back all the pages as they were, and reinstated all the 301 redirects as they were. Since that date we haven't seen any uplift in our visits or visibility score. Did we do something wrong with our 301 redirects? I know for sure we used the ISAPI rewrite mod for the non-www domain, although I am not entirely sure how the www version has been handled. Is there something we should do at a DNS level to flag that the site is back? Should we present a reconsideration request? Any help would be greatly welcomed. Thanks for your help. Cheers, Freddy More info: I placed a bit more info and the visits graph on my blog: http://www.inmarketingwetrust.com.au/seo-effect-of-domain-expiry-on-301-redirects/ I am not sure if this is due to the fact that some information is cached, but when I looked at the site on Open Site Explorer I found that the data is still showing as non-redirected sites: http://www.opensiteexplorer.org/asia-hotels.com/www.asia-hotels.com/a!comparison (attached graph: effect-of-301-redirect-expired-on-SERP-visibility-300x204.jpg)
Technical SEO | | Gus_Martin0