iamconfused Posted January 28, 2010

I'm an information science student, so I really have no background knowledge in computer science. I understand the basic process behind digital signatures, but that's as far as it goes. I have a presentation to give on Friday. I'll copy and paste it here. If you can help, please do. I'm at wits' end.

The Digital Signature Standard is computed using the following equations:

r = (g^k mod p) (mod q)
s = (h(M) − x·r)/k (mod q)

(a) Describe what the various symbols represent.
(b) Write down and explain the equation(s) used to verify a signature.
(c) The standard specifies that r must lie strictly between 0 and q. What might go wrong if an implementation does not check this?
(d) A designer decides to economise on code size by omitting the hash function computation, that is, replacing h(M) by M. What are the consequences of this optimisation?
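For anyone working through this, the equations can be sketched in Python with toy parameters. All concrete values below (p, q, g, x, k, the stand-in hash) are my own illustrative choices, not from the question. Note the question's s uses h(M) − x·r, whereas FIPS 186 DSA uses h(M) + x·r; the verifier below is derived to match the question's minus-sign form.

```python
# Toy DSA with tiny parameters, following the sign convention in the
# question: s = (h(M) - x*r)/k (mod q).  Parameters are far too small
# for real use -- illustration only.  Requires Python 3.8+ for pow(a, -1, m).

p = 7879                      # prime modulus; q divides p - 1 (7878 = 2*3*13*101)
q = 101                       # prime order of the subgroup
g = pow(3, (p - 1) // q, p)   # generator of the order-q subgroup (= 170)

def toy_hash(msg: str) -> int:
    # Stand-in for a real hash h(M).  Using M directly like this is
    # essentially the dangerous shortcut part (d) asks about.
    return int.from_bytes(msg.encode(), "big") % q

def sign(msg: str, x: int):
    h = toy_hash(msg)
    # A real implementation MUST use a fresh, secret, random k per
    # signature; a reused or predictable k leaks the private key x.
    # Here k is swept deterministically only to keep the demo reproducible.
    for k in range(2, q):
        r = pow(g, k, p) % q
        if r == 0:
            continue
        s = (h - x * r) * pow(k, -1, q) % q
        if s != 0:
            return r, s
    raise RuntimeError("no valid k found")

def verify(msg: str, r: int, s: int, y: int) -> bool:
    # Part (c): reject out-of-range r (and s) before doing any math.
    if not (0 < r < q and 0 < s < q):
        return False
    h = toy_hash(msg)
    w = pow(s, -1, q)          # w = s^-1 mod q
    u1 = h * w % q
    u2 = r * w % q
    # From s = (h - x*r)/k (mod q):  k = h*w - x*r*w (mod q),
    # so g^k = g^u1 * y^(-u2) in the order-q subgroup.
    v = pow(g, u1, p) * pow(y, -u2 % q, p) % p % q
    return v == r

x = 37                 # private key (illustrative)
y = pow(g, x, p)       # public key
r, s = sign("hi", x)
# verify("hi", r, s, y) -> True; any r outside (0, q) is rejected up front
```

The verification equation here is the standard v = (g^u1 · y^u2 mod p) mod q reworked for the minus-sign convention, which flips the sign of the exponent u2.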
bascule Posted January 28, 2010 Posted January 28, 2010 This sounds like it probably belongs in Homework Help
iamconfused Posted January 28, 2010 (Author)

"This sounds like it probably belongs in Homework Help"

Sorry, my bad. I'll post it there.