KayS Posted July 16, 2014

Does anyone have a good resource on how to convert an analogue audio signal to a digital one? I came across this question and have no idea how to solve it. Much appreciated.

Q. An analogue audio signal is converted to digital using a sampling rate of 200 Hz, with each sample using one of 64 different levels. How many bits are needed to encode ten seconds of sound? (You should ignore the memory requirements for any metadata in your calculations.)

Answers:
a. 1,280 bits
b. 5,000 bits
c. 12,000 bits
d. 128,000 bits

Thank you, much appreciated... please don't block my thread, I am just someone who wants to learn.
Strange Posted July 16, 2014

Sounds like this may be homework/coursework related, so I will just give you a clue: think about how many bits are needed to encode the 64 different levels for each sample. Now think about the number of samples gathered at 200 Hz over ten seconds. Now do some multiplication...
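A minimal Python sketch of the arithmetic Strange is hinting at, using only the numbers given in the question above (the variable names are illustrative, not from any particular library):

```python
import math

# Values taken from the question in this thread
sample_rate_hz = 200   # samples taken per second
levels = 64            # quantisation levels available per sample
duration_s = 10        # length of the recording in seconds

# Bits needed to distinguish 64 levels: log2(64)
bits_per_sample = math.ceil(math.log2(levels))

# Total number of samples over the whole recording
total_samples = sample_rate_hz * duration_s

# Total storage = samples * bits per sample
total_bits = total_samples * bits_per_sample

print(bits_per_sample, total_samples, total_bits)  # 6, 2000, 12000
```

Running this reproduces the three steps in the hint: 6 bits per sample, 2,000 samples, and their product as the total bit count.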
Sensei Posted July 16, 2014

"Thank you, much appreciated... please don't block my thread, I am just someone who wants to learn."

Your other thread was blocked because you posted it twice. Only one thread per subject is allowed here. Such threads belong in the homework section, not this one.