Shade Posted October 2, 2022
Abstract: Popular systems for representing numbers in a computer are considered. Serious shortcomings of the IEEE 754 system are revealed, and replacement systems are proposed. It is shown that addition alone is sufficient to perform calculations with numbers.
Link to the article: github.com/shadenova/Nova/blob/main/The End of IEEE 754.pdf
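As a generic illustration of the "addition is enough" claim, integer multiplication reduces to repeated addition. The sketch below is the textbook shift-and-add construction, not necessarily the scheme used in the linked article:

```c
#include <stdint.h>

/* Shift-and-add multiplication: the product is accumulated using only
 * addition (doubling a value is just adding it to itself). A textbook
 * illustration that addition suffices, not the linked article's scheme.
 * Halving b with >> is only bookkeeping over b's bits. */
uint32_t mul_by_addition(uint32_t a, uint32_t b)
{
    uint32_t product = 0;
    while (b != 0) {
        if (b & 1u)
            product += a;  /* add the current shifted copy of a */
        a += a;            /* double a: a "left shift" done by addition */
        b >>= 1;           /* move to the next bit of b */
    }
    return product;        /* a * b (mod 2^32) */
}
```

For example, mul_by_addition(13, 11) returns 143 after only additions of shifted copies of 13.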
Sensei Posted October 2, 2022
Quote: The End of IEEE 754
Not really. Existing file formats store IEEE 754 values inside their files; if you want to maintain backward compatibility and readability of those files, you must support IEEE 754. IEEE 754 is also supported in hardware: CPUs, FPUs, MCUs, and GPUs. Trying to change the standard in the middle of the IT age would take a dozen years.
swansont Posted October 2, 2022
! Moderator Note
Making threads to link to articles is not in keeping with our rules on advertising. If you want to discuss something, post the material here.
OldChemE Posted October 2, 2022
At one time, back in the '60s, I learned to program computers in machine language, and I delighted in the fact that I could actually know, step by step, how the computer was performing its operations. Since then we have reached the stage in technology where the actual operations performed by the computer are completely buried in layers of code -- and the performance is vastly enhanced. Sure -- someone who knows a lot about the esoteric details might conclude that IEEE 754 was not the best approach. BUT it's the one that things are built on. What you have in the linked article is someone who sees the inefficiencies in the "wheel of choice" and wants to reinvent the wheel. The question is: can they demonstrate a financial and sociological benefit to the users of computation devices that justifies the change? "This is better" doesn't cut it.
studiot Posted October 2, 2022
28 minutes ago, OldChemE said:
At one time, back in the '60s, I learned to program computers in machine language... "This is better" doesn't cut it.
+1
Do you know which IEEE standard it is that says most IT change will be a downgrade?
Sensei Posted October 5, 2022 (edited)
@studiot @OldChemE
BTW, did you know that there is a fast trick to approximate the inverse square root in single-precision IEEE 754?
https://www.google.com/search?q=fast+sqrt+single+precision+ieee+754
https://en.wikipedia.org/wiki/Fast_inverse_square_root
CPU/GPU/MIC designers should make it a hardware instruction to get ultra-fast square roots.
Edited October 5, 2022 by Sensei
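For reference, here is a minimal C sketch of that trick, following the version described in the Wikipedia article linked above. The magic constant 0x5f3759df and the single Newton-Raphson step come from that source; the memcpy-based bit reinterpretation is a portable substitute for the original's undefined pointer cast:

```c
#include <stdint.h>
#include <string.h>

/* Fast inverse square root (the classic Quake III trick).
 * Reinterprets the IEEE 754 bit pattern of a float as an integer to get
 * a cheap initial guess for 1/sqrt(x), then refines it with one
 * Newton-Raphson iteration. */
float fast_rsqrt(float x)
{
    float x2 = x * 0.5f;
    float y  = x;
    uint32_t i;

    memcpy(&i, &y, sizeof i);      /* read the float's bits portably */
    i = 0x5f3759df - (i >> 1);     /* magic constant exploits the biased
                                      exponent layout of IEEE 754 */
    memcpy(&y, &i, sizeof y);

    y = y * (1.5f - x2 * y * y);   /* one Newton-Raphson step */
    return y;                       /* approximately 1 / sqrt(x) */
}
```

A plain square root then falls out as x * fast_rsqrt(x), accurate to roughly 0.2% after the single iteration. Hardware designers did eventually bake this in: x86 SSE provides the rsqrtss instruction for fast approximate reciprocal square roots.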