SarK0Y
  1. Exactly. As time passes, it becomes a 100% threat, and no one knows how to eliminate it: any solution is too costly. Plus, Chernobyl sits in a fairly seismically stable zone, while Fukushima and the other Japanese nuclear plants have to deal with an immense number of quakes.
  2. @5worlds In theory, hardened electronics may withstand higher radiation levels than have occurred there. But in practice robots are rather expensive things, they still need to be maintained by humans, and they are almost incapable of autonomous work, so any even slightly tricky operation becomes too laggy. Look at the DARPA Robotics Challenge: you can shield as much as you'd like or are able to, but the bottom line has always been questioned. Far, far too crazy expensive.
  3. Fiveworlds, look at Fukushima. Have you ever seen many robots out there? And what about the dust spreading all around? Besides, such an approach gives too high a probability of carrying toxic materials into the groundwater; concrete also cracks over time, and the cracking may not take long to appear either.
  4. Comparators/filters can be of different kinds. He's got his answer.
  5. Analogue computation sums continuous functions into its output; meanwhile, you need to filter out any noise. In that sense, filters are the comparators of analogue computers.
  6. At the asm level, loops are implemented with jXX instructions, so they are just a variation of IFs. The only situation where a loop can be avoided is when its trip count is a compile-time constant (see the unrolling sketch after this list).
  7. All programs mostly use precomputed constants with whatever precision they need. Frankly, John, I don't understand what you're arguing for. We use precomputed variables to speed up code -- that's rather routine practice -- and we use IF-reducing approaches as well. But fully IF-less code is an extremely limited thing, mostly written just for fun.
  8. John, it's unfair if you use all available virtual space. Yes, then there is nowhere further to go: for an 8-bit machine that's 2^8 addresses, for a 64-bit one, 2^64. Not juicy to initialize that much every time. Yes, you can avoid jXX ops in your code entirely, but then you need the "call" op as an alternative to the jXX's (a sketch of that trick follows after this list). In short, you have no way to abandon IFs as a class.
  9. Sensei, what is funny with lookup tables is that the indices really do need to be tested against proper bounds. How do you test them without IFs? (One common branchless trick is sketched after this list.)
  10. John, I'm not sure what you mean, nor what the reason for your laugh is. Expressions of logical and arithmetic operations with variables (and with pointers in particular) definitely can substitute for IFs. The other side is that this approach mostly produces a lot of "dead" runs, so performance suffers badly; only in some cases does the substitution pay off. It is much more efficient to reduce the number of cmp ops. For instance,

          cmp %rdx, %rax
          jXX
          jYY
          jZZ

      is a really significant speed-up in comparison with

          cmp %rdx, %rax
          jXX
          cmp %rdx, %rax
          jYY
          cmp %rdx, %rax
          jZZ
  11. John, I was playing with different approaches to avoid IFs as much as possible. A precomputed table of function pointers isn't anything new, and it really is efficient for some cases, but abnormal indices have always been the curse of that approach. Yes, John, we are all quite sure a computer deals with "0"s and "1"s; meanwhile, "bad" indices can call wrong addresses, and I take it you won't argue that such a situation is anything but deeply bad for security and stability. In fact, it is possible to avoid IFs entirely; in practice that mostly has zero or a (very) negative effect on performance. I gave some hints on such techniques here. Meanwhile, two remarks:
      1. Your IF-free code at the high level could become very IF-stuffed in the asm output, thanks to the compiler.
      2. For speed's sake, you're better off using a logical AND than a multiplication (a branchless-select sketch follows after this list).
      Significantly more efficient techniques are placed here. Function pointers are a good instrument for changing an algorithm on the fly, which makes it possible to minimize "dead" blocks. Returning to precomputed matrices, I can add one more point: they are very bad in the case of your example, because you take a short task and turn it not just into a mindblowing waste of memory but into a waste of time as well. E.g., a deposit of $1k makes a 100x1000 matrix to account for each cent. Banks' data centers would be completely ruined by such a method.
  12. John, some IFs can indeed be avoided with function pointers. However, that method isn't clean for fractions. Sensei was right about the memory penalty, but fractions make the situation much worse: you have a 100-by-1000 matrix, and what are you going to do if the program calls matrix[10.4][94.2]?
  13. Actually, algorithmic complexity (the steps needed to resolve a problem) is hardware-dependent: you can have a slower processor (in terms of frequency), yet it may spin the algorithm faster because of a larger or better cache, more memory, or out-of-order execution. For instance, if an app falls short on memory, the machine starts swapping and I/O becomes the bottleneck there.
  14. The number of steps can easily be converted to time. Actually, for each algorithm we want to know how long it takes to resolve a given problem at width n (where width could mean the number of elements, or bits of precision).
  15. Take real-life problems and you'll see the flesh and blood of such methods. For instance, you have a polynomial with complex roots and an algorithm that solves polynomials with real roots. What can be done? You can represent

          P(x + i*y) == F(x, y) + i*T(x, y)

      in other words, the problem becomes the system:

          T(x, y) == 0
          F(x, y) == 0

      So we make a polynomial with real roots, solve it, and then turn the real roots back into complex form (a tiny worked instance follows after this list).
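
A minimal C sketch of the point in post 6: when the trip count is a compile-time constant, the loop can be fully unrolled so that no conditional jump (jXX) remains. The function and array here are hypothetical, invented only for illustration.

    #include <stdio.h>

    /* Hypothetical example: the trip count is the compile-time constant 4,
     * so the loop body can simply be repeated and no conditional jump (jXX)
     * is left in the function at all. */
    static int sum4(const int v[4])
    {
        int s = 0;
        s += v[0];  /* manual unrolling replaces the counter test of a loop */
        s += v[1];
        s += v[2];
        s += v[3];
        return s;
    }

    int main(void)
    {
        int v[4] = {1, 2, 3, 4};
        printf("%d\n", sum4(v));  /* prints 10 */
        return 0;
    }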
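
For post 8, a hedged sketch of the "call instead of jXX" idea: the two arms of an IF are put behind a table of function pointers and reached by an indirect call. The handler names are made up for this example.

    #include <stdio.h>

    /* Hypothetical handlers standing in for the two arms of an IF. */
    static void on_zero(void)    { puts("value was zero"); }
    static void on_nonzero(void) { puts("value was nonzero"); }

    int main(void)
    {
        /* Table indexed by a 0/1 condition: the conditional jump (jXX) is
         * replaced by an indirect call through the table. In C, (x != 0)
         * evaluates to exactly 0 or 1, so the index stays in bounds. */
        void (*handlers[2])(void) = { on_zero, on_nonzero };
        int x = 7;
        handlers[x != 0]();  /* calls on_nonzero */
        return 0;
    }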
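
Post 9 asks how a lookup index can be bounds-tested without an IF. One common branchless trick, sketched here under the assumption that the table size is a power of two, is to mask the index; note that it silently wraps a bad index into range rather than rejecting it, which is exactly the stability worry raised in post 11.

    #include <stdio.h>

    #define TABLE_SIZE 8  /* power of two, so idx & (TABLE_SIZE - 1) stays in range */

    int main(void)
    {
        int table[TABLE_SIZE] = {10, 11, 12, 13, 14, 15, 16, 17};
        unsigned idx  = 42;                       /* deliberately out of range */
        unsigned safe = idx & (TABLE_SIZE - 1u);  /* branchless: always in [0, 7] */
        printf("table[%u] = %d\n", safe, table[safe]);
        return 0;
    }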
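
For remark 2 in post 11 (logical AND instead of multiplication), a sketch of a branchless select built from an all-ones/all-zeroes mask; the function name and test values are my own choice, not from the thread.

    #include <stdint.h>
    #include <stdio.h>

    /* Branchless select: returns a when cond is nonzero, b otherwise.
     * The mask is all-ones or all-zeroes, so a bitwise AND does the work
     * that (cond * a + (1 - cond) * b) would do with multiplications. */
    static int32_t select32(int cond, int32_t a, int32_t b)
    {
        int32_t mask = -(int32_t)(cond != 0);  /* -1 (all ones) or 0 */
        return (a & mask) | (b & ~mask);
    }

    int main(void)
    {
        printf("%d\n", select32(1, 111, 222));  /* 111 */
        printf("%d\n", select32(0, 111, 222));  /* 222 */
        return 0;
    }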
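
Finally, a tiny worked instance of the decomposition in post 15. The polynomial P(z) = z^2 + 1 is chosen here purely for illustration, not taken from the original discussion:

    \begin{aligned}
    P(x+iy) &= (x+iy)^2 + 1 = (x^2 - y^2 + 1) + i\,(2xy), \\
    F(x,y)  &= x^2 - y^2 + 1, \qquad T(x,y) = 2xy, \\
    T = 0   &\;\Rightarrow\; x = 0 \ \text{or}\ y = 0, \\
    y = 0   &\;\Rightarrow\; F = x^2 + 1 = 0 \ \text{(no real solution)}, \\
    x = 0   &\;\Rightarrow\; F = 1 - y^2 = 0 \;\Rightarrow\; y = \pm 1,
    \end{aligned}

so the complex roots z = ±i are recovered while only real equations were solved.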