Klaynos · Moderators · 8591 posts

Everything posted by Klaynos

  1. As a society we tend not to pay people based on how important their work is to society.
  2. Sport? A hyperspace bypass? Because we have the potential for great atrocities? For some other reason that we have no possible way of understanding because we are completely different species?
  3. Let n=1 and substitute into your equation. Then let n=3 (you'll probably find you need n=2 too).
  4. Might be easier to just put your head in a box...
  5. Your substitution didn't work. You have a(n+1) = a(3) on the LHS, but then substitute a(n+1) = a(2) on the RHS. You have defined n as both 1 and 2.
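    A minimal sketch of a consistent substitution, using a made-up recurrence a(n+1) = 2*a(n) + 1 with a(1) = 1 (the actual equation from the thread isn't reproduced here), in Python:

      # Hypothetical recurrence: a(n+1) = 2*a(n) + 1, starting from a(1) = 1.
      # The point is that a single value of n is used on both sides at once.
      a = {1: 1}
      for n in (1, 2):             # n = 1 gives a(2); n = 2 gives a(3)
          a[n + 1] = 2 * a[n] + 1
      print(a)                     # {1: 1, 2: 3, 3: 7}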
  6. Then the sensitivity isn't enough for things with low reflectivity. Your range is not great enough for everything to have an OK exposure under the same conditions.
  7. Where's the maths? You should probably do some reading about DNA and genes, as what you've written falls into "not even wrong".
  8. It isn't necessarily converting to white; the sub-pixel is maxing out, and you then get leakage into the other pixels and sub-pixels, removing your colour differences. How you get the light onto the CCD isn't really important; it's how the sensors respond to light. The camera is never seeing it as a human would. Humans use nonlinear sensors with high and variable dynamic range, and even HDR sensors cannot reproduce this.
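    As a rough illustration of the clipping part (a toy 8-bit linear sensor, not a model of any real camera, and ignoring the leakage/blooming mentioned above): once one channel maxes out, the recorded ratios between channels no longer match the light that actually arrived.

      # Toy 8-bit sensor: linear response with a hard clip at 255.
      def record(r, g, b, exposure):
          return tuple(min(255, int(v * exposure)) for v in (r, g, b))

      true_colour = (100, 60, 20)          # relative light reaching the three sub-pixels
      print(record(*true_colour, 1.0))     # (100, 60, 20)  ratios preserved
      print(record(*true_colour, 4.0))     # (255, 240, 80) red has clipped, ratios lost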
  9. With the caveats that you also have noise and leakage around the two ends, and the individual sub-pixels will not all be identical in their response, both within a single pixel and compared between pixels.
  10. I'd imagine that the annihilation energy would at least ionise the oxygen; I'm not sure what putting that much energy into an atom would do...
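    For a rough sense of scale, assuming the simplest case of a single electron-positron annihilation (an assumption, since the scenario isn't spelled out above): the two 511 keV gammas carry far more energy than the ~13.6 eV needed to remove oxygen's first electron.

      # Rough scale check (assuming one electron-positron annihilation at rest)
      E_annihilation_eV = 2 * 511e3     # two 511 keV gamma photons
      E_ionise_O_eV = 13.6              # first ionisation energy of oxygen
      print(E_annihilation_eV / E_ionise_O_eV)   # roughly 75,000 times larger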
  11. Yes, but your sensor isn't looking at 620 nm; it's got three colour filters per pixel. If you start overloading any of them you make it even harder to distinguish between colours. That's why you need to use a spectrometer. I'd suggest reading about how CCD cameras work as applied to colour spaces, and then extending that to over- and under-exposure. The killer is that they're linear, with quite a small range in how many photons they need to kick out enough electrons to work.
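    A toy illustration of why the three filtered values can't replace a spectrometer (idealised box filters and made-up spectra, with no noise or saturation): two very different spectra can produce identical RGB readings.

      # Wavelength bins (nm) and idealised box filters: B 400-500, G 500-600, R 600-700.
      bins = list(range(400, 700, 10))

      def rgb(spectrum):
          r = sum(p for wl, p in zip(bins, spectrum) if 600 <= wl < 700)
          g = sum(p for wl, p in zip(bins, spectrum) if 500 <= wl < 600)
          b = sum(p for wl, p in zip(bins, spectrum) if 400 <= wl < 500)
          return (r, g, b)

      broad_red  = [1.0 if wl >= 600 else 0.0 for wl in bins]    # energy spread across the red band
      narrow_red = [10.0 if wl == 650 else 0.0 for wl in bins]   # a single 650 nm spike, same total
      print(rgb(broad_red), rgb(narrow_red))                     # both give (10.0, 0.0, 0.0)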
  12. The exposure will affect the response of the CCD or CMOS; you can't avoid that. There are physical problems caused by changing the amount of light on the sensor. Even the same object with the same light source will rarely give the same response on the same sensor after you've moved the object. This gets even worse when you start using different areas of the same array, or different sensor arrays. You cannot do normalised spectroscopy using digital photography.
  13. That's almost impossible. Even under the same lighting conditions you can find species that will go all the way from under- to over-exposed, due to reflectivity differences, with any digital camera. It's not such a problem for human eyes because the sensors are dynamic and non-linear, effectively changing the "camera settings". I've said it several times above: this idea is fundamentally flawed.
  14. Potentially, with a heat sink. You're unlikely to do better for a reasonable price.
  15. I know quite a few people who work in the meteorology industry; a large fraction of them have physics backgrounds. You also have to bear in mind that some courses are more expensive to run than others. The sciences tend to need labs and lots of contact time...
  16. My two recommendations would be Inkscape and Draw.io.
  17. Pigments are not the only source of natural colour; there are also photonic crystals. In all of this you have yet to demonstrate how you could possibly normalise two different photos. The concept is fundamentally flawed.
  18. You can't. Your source data does not contain the information you require. If I tell you I own a dog, you cannot tell me what breed it is from that information alone. It is the same here: your data will not allow you to do normalised spectroscopy, which is what is required, even to a low RGB (or any other colour space) resolution.
  19. You cannot do normalised spectroscopy using digital photos. So you cannot look at the frequencies.
  20. I'd suggest both building up computing skills (learn and use at least one programming language, maybe two) and extracurricular activities, to be able to demonstrate skills such as team work. Most applications where I am are about being able to provide evidence for specific points in order to get through the initial stage.
  21. I suggest that very few of your sentences can be parsed. Can you very carefully define what you mean by state?
  22. I've not read your whole post; it's quite long. But I did note the lack of maths. GR makes precise numerical predictions about gravitational lensing, and these predictions match the observed evidence (again using maths). To supplant GR in this regard you would need to make better predictions, using maths. It's not just about comparing pictures. The observations have also been repeated.
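    For example, using standard textbook numbers (nothing specific to the post being replied to): GR predicts that light grazing the Sun is deflected by theta = 4GM/(c^2 b), about 1.75 arcseconds, which is what the lensing observations measure.

      # Light deflection at the solar limb: theta = 4*G*M / (c^2 * b)
      G, c = 6.674e-11, 2.998e8            # gravitational constant, speed of light (SI)
      M_sun, R_sun = 1.989e30, 6.963e8     # solar mass (kg), solar radius (m)
      theta_rad = 4 * G * M_sun / (c**2 * R_sun)
      print(theta_rad * 206265)            # about 1.75 arcseconds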
  23. Any reference or more unsupported assertions?
  24. Surely it would make more sense (at least to start with) to compare the states to one another. They're all operating within the Constitution, so the Second Amendment applies (although the laws can be significantly different). Are there big divisions in these stats between states when normalised for population? If there are, then that gives some starting point for looking at the culture and laws that are consistent and varying between the "good" and "bad" ends of the spectrum.
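    A minimal sketch of the normalisation step (the numbers below are placeholders, not real statistics):

      # Per-capita comparison: events per 100,000 residents rather than raw counts.
      states = {"State A": (1200, 6.0e6), "State B": (300, 1.0e6)}   # (events, population), hypothetical
      for name, (events, pop) in states.items():
          print(name, round(events / pop * 1e5, 1))   # State A 20.0, State B 30.0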
  25. 256 is used because it is 2^8, the number of distinct values one 8-bit byte can hold (0 to 255), so it is easy to represent in binary.
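    That is, one byte is 8 bits, giving 2^8 = 256 distinct values (0 through 255):

      print(2 ** 8)       # 256 distinct values per 8-bit channel
      print(0b11111111)   # 255, the largest value that fits in one byte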