A learning thread?

Started by Baron von Lotsov, November 01, 2019, 09:04:56 PM



patman post

Quote from: Nalaar
No.  

No Apple products have thermal cameras; any apps with 'thermal filters' are just overlaying thermal palette tones on visible-light image data.

Quote from: papasmurf
A digital thermometer is only about £5 anyway.


Thanks both. It seemed too good to be true...
On climate change — we're talking, we're beginning to act, but we're still not doing enough...

papasmurf

Quote from: "patman post" post_id=21190 time=1586708055 user_id=70
Does anyone know if the thermal camera on Apple Photo Box can be used as a guide for personal raised temperature during this current pandemic?



Although this thread so far seems to have been populated almost exclusively by one erstwhile poster, I'm hoping this won't frighten away anyone who has an idea to answer my question.

Thanks...


A digital thermometer is only about £5 anyway.
Nemini parco qui vivit in orbe

Nalaar

Quote from: "patman post" post_id=21190 time=1586708055 user_id=70
Does anyone know if the thermal camera on Apple Photo Box can be used as a guide for personal raised temperature during this current pandemic?


No.  

No Apple products have thermal cameras; any apps with 'thermal filters' are just overlaying thermal palette tones on visible-light image data.
Don't believe everything you think.

patman post

Does anyone know if the thermal camera on Apple Photo Box can be used as a guide for personal raised temperature during this current pandemic?



Although this thread so far seems to have been populated almost exclusively by one erstwhile poster, I'm hoping this won't frighten away anyone who has an idea to answer my question.

Thanks...
On climate change — we're talking, we're beginning to act, but we're still not doing enough...

Baron von Lotsov

I've been looking at programming languages recently, and there are a couple of newish ones from people who make web browsers. There is a language called Go, which came from Google, and a language called Rust from the Firefox people. Rust began as a side project of a Mozilla engineer and was later taken up and sponsored by Mozilla itself.



Many of these programming videos are very long-winded, but I found one which gives a summary of what Rust does and what it does not do. Aside from the normal stuff, it is a language that compiles down to a very small footprint. That's the opposite of some god-damn awful platform like Microsoft's .NET. Generally speaking, small is fast; you don't want 100 MB of cruft tagged on, like I've seen in some cases. Another thing is that it caters for low-level coding, which means you can write super fast programs because you have complete control of what is going on. The other thing is the philosophy that the compiler should catch as many errors as possible as you make them. Visual Studio is quite good at this with its IntelliSense technology, which has been around for a few years and improves as time goes on. Rust, though, is supposed to go much further and try to catch them all before your program ever runs. I can see this greatly speeding up development.



People have been saying it kicks ass over C++. C++ is said to be rather obscure, a geek's language, not so user-friendly. JavaScript was a language that tried to be the opposite, but was considered not a professional language, amateurish and prone to error due to its loose typing; you know, the kind of language a kid would use. Then we got Python, which was even more piss-easy to pick up and for a while was the craze, but now people are wondering about Rust. Those who have tried it swear by it. I guess Go was supposed to be the latest wonder thing, but I am hearing people don't like it. We've seen this before with Myspace and Facebook. The customer is always right.







By the way, it was named after a fungus, not iron oxide.
Hong Kingdom: addicted to democrazy opium from Brit

Baron von Lotsov

Another troubling problem is that this is built on quantum electrodynamics, which is what Richard Feynman is so famous for. QED uses the U(1) gauge group to describe the electromagnetic force, and Yang–Mills extends the idea to the other two forces via SU(2) and SU(3).



On the plus side, we have a framework that agrees very well with experiment in explaining the Lamb shift, but it relies on a procedure called renormalisation, which Lamb himself disagreed strongly with, saying there is no logical basis for it and that it amounts to an arbitrary bodge.



https://en.wikipedia.org/wiki/Renormalization



I guess you may be getting the idea this YM theory is built upon a large mountain of maths which seems to go on forever. Anyhow I've picked out the controversial bits.
Hong Kingdom: addicted to democrazy opium from Brit

Baron von Lotsov

Probably the greatest invention so far in maths centres around Yang–Mills theory. It's still far from certain whether it all works or not, but this theory is the basis of the Standard Model of physics. It's the theory of life - lol.



It has to be added that this is only part of what has been developed in maths over a long time, drawing on work going back to the 19th century. We start with some differential geometry and something called group theory. This is a higher level of abstraction in maths, and we work it all through to see how well the model fits. It turns out it can explain the electromagnetic force and the weak and strong nuclear forces, but not gravity. This result is well known in pop physics, and so is something called symmetry breaking. This is where the controversy lies: with perfect, unbroken symmetry the particles would have zero mass, and a universe with perfect symmetry would not be the one we live in. It's just as well we don't have it, but then again, since we don't have a complete set of working equations, it might be something else. There's a prize to be won for anyone who figures these problems out.



Anyway, here it is in mathematical form.

https://en.wikipedia.org/wiki/Yang%E2%80%93Mills_theory
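
For anyone who doesn't want to wade through the whole page, here is a compressed sketch in standard notation (my own summary of the textbook statement, not a derivation). The field strength and the Yang–Mills Lagrangian are

F_{\mu\nu}^a = \partial_\mu A_\nu^a - \partial_\nu A_\mu^a + g\, f^{abc} A_\mu^b A_\nu^c
\mathcal{L}_{\mathrm{YM}} = -\tfrac{1}{4}\, F_{\mu\nu}^a F^{a\,\mu\nu}

The zero-mass point from above comes in because a mass term like \tfrac{1}{2} m^2 A_\mu^a A^{a\,\mu} is not gauge invariant, so unbroken symmetry forces the gauge particles to be massless; spontaneous symmetry breaking, with a potential of the form V(\phi) = \lambda \left( |\phi|^2 - \tfrac{v^2}{2} \right)^2, is the standard way the masses get back in.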
Hong Kingdom: addicted to democrazy opium from Brit

Baron von Lotsov

Here's one of our great thinkers, whom you have probably never heard of.



Alfred North Whitehead

Hong Kingdom: addicted to democrazy opium from Brit

Baron von Lotsov

I have found a few more resources for complexity theory. It's a funny subject in that it has different areas of study within it. So if we want to start at the very beginning for those who are new to it, there are a few good videos here.



https://www.youtube.com/channel/UCutCcajxhR33k9UR-DdLsAQ/videos



The first of these provides an introduction, and then there are some others which expand on topics in a qualitative sense. This ties the subject to practical applications.







OK so when you are bored with those, as they do tend to be a little repetitive, you might like to see how the formal approach works.



One of the branches of this field is computational complexity. This is about solving problems and how long they are likely to take to solve. Let's take a simple example where one has a crap sorting algorithm. We have a load of random numbers and we want to sort them from smallest to largest. If we have only one number to sort, it is trivial. To sort two numbers we have to do a comparison, and that is work for the computer, or what is called the cost (cost in time and resources). Next we have three to sort, then four and so on. You can easily see that as the count goes up, the time it takes to run through the already-sorted list to find where each new number fits is, on average, going to go up in proportion to its length, call it 'n'. Since we also have n numbers to insert, our computation time is proportional to n^2. That is known as a complexity class. Some problems are even worse than that, while others can be solved in time proportional to n log n, which is what the best comparison-based sorting algorithms achieve. Some will even run in the same time irrespective of n. Now what mathematicians do is look at all the different types and find proofs to help one understand the complexity of a system, and some powerful ideas are developing. It's a pretty new subject, so chances are you did not learn this at school.
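
To make the n^2 idea concrete, here is a small sketch in Python (my own illustration, nothing to do with the videos) of the insertion-style sort described above, with a counter so you can watch the comparison cost grow:

import random

def insertion_sort(values):
    """Sort by inserting each number into an already-sorted prefix."""
    comparisons = 0
    sorted_part = []
    for v in values:
        # Walk along the sorted part to find where v fits.
        i = 0
        while i < len(sorted_part) and sorted_part[i] < v:
            comparisons += 1
            i += 1
        if i < len(sorted_part):
            comparisons += 1  # count the comparison that stopped the walk
        sorted_part.insert(i, v)
    return sorted_part, comparisons

for n in (100, 200, 400, 800):
    data = [random.random() for _ in range(n)]
    _, cost = insertion_sort(data)
    print(n, cost)

On random input the printed cost roughly quadruples every time n doubles, which is the n^2 class in action; a good n log n sort, by contrast, grows only a little faster than linearly.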



This is a wiki on what the world knows about complexity classes and relations so far.



https://complexityzoo.uwaterloo.ca/Complexity_Zoo



The site is run by:



Scott Aaronson "I'm David J. Bruton Centennial Professor of Computer Science at The University of Texas at Austin, and director of its Quantum Information Center."

&

Greg Kuperberg "I am a Professor in the Mathematics Department at UC Davis."



Like it is their day job to do this sort of stuff.



The reason for mentioning it is that it is very important in today's society. It's a new paradigm, as relativity and quantum mechanics were to the Newtonian world. It is also very much a work in progress.
Hong Kingdom: addicted to democrazy opium from Brit

Baron von Lotsov

Here's a bit of maths for you.









Actually it is bloody hard maths, so don't worry if you find it difficult to understand. I was reading a Wikipedia article on computability, i.e. what we can and cannot compute in the theoretical sense. In looking for uncomputable problems I found one in physics to do with quantum mechanics: working out whether a quantum system has a spectral gap, analogous to the energy gap in a semiconductor. It turns out to be an undecidable problem, and the reference was a paper from 2015. The guy who wrote the paper was from UCL in London and is the chap in the video. This is impressive.


Quote:
As a by-product, this result finally lays to rest — in a complexity-theoretic sense — the quantum and classical embedding problems, two long-standing open problems in mathematics. (The classical problem, in particular, dates back over 70 years.)


It looks like we have a new genius, not that Blighty notices. See how few views he gets relative to the quality of the work. I looked him up and he has written many important papers. It makes you feel rather thick eh! Gödel wrote the most important mathematical paper of the century when he was 25.



So well done lad!
Hong Kingdom: addicted to democrazy opium from Brit

Baron von Lotsov

Here's another paper on AI that came out in 2018. This is from DeepMind (formerly a British firm, since sold to Google). DeepMind have been messing about for some time trying to get AI to play video games. The leap forward here is to do with strong general intelligence. You don't need to write your AI so it is good at chess if it has general intelligence abilities, and a tricky one to date has been forward planning and structuring. The video does not explain how they do it, but shows you how fast it learns a game by referencing it to human performance. The simple mechanistic games are of course far easier for machines, which have much faster reaction times. It's the strategic ones we are interested in. If AI can do this it might be able to figure out complex traffic situations in driverless cars. That's all about strategy.



Hong Kingdom: addicted to democrazy opium from Brit

Baron von Lotsov

Here's an interesting technique. It's called sequence-to-sequence learning, and it features in a 2014 paper on AI. You have a sequence of data points that you want to model, e.g. a stock price going up and down, and you want a neural network to be able to predict it. You also have some other data feeds, and you don't know how relevant they are, if at all, to the process you are trying to predict. This technique apparently does a very good job of that.
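
As a rough illustration only, here is my own toy sketch in Python using PyTorch; it is not the architecture from the 2014 paper, and the layer sizes and data are made up. The basic encoder-decoder shape is the point: one LSTM reads the window of input feeds into a fixed summary, and a second LSTM unrolls from that summary to produce the predicted sequence.

import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Toy encoder-decoder: read a window of input feeds, predict future values."""
    def __init__(self, n_inputs, n_outputs, hidden=32):
        super().__init__()
        self.encoder = nn.LSTM(n_inputs, hidden, batch_first=True)
        self.decoder = nn.LSTM(n_outputs, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_outputs)

    def forward(self, src, horizon):
        # Encode the whole input window into the final hidden state.
        _, state = self.encoder(src)
        batch = src.size(0)
        # Start the decoder from a zero value and feed its own output back in.
        step = torch.zeros(batch, 1, self.head.out_features)
        outputs = []
        for _ in range(horizon):
            out, state = self.decoder(step, state)
            step = self.head(out)          # predicted value for this step
            outputs.append(step)
        return torch.cat(outputs, dim=1)   # (batch, horizon, n_outputs)

# Tiny smoke test on random data: 4 input feeds, predict 1 series 5 steps ahead.
model = Seq2Seq(n_inputs=4, n_outputs=1)
src = torch.randn(8, 20, 4)                # batch of 8 windows, 20 steps, 4 feeds
target = torch.randn(8, 5, 1)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(10):                        # a few gradient steps, just to show the loop
    pred = model(src, horizon=5)
    loss = loss_fn(pred, target)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
print(pred.shape)                          # torch.Size([8, 5, 1])

The design point is that the encoder has to squeeze whatever is relevant in the extra feeds into its hidden state, and the decoder only ever sees that summary, so feeds that carry no signal simply stop influencing the prediction as training goes on.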







By the way, if you ever wonder what the videos and other bits of learning material I post have in common, put it this way: if you are an industrialist helping to boost the economy of Blighty, then these ideas might help. In other words, they are practical stuff for the future, not the past.
Hong Kingdom: addicted to democrazy opium from Brit

Borchester

Quote from: "Baron von Lotsov" post_id=11319 time=1577300182 user_id=74
I don't have one and have never seen a reference to one regarding this square-root relationship for votes.



Can you not derive it from Bayes' theorem? Another idea would be to use discrete combinatorics, which should then converge to a function as n goes to infinity.


Not really. Bayes' theorem is useful if you are going blind with Venn diagrams and such, but not much use if you want to prove why the square root of the sample size provides the error bound.



Even so, thank you for the suggestion.
Algerie Francais !

Baron von Lotsov

Quote from: Borchester
Yes, but where is the proof?



And why is about 68% of a sample within a standard deviation from the mean?


I don't have one and have never seen a reference to one regarding this square-root relationship for votes.



Can you not derive it from Bayes' theorem? Another idea would be to use discrete combinatorics, which should then converge to a function as n goes to infinity.
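
For what it's worth, here is a quick numerical sketch in Python (my own illustration, not a proof) of both points in this exchange: the spread of a simulated poll shrinks like 1 over the square root of n, and roughly 68% of repeated polls land within one standard deviation of their mean.

import random, statistics

def poll(true_share, n):
    """Ask n simulated voters; return the share who say yes in this one poll."""
    return sum(random.random() < true_share for _ in range(n)) / n

true_share = 0.52
for n in (100, 400, 1600):
    results = [poll(true_share, n) for _ in range(5000)]
    mean = statistics.mean(results)
    sd = statistics.pstdev(results)
    within = sum(abs(r - mean) <= sd for r in results) / len(results)
    # sd should roughly halve each time n quadruples (the 1/sqrt(n) law),
    # and 'within' should sit near 0.68 (the normal distribution's 68% rule).
    print(n, round(sd, 4), round(within, 3))

Quadrupling the sample size roughly halves the printed standard deviation, and the last column sits close to 0.68, which is the figure you asked about.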
Hong Kingdom: addicted to democrazy opium from Brit

Borchester

Quote from: "Baron von Lotsov" post_id=8544 time=1575650686 user_id=74
The square root comes from the Gaussian or normal distribution function. It's like if you measure a length and you have a certain error in your measurement. Say you take two measurements to get the length, though, like if you were chopping a bit out of a 1 m rod in the middle and had to measure from both ends; then the errors add in quadrature, so two independent 1 mm errors combine to sqrt(1^2 + 1^2) = sqrt(2) mm, about 1.4 mm.


Yes, but where is the proof?



And why is about 68% of a sample within a standard deviation from the mean?
Algerie Francais !