'Sarina Paris 01. Look At Us (Lyrics) [jSJTzfLy60s]_cropped.webm' (different version)
133 BPM, interval 3.6090225564, beat 1.94
'Look At Us Now Baby-lyrics [4om_eQ42mT4].webm'
138.067 or 138.03 BPM, beat 1.6
'Look At Us Now - Sarina Paris [z3.fm 36664329].opus'
138 BPM, beat 1.51
Audacity shows a 0.0997 difference between the tracks at the start, reduced to 0.0476 around 9.45, with no further change by the end. That works out to a ~0.04 BPM difference over ~183 sec.
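As a rough sanity check, here is a small Python sketch of the cumulative drift between the two renditions, using the approximate figures above (a 0.04 BPM tempo difference, tempos near 138 BPM, and ~183 seconds of audio):

```python
# How far apart do two renditions of the same track drift
# if their tempos differ by ~0.04 BPM? (Numbers from the notes above.)

bpm_a = 138.07           # approximate tempo of one file
bpm_b = 138.03           # approximate tempo of the other file
duration_min = 183 / 60  # ~183 seconds of audio, in minutes

# Beats elapsed in each file over the same wall-clock duration
beats_a = bpm_a * duration_min
beats_b = bpm_b * duration_min
drift_beats = beats_a - beats_b

# Convert the drift from beats to seconds at ~138 BPM
seconds_per_beat = 60 / 138
drift_seconds = drift_beats * seconds_per_beat

print(round(drift_beats, 3))    # drift in beats over the track
print(round(drift_seconds, 3))  # drift in seconds
```

About 0.12 beats, or roughly 0.05 seconds at ~138 BPM, which is on the same order as the offset change measured in Audacity.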
I named my newest SSD Orchid, after deleting the Ubuntu partition that I installed but never used. So I will use Orchid as the example name here, instead of "Person 1" or looking up a name from Wikipedia's lists of popular baby names, like I did with the first public argument. (Using this SSD for data means that I have no backup plan if my 17-year-old hard drive fails.)
To make up numbers instead of using a bunch of confusing variables: Orchid can take action to accomplish goal X. The difficulty of her accomplishing it is 10. The benefit to her of goal X being accomplished is not known to us, but we are trying to estimate it.
If Orchid were the only person (?) who could accomplish goal X, and she knew how difficult it was, then our task would be easy: if she tries to do it, her benefit is 10 or more; if she doesn't try, her benefit is less than 10.
What if other people can also do X? Let's say that X is repairing a damaged railing (??). Orchid's net benefit is her effort subtracted from her valuation of X. If her valuation is 12, and difficulty is 10, then she gains 2 if she does it herself, but 12 if someone else does it.
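The net-benefit arithmetic from the made-up numbers above (valuation 12, difficulty 10) looks like this:

```python
# Net benefit to Orchid, using the made-up numbers from the text:
# her valuation of goal X is 12, and her difficulty (effort cost) is 10.

valuation = 12
difficulty = 10

net_if_self = valuation - difficulty   # she pays the effort herself
net_if_other = valuation               # someone else pays the effort

print(net_if_self)   # 2
print(net_if_other)  # 12
```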
Suppose that I cannot do X myself. I can only pester other people until they do it. Different people have different difficulties. Suppose one person's difficulty is 50. Trying to exactly calculate things gets complicated here, but clearly some people would not want to do X without being pestered by me.
Another person's difficulty is just 10, same as for Orchid. We will call this person Jieli. Some possibilities:
1) Orchid has a 100% chance to value X.
2) There is only a 50% chance that Orchid values X.
If Orchid values X, we are assigning the arbitrary valuation of 12. If possibility 1 is true, then the expected gain for Orchid from goal X being completed is 12, while the cost to Jieli is 10. Net benefit for the whole system is +2.
If possibility 2 is true, then the expected gain for Orchid is only 6: 50% chance of 12, and 50% chance of 0. The cost to Jieli remains 10, so the expected net benefit for the whole system is -4 (a loss).
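The two possibilities reduce to a tiny expected-value calculation. A sketch using the arbitrary numbers above (the function name `system_net` is just illustrative):

```python
# Expected net benefit for the whole system (Orchid + Jieli) if Jieli
# does X. Numbers are the arbitrary ones from the text: Orchid's
# valuation is 12 (if she values X at all), Jieli's cost is 10.

valuation = 12
jieli_cost = 10

def system_net(p_orchid_values_x):
    """Expected gain to Orchid minus Jieli's cost of doing X."""
    expected_gain = p_orchid_values_x * valuation
    return expected_gain - jieli_cost

print(system_net(1.0))  # possibility 1: a net gain
print(system_net(0.5))  # possibility 2: a net loss
```

Possibility 1 gives +2 for the system; possibility 2 gives -4, which is why the 50% case makes pestering Jieli a bad deal overall.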
We don't know which possibility is true. But if possibility 2 is true, then I should not pester Jieli to do X, because it would be a net loss.
Some people who don't care about Jieli's wasted effort might still pester her, if they only care about gains for Orchid. Other people would not.
Orchid has the ability to reveal whether possibility 1 or possibility 2 is true. Orchid can also complete goal X herself, with difficulty 10. If Orchid does neither of these things, then it becomes more reasonable to assume that possibility 2 is true. In this way, uncertainty partly resolves itself: people's inaction reveals their preferences. But this only works if Orchid understands that I value Jieli's effort. If people do not share the goal of maximizing the benefit to the overall system, treated as an optimization problem, then there can be miscommunication about why action is not being taken.
Real world example: how often to take out the trash? https://youtu.be/3nllrCss2CU?t=43
If it would be easy for someone to get other people to use the idea, and they don't do it, then I assume they don't want me to pester other people for whom convincing anyone would be much more difficult.