### Meet Flink, the Apache Software Foundation's Newest Top-Level Project

Re:"Flink" means "clever"! (34 comments)

Actually, flink means nice in Danish.

Oblig. Douglas Adams quote (186 comments)

“After a fairly shaky start to the day, Arthur's mind was beginning to reassemble itself from the shell-shocked fragments the previous day had left him with.

He had found a Nutri-Matic machine which had provided him with a plastic cup filled with a liquid that was almost, but not quite, entirely unlike tea.

The way it functioned was very interesting. When the Drink button was pressed it made an instant but highly detailed examination of the subject's taste buds, a spectroscopic analysis of the subject's metabolism and then sent tiny experimental signals down the neural pathways to the taste centers of the subject's brain to see what was likely to go down well. However, no one knew quite why it did this because it invariably delivered a cupful of liquid that was almost, but not quite, entirely unlike tea.”

Plague, inc. (475 comments)

Good news: after running extensive realistic experiments using the well-known disease simulator Plague Inc., I can conclude that the probability that Ebola will successfully annihilate humanity is quite low (because it didn't spread first and then ramp up the mortality, which seems to be one of the only really winning strategies)! The bad news is that now might be a good time to move to Greenland, since it could still destroy everything else.

Re:There is no "almost impossible" (236 comments)

Actually, you will generally know an upper bound on the length of whatever is encrypted, given the encrypted text. That is something, so it is not quite perfect (in theory, anyway).
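To illustrate the point: even a scheme with perfect secrecy about the content leaks length information. A minimal stdlib-only Python sketch (my own example, not from the thread) using a one-time pad:

```python
import secrets

# Sketch: a one-time pad hides the *content* perfectly, yet the
# ciphertext length still bounds the length of the message.
def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))
ct = otp_encrypt(msg, key)
print(len(ct) == len(msg))  # True: ciphertext length = message length
```

Since XOR is its own inverse, `otp_encrypt(ct, key)` recovers the message; the only metadata an eavesdropper gets for free is the length.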

Re:meh (164 comments)

I think your point counts against you.

I personally do not mind yards and feet too much, but I dislike miles, since the mile depends on the country.

Re:Solution (167 comments)

The question isn't how much you bet against his bet, but how much you are willing to pay him to play a given game with fixed values for the outcomes (i.e., winning, losing and tying). Note this is different from a bet in exactly the way you mention with the 3rd player, except that you pay your opponent and not someone else - i.e., you lose your payment even if you tie.

If you win 75% and he wins the rest, and you get 1 for a win, 0 for a tie and -1 for a loss, then you gain 1 with 75% chance and lose 1 with 25% chance (0.5 on average) and should be willing to give him up to 0.5 before playing the game, because you then come out even (equivalently, the game is fair if you get 0.5 for winning (1 minus the 0.5 you paid up front), -0.5 for tying and -1.5 for losing). Similarly, if you pay him 0.5 and you win 1 with 50% chance and tie and get 0 with the remaining 50%, you are still even.
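For the skeptical, the expected values above can be checked with a few lines of Python (my own sketch, nothing more):

```python
# Sketch checking the expected-value arithmetic above.

def expected_value(p_win, p_tie, v_win, v_tie, v_lose):
    """Expected payoff given win/tie probabilities and the three outcome values."""
    p_lose = 1 - p_win - p_tie
    return p_win * v_win + p_tie * v_tie + p_lose * v_lose

# Win 75%, lose 25%, payoffs +1 / 0 / -1: worth 0.5 on average,
# so paying 0.5 up front makes the game even.
ev = expected_value(0.75, 0.0, 1, 0, -1)
print(ev)  # 0.5

# Equivalently, fold the 0.5 payment into the payoffs: +0.5 / -0.5 / -1.5.
print(expected_value(0.75, 0.0, 0.5, -0.5, -1.5))  # 0.0

# Win 50%, tie 50%, after paying 0.5 up front: also even.
print(expected_value(0.5, 0.5, 0.5, -0.5, -1.5))  # 0.0
```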

Game-theory theorems are mathematical theorems, not physical theories. Building on the given assumptions (e.g., that what we measure is what you are willing to pay up front), the theorems are correct (setting aside possible errors in the proofs - that said, these are von Neumann's normal-form games we are looking at right now, and the proofs are correct under the assumptions used).

Re:Solution (167 comments)

Not really

Re:Solution (167 comments)

Say you increase to paper 2/3 - x and rock 1/3 + x. Then against his paper-50%/rock-50% strategy you win 1/3 - x/2 of the time and he wins 1/6 + x/2 of the time; hence you gain 1/6 - x on average. On the other hand, against his scissors-50%/rock-50% strategy you win 1/2 of the time (whichever you pick, exactly one of his two options loses to it, and he plays each with 50%) and he wins 1/3 - x/2 of the time, for a gain of 1/6 + x/2 to you. Finding the x that maximizes the smaller of the two is easy: it is x = 0. Note that you get 1/6 against the strategy I mentioned for the other guy because you do not do anything really stupid (= play scissors). Also note that maximizing the smaller value is what matters if he is smart enough to figure out your strategy (since he will answer with the strategy leading to the smaller number).
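The claim that x = 0 is the maximizer is easy to verify numerically; a quick Python sketch (mine, grid and names just for illustration):

```python
from fractions import Fraction as F

# Sketch of the perturbation argument: play paper 2/3 - x and rock 1/3 + x
# and look at the guaranteed average against his two undominated strategies.

def guaranteed(x):
    vs_paper_mix = F(1, 6) - x         # against his paper-50%/rock-50%
    vs_scissors_mix = F(1, 6) + x / 2  # against his scissors-50%/rock-50%
    return min(vs_paper_mix, vs_scissors_mix)

grid = [F(k, 100) for k in range(-33, 34)]  # keeps both probabilities valid
best_x = max(grid, key=guaranteed)
print(best_x, guaranteed(best_x))  # 0 1/6
```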

Re:No actual advantage? (167 comments)

This is right yes

Re:Simple.... Odds are even (167 comments)

His Nash strategy (minimax strategy) is to play rock 1/2, scissors 1/3 and paper 1/6 (it is true that rock 1/2 and scissors 1/2 is a best response to the Nash strategy, but that does not mean it is optimal against arbitrary strategies). I have a long earlier post showing the strategies and so on.

Re:Two Games (167 comments)

Note, when I say I can argue that my strategy gives 1/6 on avg., I mean that I did argue it, in a long post above.

Re:Two Games (167 comments)

OK, let's see: you play paper 2/3, rock 1/6 and scissors 1/6 (I shortened the fractions, I hope that is OK).

First, a random round in which you play rock: in those you win 1/3 against his scissors and lose 1/6 against his paper, i.e., you gain 1/6.

Next, a random round in which you play paper: in those you win 1/2 against his rock and lose 1/3 against his scissors, i.e., again you gain 1/6.

Next, a random round in which you play scissors: in those you win 1/6 against his paper and lose 1/2 against his rock, i.e., you LOSE 1/3.

On average you play paper 2/3 of the time and gain 1/6 in those rounds, rock 1/6 of the time and gain 1/6, and scissors 1/6 of the time and LOSE 1/3. Thus, on average: 2/3*1/6 + 1/6*1/6 + 1/6*(-1/3) = 1/12. This is below your claimed lower bound, so there is something wrong with the bound (the culprit is the rounds in which you play scissors: those lose 1/3 on average).

My strategies, played against each other, give 1/6. Thus, you cannot say that yours is always better. I can argue that mine gets 1/6 against ANY strategy, while yours guarantees at most 1/12 (because that is what it gets against my strategy for the other player). Thus, yours cannot be optimal, sorry.
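To make the dispute easy to settle, here is a small Python sketch (mine, not from the puzzle article) that computes the win-minus-loss rate of one mixed strategy against another; the exact split between rock and paper in the disputed mix does not change the result, since both earn 1/6 against this opponent:

```python
from fractions import Fraction as F

# Small sketch computing my-wins-minus-his-wins for mixed strategies
# given as {move: probability} dicts.
beats = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def value(mine, his):
    """My win probability minus his win probability."""
    total = F(0)
    for a, p in mine.items():
        for b, q in his.items():
            if beats[a] == b:
                total += p * q
            elif beats[b] == a:
                total -= p * q
    return total

# The restricted player's minimax mix (rock 1/2, paper 1/6, scissors 1/3):
his = {"rock": F(1, 2), "paper": F(1, 6), "scissors": F(1, 3)}

# The disputed mix (plays scissors 1/6 of the time) vs. mine (no scissors):
disputed = {"paper": F(2, 3), "rock": F(1, 6), "scissors": F(1, 6)}
mine = {"paper": F(2, 3), "rock": F(1, 3)}

print(value(disputed, his))  # 1/12
print(value(mine, his))      # 1/6
```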

Re:I'm the author -- video solution coming soon (167 comments)

"You" are not the guy playing rock 50% of the time; "you" are the guy beating up on the poor guy who has to play rock 50% of the time. Your optimal choice is to play rock 1/3 and paper 2/3 (his is to play rock 1/2, paper 1/6 and scissors 1/3).

Bonus question (2/3 paper 1/3 rock is opt) (167 comments)

Since I already explained the optimal solution to the basic question mentioned in the summary, let's solve the bonus question too. My solution also matches the solution given in the comments on the article's side, so it should be good (and the author says it is correct) - note that currently no answer with a high score is correct; mine has a score of 1.

The bonus question is that you play two rounds, and your opponent must play rock at least once. So, if he plays something other than rock in the first round, he must play rock in the second round and lose it (you just play paper). If he plays rock in the first round, he can play each option with probability 1/3 in the second (which leads to a draw, like normal). Thus, if he plays rock first, it is like normal RPS (because he gets 0 in the next round); otherwise you get one free win (in the second round).

Thus, we can model the first round of the bonus question as the following matrix (the numbers are the number of rounds he wins on average, given the choices in round 1):

|   | R  | P  | S  |
|---|---:|---:|---:|
| R | 0  | -1 | 1  |
| P | 0  | -1 | -2 |
| S | -2 | 0  | -1 |

where you pick columns and he picks rows. We see that rock dominates paper for the row player. We get:

|   | R  | P  | S  |
|---|---:|---:|---:|
| R | 0  | -1 | 1  |
| S | -2 | 0  | -1 |

For the column player, the choice of rock now dominates scissors. We get:

|   | R  | P  |
|---|---:|---:|
| R | 0  | -1 |
| S | -2 | 0  |

Playing rock 1/3 and paper 2/3 for the column player holds him to -2/3 wins on average. Similarly, the row player can guarantee -2/3 wins on average by playing rock 2/3 and scissors 1/3.
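If anyone wants to double-check the dominance reduction, here is a short Python sketch (mine, not from the article) verifying the value of -2/3 directly on the full 3x3 matrix:

```python
from fractions import Fraction as F

# Sketch verifying the matrix above: entries are the number of rounds
# he (the restricted player, picking rows) wins on average.
M = {
    ("R", "R"): F(0),  ("R", "P"): F(-1), ("R", "S"): F(1),
    ("P", "R"): F(0),  ("P", "P"): F(-1), ("P", "S"): F(-2),
    ("S", "R"): F(-2), ("S", "P"): F(0),  ("S", "S"): F(-1),
}
rows = cols = ["R", "P", "S"]

col_mix = {"R": F(1, 3), "P": F(2, 3), "S": F(0)}  # your mix (column player)
row_mix = {"R": F(2, 3), "P": F(0), "S": F(1, 3)}  # his mix (row player)

# Your mix holds him to at most -2/3 wins whichever row he picks:
upper = max(sum(col_mix[c] * M[(r, c)] for c in cols) for r in rows)
# His mix guarantees him at least -2/3 whichever column you pick:
lower = min(sum(row_mix[r] * M[(r, c)] for r in rows) for c in cols)
print(upper, lower)  # -2/3 -2/3: the value of the game
```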

Re:Two Games (167 comments)

There is a flaw in your reasoning. You do not know whether your opponent flipped, so you cannot condition on it like you do here (you cannot play paper all the time when he "flips" rock, because you do not see his coin flip). If you think about it, you should NEVER play scissors: in the best case for you he plays rock 50% and paper 50%, and then scissors gets you 0 in expectation - and since you clearly have an advantage in this game, 0 is not good.

The optimal strategy is to play rock 1/3 and paper 2/3. It gains at least 1/6 against anything he can play. He can similarly ensure that you cannot get more than 1/6 per game by playing rock with probability 1/2, paper with probability 1/6 and scissors with probability 1/3. Your strategy would get less than 1/6 against that (more precisely, you gain 1/6 if you play either rock or paper and you lose 1/3 if you play scissors; therefore you get 1/6*5/6 - 1/3*1/6 = 1/12, which is less than the 1/6 you get for playing rock 1/3 and paper 2/3).

See my post above for an in-depth analysis.

Re:No actual advantage? (167 comments)

You can get an advantage. The important point is to notice that you should never play scissors: scissors gets you 0 in expectation only IF he plays paper 50% and rock 50%, he gets an advantage with it otherwise, and 0 is not good for you.

Re:Solution (167 comments)

Sorry

Also, to be more precise, the strategy for player 1 is to play rock with probability 1/2, paper with probability 1/6 and scissors with probability 1/3.

Solution (167 comments)

Let's call the guy with the restriction player 1 and the other guy player 2.

If you think about it, player 1 has 3 "pure" strategies (in the sense that any other strategy he can play can be seen as a mixture of these 3):

(1) rock 100%,

(2) paper 50%/rock 50% and

(3) scissors 50%/rock 50%.

Against (1) rock gives 0, paper -1 and scissors 1.

Against (2) rock gives 1/2, paper -1/2 and scissors 0.

Against (3) rock gives -1/2, paper 0 and scissors 1/2.

In each case, the number is the probability of player 1 winning minus player 2 winning.

We see that player 2 should not play scissors, because he will never gain anything from it, and clearly his optimal strategy should be able to gain something (we will see that it is 1/6). Then, knowing that, paper 50%/rock 50% is better than rock 100% for player 1: if player 2 plays rock, player 1 gets 1/2 (instead of 0), and if player 2 plays paper, player 1 gets -1/2 (instead of -1).

Hence, we are down to:

(2) paper 50%/rock 50% and (3) scissors 50%/rock 50% vs. rock and paper. If player 1 plays (2) with probability 1/3 and (3) with probability 2/3, he loses 1/6 against both rock and paper. If player 2 plays rock with probability 1/3 and paper with probability 2/3, he gains 1/6 against both (2) and (3). This is optimal, since each player has a way to guarantee that player 1 loses 1/6 and player 2 wins 1/6; if either had a better strategy, it would break the other player's guarantee. (Note that the given strategy for player 1 wins 1/3 against scissors, again showing that scissors is a bad choice for player 2, and that player 2's strategy wins 2/3 against rock 100%, showing that rock 100% is a bad choice for player 1.)
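Since several posts in this thread dispute these numbers, here is a short Python sketch (my own) checking both guarantees directly from the payoff table above:

```python
from fractions import Fraction as F

# Rows: player 1's pure strategies (1) rock 100%, (2) paper/rock 50-50,
# (3) scissors/rock 50-50. Columns: player 2's pure moves. Entries:
# player 1's win probability minus player 2's, as listed above.
A = {
    (1, "R"): F(0),     (1, "P"): F(-1),    (1, "S"): F(1),
    (2, "R"): F(1, 2),  (2, "P"): F(-1, 2), (2, "S"): F(0),
    (3, "R"): F(-1, 2), (3, "P"): F(0),     (3, "S"): F(1, 2),
}

p1 = {1: F(0), 2: F(1, 3), 3: F(2, 3)}        # player 1's minimax mix
p2 = {"R": F(1, 3), "P": F(2, 3), "S": F(0)}  # player 2's minimax mix

# Player 1's mix loses at most 1/6 against every pure column:
p1_worst = min(sum(p1[i] * A[(i, c)] for i in p1) for c in ["R", "P", "S"])
# Player 2's mix wins at least 1/6 against every pure row:
p2_best_for_p1 = max(sum(p2[c] * A[(i, c)] for c in p2) for i in [1, 2, 3])
print(p1_worst, p2_best_for_p1)  # -1/6 -1/6: the value is 1/6 to player 2
```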

Re:Psyops at its finest. (216 comments)

They actually do not need to do any framing or anything. They could simply give up all the files Snowden got in one go; everybody gets angry at them, and in a month the furor over all of this will have died down. Currently, people get annoyed at them every time Snowden releases a new file, and that can continue for a long time yet, which seems way worse for them. Sort of the real version of the boiling frog: in reality the frog jumps out when it gets too hot, but it might not if it only gets warm for a short time.

Re:*NATIONAL* pi day (180 comments)

Well, just a test: clearly it should be 22/7