
Social media can predict what you’ll say, even if you don’t participate

Tuesday, January 22, 2019, 15:26, by Ars Technica
(Image credit: Marie Slim / Flickr)
There have been a number of high-profile criminal cases that were solved using the DNA that family members of the accused placed in public databases. One lesson there is that our privacy isn't entirely under our control; by sharing DNA with you, your family has the ability to choose what everybody else knows about you.
Now, some researchers have demonstrated that something similar is true about our words. Using a database of past tweets, they were able to effectively pick out the next words a user was likely to use. But they were able to do so more effectively if they simply had access to what a person's contacts were saying on Twitter.
Entropy is inescapable
The work was done by three researchers at the University of Vermont: James Bagrow, Xipei Liu, and Lewis Mitchell. It centers on three different concepts relating to the informational content of messages on Twitter. The first is entropy, which in this context describes how many bits are needed, on average, to capture the uncertainty about future word choices. One way of looking at this is that, if you're certain the next word will be chosen from a list of 16, then the entropy is four (2⁴ is 16). The average social media user has a 5,000-word vocabulary, so choosing at random from it would give an entropy of a bit more than 12. The researchers also considered the perplexity, which is the number obtained by raising two to the entropy: 16 in the example just given, where the entropy is four.
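As a rough illustration (not the researchers' actual estimator, which works on real tweet streams rather than uniform word choices), the entropy and perplexity figures in the example above can be reproduced with a few lines of Python:

import math
from collections import Counter

def entropy_bits(words):
    """Shannon entropy, in bits per word, of an observed word stream."""
    counts = Counter(words)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def perplexity(entropy):
    """Perplexity: the effective number of equally likely word choices."""
    return 2 ** entropy

# A uniform choice among 16 words gives an entropy of 4 bits and a perplexity of 16,
# matching the article's example.
uniform_16 = [f"word{i}" for i in range(16)]
h = entropy_bits(uniform_16)
print(h, perplexity(h))  # 4.0 16.0

# Choosing uniformly from a 5,000-word vocabulary gives a bit more than 12 bits.
print(math.log2(5000))  # ~12.3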
https://arstechnica.com/?p=1444407