The term "bit" is overloaded. It can mean any of the following:

- An abstract bit (one of two values)
- A bit of data
- A bit of information (also known as a Shannon)
- A bit of evidence
The common theme across all the uses listed above is the number 2. An abstract bit is one of two values. A bit of data is a portion of a representation of a message that cuts the set of possible messages in half (i.e., the log₂ of the number of possible messages). A bit of information (aka a Shannon) is the amount of information in the answer to a yes-or-no question about which the observer was maximally uncertain (i.e., the log₂ of a probability). A bit of evidence in favor of A over B is an observation which provides twice as much support for A as it does for B (i.e., the log₂ of a relative likelihood). Thus, if you see someone using bits as units (a bit of data, or a bit of evidence, etc.) you can bet that they took a log₂ of something somewhere along the way.
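To make the shared log₂ pattern concrete, here is a minimal sketch in Python. The variable names and example numbers are illustrative, not from the text; each quantity is just a log₂ applied to the relevant thing:

```python
import math

# A bit of data: log2 of the number of possible messages.
# 8 possible messages can be encoded in log2(8) = 3 bits of data.
data_bits = math.log2(8)

# A bit of information (a Shannon): -log2 of the probability of an answer.
# Learning the outcome of a fair coin flip (p = 1/2) yields 1 bit.
info_bits = -math.log2(1 / 2)

# A bit of evidence: log2 of the relative likelihood of A over B.
# An observation twice as likely under A as under B is 1 bit of evidence for A.
evidence_bits = math.log2(2 / 1)
```

In each case, doubling the underlying quantity (the number of messages, the improbability of the answer, the likelihood ratio) adds exactly one bit.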
Unfortunately, abstract bits break this pattern, so if you see someone talking about "bits" without disambiguating what sort of bit they mean, the most you can be sure of is that they're talking about something that has to do with the number 2. Unless they're just using the word "bit" to mean "small piece," in which case you're in a bit of trouble.