With the rapid development of mobile devices and wireless technologies, mobile social networks have become increasingly available, and people can build many applications on top of them. Secure computation, such as information exchange and file sharing, is one such application. Fairness in secure computation, meaning that either all parties complete the application or none of them does, is deemed impossible in traditional secure computation without mobile social networks. Here we regard applications in mobile social networks as specific functions and focus on achieving fairness for these functions within mobile social networks in the presence of two rational parties. Rational parties value their utilities when they participate in a secure computation protocol in mobile social networks. We therefore introduce reputation, derived from mobile social networks, into the utility definition so that rational parties have an incentive to complete the applications for a higher utility. To the best of our knowledge, ours is the first fair secure computation protocol in mobile social networks. Furthermore, it finishes within a constant number of rounds, and both parties know the terminal round.
Mobile computing and telecommunications are areas of rapid growth. A mobile social network connects individuals or organizations carrying off-the-shelf, sensor-enabled mobile phones, who share information through social networking applications such as Facebook, MySpace, and scientific collaboration networks [
In the setting of two-party games under incomplete information, two selfish parties wish to maximize their utilities given their private information. Each party has a set of strategies and certain private information, such as a type. Both parties choose their strategies simultaneously or alternately in each round (possibly in a single shot), and the last round leads to an outcome that assigns each party a utility. Cryptography and game theory are both concerned with understanding interactions among mutually distrustful parties with conflicting interests. Cryptographic protocols are designed to protect the private inputs of each party against arbitrary behavior, while game-theoretic protocols are designed to reach various Nash equilibria against rational deviations.
Research shows great increases in communications through mobile phone call, text messages, and the spatial reach of social networks [
On the other hand, users in mobile social networks are assumed to be rational parties who care about their utilities, as in game theory. Wang et al. [
Rational parties in secure computation are expected to cooperate with each other. However, under the traditional utility definition they have no incentive to cooperate, so a new utility definition that gives rational parties such incentives must be considered. Motivated by the fact that reputation derived from mobile social networks can boost cooperation among users, we consider rational secure computation in mobile social networks, where rational parties can exploit the reputation in the networks. In particular, users in mobile social networks are willing to cooperate with those who have a good reputation for cooperation. Furthermore, a good reputation can be transmitted among friends in the networks. For example, if Alice once cooperated with Bob, then Bob's friends are willing to cooperate with Alice, and Bob will cooperate with Alice when they meet again. Reputation is therefore a useful tool to encourage mutual cooperation.
In this paper, we consider only two rational parties who securely compute a function. The parties come from a mobile social network, where each has a reputation value and uses the Tit-for-Tat (TFT) strategy to boost cooperation. Note that reputation affects the way parties achieve their utilities. The rational computation protocol in the presence of such parties is divided into several iterations. At the end of each iteration, both parties gain some utility and update their reputations. This process is similar to a repeated game composed of stage games. Maleka et al. [
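The TFT dynamic described above can be sketched as follows. This is a minimal illustration, not the paper's protocol: the move labels, the function names, and the round count are all illustrative assumptions.

```python
# Hypothetical sketch of the Tit-for-Tat (TFT) strategy: a TFT party
# cooperates in the first iteration and thereafter mirrors the opponent's
# previous move. "C" = cooperate, "F" = fink (labels are illustrative).

def tft(opponent_history):
    """Return this round's move given the opponent's past moves."""
    return "C" if not opponent_history else opponent_history[-1]

def play(rounds, strategy_a, strategy_b):
    """Run an iterated game; return each party's transcript of moves."""
    moves_a, moves_b = [], []
    for _ in range(rounds):
        move_a = strategy_a(moves_b)   # A reacts to B's past moves
        move_b = strategy_b(moves_a)   # B reacts to A's past moves
        moves_a.append(move_a)
        moves_b.append(move_b)
    return moves_a, moves_b

# Two TFT parties sustain mutual cooperation in every round.
a_moves, b_moves = play(5, tft, tft)
```

As expected of TFT, mutual cooperation, once started, persists for the whole interaction.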
Our setting is approximately similar to that of Groce and Katz [ The main question in rational two-party computation in mobile social networks is how to facilitate cooperation among the parties so that the protocol completes (as in the prisoners' dilemma game). In game theory, and especially in repeated games, TFT is an efficient strategy for promoting cooperation. In fact, this seemingly simple and quite natural strategy defeated all other strategies in Axelrod's prisoners' dilemma tournament [

In previous works, parties in rational multiparty computation have no private types. Namely, the fact that parties are rational is common knowledge (common knowledge of an event between two parties means that each party knows the event, knows that the other party knows it, and so on [ Under this scenario, parties choose their strategies based on the preceding actions when executing the protocol. The preceding actions form a reputation for a certain type. For example, in mobile social networks, people who often help others have a good reputation, while people who often deceive others have a bad one. In rational computation under incomplete information, parties need to build a good reputation if they want to obtain the computation results. On the other hand, parties should reveal their private types to others through their actions; otherwise, other parties may always adopt their dominant strategies, which may lead to lower utilities.

Traditional utility assumptions in rational multiparty computation include two aspects: (i) In this paper, there are two private types of parties: rational parties, who always adopt their dominant strategies, and TFTer parties, who follow the TFT strategy. Each party knows his own private type and has a prior probability
Loosely speaking, we assume that there are two parties (each has his private type), say
Section
We first introduce the concept of the
For simplicity, we define the following outcomes:
Here
In repeated games, parties interact in several periods and take actions simultaneously or nonsimultaneously in each stage game
Since rational parties are allowed to have utilities, we may as well regard them as part of a social network and endow them with an additional attribute: reputation. Reputation plays an important role when distrustful parties interact under incomplete information, where each party has only a prior probability over the types of the other parties. The famous prisoners' dilemma game under incomplete information illustrates how reputation encourages reciprocal cooperation in multistage games. In this paper we use reputation effects for our purpose. Put differently, a rational party values his reputation, because a high reputation can attract other parties to cooperate with him and thus boost his total utility. That is, reputation makes a difference to the utilities. Precisely for this reason, we introduce another utility assumption: rational parties under incomplete information think highly of their reputations. The definition of reputation in this paper is in accordance with [
Let
The reputation is not static. If there are no specific instructions, in the following sections, we denote by
After the
Reputation
| | Cooperation by | Fink by |
|---|---|---|
| >0 | | |
| <0 | | |
| =0 | | |
Under incomplete information, each party has a private type. Here we assume that parties have two types: rational parties, who maximize their utilities, and TFTer parties, who adopt the TFT strategy. Obviously, the utility is higher when both parties obtain the correct value than when they do not. Parties tend to cooperate with parties that have a high reputation, and the more frequently parties cooperate, the higher the utilities they obtain. Thus parties have an incentive to cooperate with others in order to maintain a high reputation; meanwhile, a high reputation in turn makes it easier for the other party to cooperate. This forms a virtuous cycle. We simulate the reputation value of Definition
Reputation for each party when
We observe that reputation decreases once a party deviates, so parties have no incentive to deviate in any stage game if they want to preserve a high reputation. With this in mind, we give the third assumption, which treats the protocol as a long-term process.
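The monotonicity just observed (cooperation raises reputation, deviation lowers it) can be captured by a toy update rule. The paper's exact formula is not reproduced here; the unit increment below is purely an illustrative assumption.

```python
# Toy reputation update consistent with the text: cooperation raises
# reputation, finking lowers it. The magnitude (delta=1) is an assumption;
# the paper's actual definition may weight updates differently.

def update_reputation(rep, action, delta=1):
    """Return the new reputation after one stage game."""
    if action == "cooperate":
        return rep + delta
    if action == "fink":
        return rep - delta
    raise ValueError("unknown action: " + repr(action))

# A party that mostly cooperates ends with a higher reputation than it
# started with, despite one deviation along the way.
rep = 0
for act in ["cooperate", "cooperate", "fink", "cooperate"]:
    rep = update_reputation(rep, act)
# trace: 0 -> 1 -> 2 -> 1 -> 2
```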
In fact, the reputation assumption is a virtual part of the utility definition. Its main role is to warn the other party not to fink; otherwise, the protocol degenerates into mutual finking, in which case neither party gets the correct result and both utilities decrease in the long run. In other words, although reputation does not affect the direct utilities in the current iteration, it does affect future utilities. In future work, we will make the reputation assumption a real part of the utility definition.
In the ideal world, where a trusted third party (TTP) exists, it is trivial to achieve fairness. For completeness, we present the two-party ( Each party knows his own private type, and the other party has only a prior probability over the private type of his opponent.
Each party If Each party outputs some values and the protocol ends.
At the end of the protocol, both parties either get utility 1 (when both parties follow the protocol) or get utility 0 (when at least one party sends
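The ideal-world interaction can be sketched as follows. The abort symbol, the utility values 1 and 0, and the all-or-nothing delivery follow the text above; the function signature and message format are illustrative assumptions.

```python
# Minimal sketch of the ideal world: each party sends its input (or the
# abort symbol, modeled as None) to the trusted third party (TTP). The TTP
# returns f(x, y) to both parties only if neither aborted, so fairness is
# trivial: both learn the result, or neither does. Utilities follow the
# text: 1 when both parties follow the protocol, 0 otherwise.

def ttp_compute(f, x, y):
    """Return ((output_a, output_b), (utility_a, utility_b))."""
    if x is None or y is None:      # some party sent the abort symbol
        return (None, None), (0, 0)
    out = f(x, y)
    return (out, out), (1, 1)       # both parties learn the result together

xor = lambda a, b: a ^ b
outputs, utils = ttp_compute(xor, 1, 0)
```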
It is more complex to construct a protocol completing the computation without a TTP. A hybrid protocol including two stages is first proposed as a transition. The first stage is an ideal functionality
In the second stage, one party, say
As the results of [
The protocols in this paper have finitely many rounds, and the parties know the last round when the protocols terminate. We will prove that mutual cooperation is a sequential equilibrium. Demonstrating a sequential equilibrium, especially in the last round, is cumbersome; nevertheless, such a sequential equilibrium does exist [
Given
In the complete information scenario, there is no two-party protocol to compute functionality
Just as Groce and Katz [
(1) If either input is invalid, then
(2)
(a) Choose a value secret sharing scheme), so
(b) Generate two shares
(c) Randomly select two
(3)
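A first stage of this kind can be sketched in the style of Groce-Katz share-generation functionalities. This is a hedged illustration only: the 2-out-of-2 XOR sharing, the geometric choice of the terminal round, and the parameter names are assumptions, since the paper's exact construction is elided above.

```python
import random

# Sketch of a ShareGen-style first stage: compute the output, split it into
# two XOR shares (one per party), and sample a secret terminal round i_star.
# The geometric sampling with parameter alpha is an illustrative assumption.

def share(bit):
    """Split one output bit into two 2-out-of-2 XOR shares."""
    s0 = random.randrange(2)
    return s0, s0 ^ bit            # s0 XOR s1 == bit

def share_gen(f, x, y, alpha=0.25, max_rounds=50):
    """Return the hidden terminal round and one share for each party."""
    out = f(x, y)
    i_star = 1                      # terminal round, unknown to the parties
    while random.random() > alpha and i_star < max_rounds:
        i_star += 1
    s_a, s_b = share(out)
    return {"i_star": i_star, "share_a": s_a, "share_b": s_b}

state = share_gen(lambda x, y: x & y, 1, 1)
recovered = state["share_a"] ^ state["share_b"]   # always the true output
```

Individually, each share is a uniformly random bit and reveals nothing about the output; only the two shares together reconstruct it.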
(1) where
(2) using the TFT strategy. We highlight two premises: (a) each party satisfies assumptions (a)–(c); (b) meanwhile, parties do not know for certain whether their opponents are TFTer parties. Note: the utility assumptions and the incomplete information compel cooperation before round
(3)
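The second (reconstruction) stage can be sketched as a round-by-round exchange of the shares produced in the first stage, with the TFT rule that the exchange stops once a party finks. This is a simplified model of the dynamics described above, not the paper's exact protocol; in particular it uses a single pair of XOR shares rather than per-round shares.

```python
# Sketch of the reconstruction stage: in each round both parties are
# supposed to send their shares (cooperate). A party learns the output in a
# round where the *other* party cooperated; per TFT, the exchange stops
# after the first fink. Moves are given as per-round booleans.

def reconstruct(share_a, share_b, moves_a, moves_b):
    """moves_a/moves_b: True = send share (cooperate), False = fink.
    Returns (output_a, output_b), where None means 'never learned'."""
    out_a = out_b = None
    for coop_a, coop_b in zip(moves_a, moves_b):
        if coop_b:                      # B sent its share: A reconstructs
            out_a = share_a ^ share_b
        if coop_a:                      # A sent its share: B reconstructs
            out_b = share_a ^ share_b
        if not (coop_a and coop_b):
            break                       # TFT: exchange stops after a fink
    return out_a, out_b
```

Under mutual cooperation both parties reconstruct the output; a unilateral fink leaves the finking party with the output and the honest party without it, which is exactly the unfairness the utility and reputation assumptions are designed to deter.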
Given the utility assumptions (a) and (b) and reputation assumption (c), there exists a completely fair protocol
We will first analyze the protocol.
When
When
To sum up, fairness is achieved in both settings. The round complexity is
The most important property of our protocol is that it achieves fairness in rational secure two-party computation. Although fairness has been achieved in previous works, this is the first time it is achieved through reputation assumptions, with the parties in the protocol adopting the TFT strategy. Fairness is essential in most secure multiparty computations, such as electronic voting and electronic auctions. Take electronic voting, for instance: voters vote for candidates and wish to receive a fair and correct result; that is, the result cannot be biased by adversaries and should truly reflect the voters' opinions. Traditional secure multiparty computation cannot achieve fairness and therefore cannot prevent adversaries from biasing the result. Fortunately, rational secure multiparty computation can. On the one hand, our rational protocols guarantee that every party receives the same voting result; on the other hand, the adversary cannot bias the result.
The application of protocol
(1) Voters run
(2) Voters run protocol
(3) Voters output what they received in the protocol.
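The voting application can be sketched by treating the fair protocol as a black box with all-or-nothing delivery, as guaranteed above. The `fair_compute` stub and the 0/1 vote encoding are illustrative assumptions standing in for the paper's protocol.

```python
# Toy sketch of the voting application: two voters use a fair two-party
# computation (abstracted as a black box that delivers the result to both
# parties or to neither) to learn the tally, so no single party can bias
# or withhold the outcome.

def fair_compute(f, x, y, both_follow=True):
    """Abstraction of the fair protocol: all-or-nothing delivery."""
    if not both_follow:
        return None, None           # neither party learns the result
    out = f(x, y)
    return out, out                 # both parties receive the same result

tally = lambda v1, v2: v1 + v2      # votes are 0/1 for a single candidate
res_a, res_b = fair_compute(tally, 1, 0)
```

Both voters receive the same tally; if either deviates, neither learns it, which removes the incentive to deviate.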
We prove that, given proper parameters, fairness can be achieved in protocol
The importance of security guarantees in mobile social networks and telecommunication services is rapidly increasing, since applications in mobile social networks are more and more popular. Fairness is becoming a prominent aspect of secure computation, especially between two rational parties. Game theory opens up another avenue for the intensive study of fairness in secure multiparty computation. Asharov et al. [
Inspired by the fact that parties in mobile social networks value their reputations, which can boost cooperation between two rational parties, we modify the utility definition and allow parties to take into account the effect of reputation derived from mobile social networks when they interact in the protocol. The results show that cooperation appears before the last "few" rounds even when the parties know the terminal round in finitely repeated games under incomplete information. We then construct a protocol in the manner of Groce and Katz [
An abstract of this paper was presented at the INCOS 2013 conference, pages 309–314, 2013 [
The authors declare that there is no conflict of interests regarding the publication of this paper.
This work was supported by the Natural Science Foundation of China under Grant nos. 61173139 and 61202475, the Natural Science Foundation of Shandong Province under Grant no. BS2014DX016, and the Ph.D. Programs Foundation of Ludong University under Grant no. LY2015033.