CENTENARY CONFERENCE, 2011 - ELECTRICAL ENGINEERING, INDIAN INSTITUTE OF SCIENCE, BANGALORE

arXiv:1201.0461v1 [cs.GT] 2 Jan 2012

Pattern Clustering using Cooperative Game Theory

Swapnil Dhamal, Satyanath Bhat, Anoop K R, and Varun R Embar

Abstract—In this paper, we approach the classical problem of clustering using solution concepts from cooperative game theory such as Nucleolus and Shapley value. We formulate the problem of clustering as a characteristic form game and develop a novel algorithm, DRAC (Density-Restricted Agglomerative Clustering), for clustering. With extensive experimentation on standard data sets, we compare the performance of DRAC with that of well-known algorithms. We show an interesting result that four prominent solution concepts, Nucleolus, Shapley value, Gately point and τ-value, coincide for the defined characteristic form game. This vindicates the choice of the characteristic function of the clustering game and also provides a strong intuitive foundation for our approach.

Index Terms—Pattern clustering, Characteristic form game, Nucleolus, Shapley value.

I. INTRODUCTION

CLUSTERING, or unsupervised classification of patterns into groups based on similarity, is a very well studied problem in pattern recognition, data mining, information retrieval, and related disciplines. Besides, clustering has also been used in solving extremely large scale problems. Clustering also acts as a precursor to many data processing tasks including classification. According to Backer and Jain [2], in cluster analysis, a group of objects is split into a number of more or less homogeneous subgroups on the basis of an often subjectively chosen measure of similarity (i.e., chosen subjectively based on its ability to create interesting clusters), such that the similarity between objects within a subgroup is larger than the similarity between objects belonging to different subgroups. A key problem in the clustering domain is to determine the number of output clusters k. Use of cooperative game theory provides a novel way of addressing this problem by using a variety of solution concepts.

In the rest of this section, we justify the use of game theoretic solution concepts, specifically Nucleolus, for pattern clustering, give an intuition why the various solution concepts coincide, and refer to a few recent works in clustering using game theory. In Section II, we provide a brief introduction to the relevant solution concepts in cooperative game theory. Section III explains our model and algorithm for clustering based on cooperative game theory. In Section IV, we describe the experimental results and provide a comparison of our algorithm with some existing related ones. The coincidence of Nucleolus, Shapley value, Gately point and τ-value with the chosen characteristic function is discussed and formally proved in Section V. We conclude with future work in Section VI.

We motivate the use of game theory for pattern clustering with an overview of a previous approach. SHARPC [1] proposes a novel approach to find the cluster centers in order to give a good start to K-means, which thus results in the desired clustering. The limitation of this approach is that it is restricted to K-means, which is not always desirable, especially when the classes have unequal variances or when they lack convex nature. We, therefore, extend this approach to a more general clustering problem in R².

As will be clear in Section II, Shapley value is based on average fairness, Gately point is based on stability, τ-value is based on efficiency, while Nucleolus is based on both min-max fairness and stability. Hence, it is worthwhile exploring these solution concepts to harness their properties for the clustering game. Of these solution concepts, the properties of Nucleolus, viz., fairness and stability, are the most suitable for the clustering game. Moreover, we show in Section V that all these solution concepts coincide for the chosen characteristic function. As finding Nucleolus, for instance, is computationally expensive, it is to our advantage if we use the computational ease of other solution concepts. We see in Section III that for the chosen characteristic function, the Shapley value can be computed in polynomial time. So for our algorithm, we use Shapley value, which is equivalent to using any or all of these solution concepts.

The prime reason for the coincidence of the relevant solution concepts is that the core, which we will see in Section II-A, is symmetric about a single point and all these solution concepts coincide with that very point. We will discuss this situation in detail and prove it formally in Section V.

There have been approaches proposing the use of game theory for pattern clustering. Garg, Narahari and Murthy [1] propose the use of Shapley value to give a good start to K-means. Gupta and Ranganathan [11], [12] use a microeconomic game theoretic approach for clustering, which simultaneously optimizes two objectives, viz., compaction and equipartitioning. Bulo and Pelillo [10] use the concept of evolutionary games for hypergraph clustering. Chun and Hokari [8] prove the coincidence of Nucleolus and Shapley value for queueing problems.

The contributions of our work are as follows:
• We explore game theoretic solution concepts for the clustering problem.
• We prove coincidence of Nucleolus, Shapley value, Gately point and τ-value for the defined game.
• We propose an algorithm, DRAC (Density-Restricted Agglomerative Clustering), which overcomes the limitations of K-means, Agglomerative clustering, DBSCAN [13] and OPTICS [14] using game theoretic solution concepts.

II. PRELIMINARIES

In this section, we provide a brief insight into the cooperative game theory concepts [4], [7], [8], viz., Core, Nucleolus, Shapley value, Gately point and τ-value.

A cooperative game (N, ν) consists of two parameters, N and ν. N is the set of players and ν : 2^N → R is the characteristic function. It defines the value ν(S) of any coalition S ⊆ N.

A. The Core

Let (N, ν) be a coalitional game with transferable utility (TU). Let x = (x₁, ..., xₙ), where x_i represents the payoff of player i. The core consists of all payoff allocations x = (x₁, ..., xₙ) that satisfy the following properties:
1) individual rationality, i.e., x_i ≥ ν({i}) ∀i ∈ N
2) collective rationality, i.e., Σ_{i∈N} x_i = ν(N)
3) coalitional rationality, i.e., Σ_{i∈S} x_i ≥ ν(S) ∀S ⊆ N
A payoff allocation satisfying individual rationality and collective rationality is called an imputation.

B. The Nucleolus

Nucleolus is an allocation that minimizes the dissatisfaction of the players relative to the allocation they can receive in a game [5]. For every imputation x, consider the excess defined by

e_S(x) = ν(S) − Σ_{i∈S} x_i

e_S(x) is a measure of unhappiness of S with x. The goal of Nucleolus is to minimize the unhappiness of the most unhappy coalition, i.e., the largest of the e_S(x). The linear programming formulation is as follows:

min Z
subject to
Z + Σ_{i∈S} x_i ≥ ν(S) ∀S ⊆ N
Σ_{i∈N} x_i = ν(N)

The reader is referred to [7] for the detailed properties of Nucleolus. It combines a number of fairness criteria with stability. It is the imputation which is lexicographically central and thus fair and optimal in the min-max sense.
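To make the core and excess definitions concrete, here is a minimal sketch (the 3-player game and its values are hypothetical, not taken from the paper) that checks the three core conditions and computes the excesses e_S(x) of an imputation:

```python
from itertools import combinations

# A hypothetical 3-player TU game (illustrative values, not from the paper).
N = [1, 2, 3]
v = {(): 0, (1,): 0, (2,): 0, (3,): 0,
     (1, 2): 4, (1, 3): 4, (2, 3): 4, (1, 2, 3): 9}

def in_core(x):
    """Check individual, collective and coalitional rationality for payoff x."""
    if any(x[i] < v[(i,)] for i in N):                  # individual rationality
        return False
    if abs(sum(x.values()) - v[tuple(N)]) > 1e-9:       # collective rationality
        return False
    return all(sum(x[i] for i in S) >= v[S] - 1e-9      # coalitional rationality
               for r in range(1, len(N)) for S in combinations(N, r))

def excesses(x):
    """Excess e_S(x) = v(S) - sum_{i in S} x_i for every proper coalition S."""
    return {S: v[S] - sum(x[i] for i in S)
            for r in range(1, len(N)) for S in combinations(N, r)}

x = {1: 3, 2: 3, 3: 3}            # the symmetric imputation
print(in_core(x))                 # True: every coalition is satisfied
print(max(excesses(x).values()))  # -2: even the unhappiest coalition has slack
```

The Nucleolus is the imputation that lexicographically minimizes this sorted vector of excesses; the sketch only evaluates them for one candidate allocation.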
C. The Shapley Value

Any imputation φ = (φ₁, ..., φₙ) is a Shapley value if it follows the axioms which are based on the idea of fairness. The reader is referred to [4] for the detailed axioms. For any general coalitional game with transferable utility (N, ν), the Shapley value of player i is given by

φ_i = (1/n!) Σ_{S : i∈S} (|S| − 1)! (n − |S|)! [ν(S) − ν(S − i)]
    = (1/n!) Σ_{π∈Π} x_i^π

where Π is the set of all permutations on N and x_i^π is the contribution of player i to permutation π.

D. The Gately Point

Player i's propensity to disrupt the grand coalition is defined to be the following ratio [4]:

d_i(x) = (Σ_{j≠i} x_j − ν(N − i)) / (x_i − ν(i))    (1)

If d_i(x) is large, player i may lose something by deserting the grand coalition, but the others will lose a lot more. The Gately point of a game is the imputation which minimizes the maximum propensity to disrupt. The general way to minimize the largest propensity to disrupt is to make all of the propensities to disrupt equal. When the game is normalized so that ν(i) = 0 for all i, the way to set all the d_i(x) equal is to choose x_i in proportion to ν(N) − ν(N − i):

G_i^ν = [ (ν(N) − ν(N − i)) / Σ_{j∈N} (ν(N) − ν(N − j)) ] ν(N)

E. The τ-value

τ-value is the unique solution concept which is efficient and has the minimal right property and the restricted proportionality property. The reader is referred to [6] for the details of these properties. For each i ∈ N, let

M_i(ν) = ν(N) − ν(N − i)  and  m_i(ν) = ν(i)    (2)

Then the τ-value selects the maximal feasible allocation on the line connecting M(ν) = (M_i(ν))_{i∈N} and m(ν) = (m_i(ν))_{i∈N} [8]. For each convex game (N, ν),

τ(ν) = λ M(ν) + (1 − λ) m(ν)    (3)

where λ ∈ [0, 1] is chosen so as to satisfy

Σ_{i∈N} [λ(ν(N) − ν(N − i)) + (1 − λ) ν(i)] = ν(N)    (4)
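These solution concepts can be computed directly for small games. The sketch below (again a hypothetical game, with ν(i) = 0 for all singletons; `shapley` enumerates all join orders, `gately` uses the normalized closed form above) illustrates that for an arbitrary game the Shapley value and the Gately point need not agree, which is what makes their coincidence in the clustering game noteworthy:

```python
from itertools import permutations

# Hypothetical 3-player game (illustrative values; v(i) = 0 for all singletons).
N = (1, 2, 3)
v = {(): 0, (1,): 0, (2,): 0, (3,): 0,
     (1, 2): 4, (1, 3): 2, (2, 3): 0, (1, 2, 3): 8}

def val(S):
    return v[tuple(sorted(S))]

def shapley():
    """phi_i = average marginal contribution of i over all join orders."""
    phi = dict.fromkeys(N, 0.0)
    perms = list(permutations(N))
    for order in perms:
        seen = []
        for i in order:
            phi[i] += val(seen + [i]) - val(seen)
            seen.append(i)
    return {i: phi[i] / len(perms) for i in N}

def gately():
    """Closed form for a game normalized so that v(i) = 0 for all i."""
    w = {i: val(N) - val(set(N) - {i}) for i in N}
    return {i: w[i] * val(N) / sum(w.values()) for i in N}

print(shapley())   # player 1 gets 11/3 here
print(gately())    # player 1 gets 32/9 here: the two concepts differ in general
```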
III. A MODEL AND ALGORITHM FOR CLUSTERING BASED ON COOPERATIVE GAME THEORY

For the clustering game, the characteristic function is chosen as in [1]:

ν(S) = (1/2) Σ_{i,j∈S, i≠j} f(d(i,j))    (5)

In Equation 5, d is the Euclidean distance and f : d → [0, 1] is a similarity function. Intuitively, if two points i and j have a small Euclidean distance, then f(d(i,j)) approaches 1. The similarity function that we use in our implementation is

f(d(i,j)) = 1 − d(i,j)/d_M    (6)

where d_M is the maximum of the distances between all pairs of points in the dataset.

When Equation 5 is used as the characteristic function, it is shown in [1] that the Shapley value of player i can be computed in polynomial time and is given by

φ_i = (1/2) Σ_{j∈N, j≠i} f(d(i,j))    (7)

Also, from Equation 5, it can be derived that

ν(S) = Σ_{T⊆S, |T|=2} ν(T)    (8)

In Sections I and II, we have discussed the benefits of imputations resulting from various game theoretic solution concepts. Also, in Section V, we will show that all these imputations coincide. Moreover, as Equation 7 shows the ease of computation of the Shapley value in the clustering game with the chosen characteristic function, we use the Shapley value as the base solution concept for our algorithm.

The basic idea behind the algorithm is that we expand the clusters based on density. From Equations 6 and 7, the Shapley value represents density in some sense. For every cluster, we start with an unallocated point with the maximum Shapley value and assign it as the cluster center. If that point has high density around it, it should only consider the close-by points; otherwise it should consider more faraway points. We implement this idea in step 5 of Algorithm 1 with parameter β. For the point with the globally maximum Shapley value, β = δ, while it is lower for other cluster centers. Also, as we go from the cluster center with the highest Shapley value to those with lower values, we do not want to degrade the value of β linearly. So we use a square-root function in step 5. Alternatively, it can be replaced with any other function which ensures sub-linear degradation of β. The input parameters δ and γ should be changed accordingly.

Secondly, when the density around a point is very low as compared to the density around the cluster center of the cluster of which it is a part, that point should not be responsible for further growth of the cluster. This ensures that clusters are not merged together when they are connected with a thin bridge of points. It also ensures that the density within a cluster does not vary beyond a certain limit. We implement this idea with what we call an expansion queue. We add points to the queue only if their Shapley value is at least a γ-multiple of that of the cluster center of the cluster of which they are a part. The expansion queue is responsible for the growth of a cluster, and the growth ceases once the queue is empty. The detailed and systematic steps are given in Algorithm 1.
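Equations 5-7 are straightforward to implement. A minimal sketch follows (the four 2-D points are made up for illustration); it computes ν and the polynomial-time Shapley values and checks the efficiency property Σ_i φ_i = ν(N):

```python
import math
from itertools import combinations

# Four illustrative 2-D points (made up for this sketch): three close, one far away.
pts = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0), (5.0, 5.0)]
n = len(pts)

d = lambda i, j: math.dist(pts[i], pts[j])
d_M = max(d(i, j) for i, j in combinations(range(n), 2))
f = lambda i, j: 1.0 - d(i, j) / d_M              # similarity, Equation 6

def nu(S):
    """Equation 5: nu(S) is the sum of similarities over unordered pairs in S."""
    return sum(f(i, j) for i, j in combinations(sorted(S), 2))

def phi(i):
    """Equation 7: Shapley value of point i, computed in O(n) time."""
    return 0.5 * sum(f(i, j) for j in range(n) if j != i)

# Efficiency: the Shapley values sum to nu(N).
assert abs(sum(phi(i) for i in range(n)) - nu(range(n))) < 1e-9
# The isolated point gets a much lower Shapley value than the dense trio.
assert phi(3) < phi(0)
print([round(phi(i), 3) for i in range(n)])
```

The second assertion shows why the paper treats the Shapley value as a density surrogate: the far-away point contributes little similarity to any coalition.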
Algorithm 1 Density-Restricted Agglomerative Clustering (DRAC)
Require: Dataset, maximum threshold for similarity δ ∈ [0,1] and threshold for Shapley value multiplicity γ ∈ [0,1]
1: Find the pairwise similarity between all points in the dataset.
2: For each point i, compute the Shapley value using Equations 6 and 7.
3: Arrange the points in non-increasing order of their Shapley values. Let g_M be the global maximum of the Shapley values. Start a new queue, the expansion queue.
4: Start a new cluster. Of all the unallocated points, choose the point with the maximum Shapley value as the new cluster center. Let l_M be its Shapley value. Mark that point as allocated. Add it to the expansion queue.
5: Set β = δ √(l_M / g_M).
6: For each unallocated point, if the similarity of that point to the first point in the expansion queue is at least β, add it to the current cluster and mark it as allocated. If the Shapley value of that point is at least a γ-multiple of l_M, add it to the expansion queue.
7: Remove the first point from the expansion queue.
8: If the expansion queue is not empty, go to step 6.
9: If the cluster center is the only point in its cluster, mark it as noise.
10: If all points are allocated a cluster, terminate. Else go to step 4.
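The steps above can be sketched compactly as follows (an unoptimized illustration assuming 2-D Euclidean points; the function and variable names are ours, and ties in step 4 are broken arbitrarily):

```python
import math
from collections import deque
from itertools import combinations

def drac(pts, delta, gamma):
    """Sketch of Algorithm 1 (DRAC). Returns one cluster label per point; -1 = noise."""
    n = len(pts)
    d = lambda i, j: math.dist(pts[i], pts[j])
    d_M = max(d(i, j) for i, j in combinations(range(n), 2))
    sim = lambda i, j: 1.0 - d(i, j) / d_M                        # Equation 6
    # Steps 1-2: Shapley values via Equation 7 (density surrogate).
    phi = [0.5 * sum(sim(i, j) for j in range(n) if j != i) for i in range(n)]
    g_M = max(phi)                                                # Step 3
    labels = [None] * n
    cluster = 0
    while any(l is None for l in labels):
        # Step 4: unallocated point of maximum Shapley value becomes the center.
        center = max((i for i in range(n) if labels[i] is None),
                     key=lambda i: phi[i])
        l_M = phi[center]
        labels[center] = cluster
        beta = delta * math.sqrt(l_M / g_M)                       # Step 5
        queue = deque([center])                                   # expansion queue
        while queue:                                              # Steps 6-8
            head = queue.popleft()
            for i in range(n):
                if labels[i] is None and sim(head, i) >= beta:
                    labels[i] = cluster
                    if phi[i] >= gamma * l_M:
                        queue.append(i)
        if labels.count(cluster) == 1:                            # Step 9: singleton
            labels[center] = -1                                   #   center -> noise
        cluster += 1                                              # Step 10
    return labels

# Two well-separated blobs: DRAC should report exactly two clusters.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(drac(pts, 0.8, 0.5))
```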
IV. EXPERIMENTAL RESULTS

In this section, we qualitatively compare our algorithm with some existing related algorithms. SHARPC [1] gives a good start to K-means using a game theoretic solution concept, viz., the Shapley value. As our algorithm hierarchically allocates points to a cluster starting from a cluster center, we compare it with Agglomerative Clustering. The way our characteristic function and similarity function are defined, the Shapley value represents density in some sense. So we compare our algorithm with the density-based ones, viz., DBSCAN (Density-Based Spatial Clustering of Applications with Noise) and OPTICS (Ordering Points To Identify the Clustering Structure). Throughout this section, 'cluster (<colored marker>)' refers to the cluster marked by that colored marker in the corresponding figure. Noise is represented by (◦).

Fig. 1. Clusters as discovered by SHARPC

Figure 1 shows the clusters formed by SHARPC [1], which tries to allocate clusters by enclosing points in equal-sized spheres. It cannot detect clusters that are not convex. Also, the cluster (×) is a merging of three different clusters. If the threshold is increased so as to solve the second problem, more clusters are formed and the larger clusters get subdivided into several smaller clusters.

Fig. 2. Clusters as discovered by Agglomerative Clustering

Agglomerative Clustering, as Figure 2 shows, can detect clusters of any shape and size. But owing to a constant threshold for the growth of all clusters, it faces the problem of forming several clusters in the lower right part when they should have been part of one single cluster. If the threshold is decreased so as to solve this problem, clusters (∗) and (∗) get merged. Another problem is that the bridge connecting the two classes merges these into one single cluster (∗).

Fig. 3. Clusters as discovered by DBSCAN

Figure 3 shows the results of DBSCAN [13]. It is well known that it cannot detect clusters with different densities in general. The points in the lower right part are detected as noise when, intuitively, the region is dense enough to be classified as a cluster. An attempt to do so compromises the classification of clusters (∗) and (∗) as distinct. Moreover, the bridge connecting the two classes merges them into one single cluster (∗). An attempt to do the required classification leads to unnecessary subdivision of the rightmost class and more points being detected as noise.

Fig. 4. Clusters as discovered by OPTICS

The clustering obtained using OPTICS [14] is shown in Figure 4. Unlike DBSCAN, clusters (∗) and (∗) are detected as distinct. However, the points in the lower right part are detected as noise when they should have been classified as one cluster. The reachability plots for different values of minpts are such that an attempt to classify some of these points as a part of some cluster leads to the merging of clusters (∗) and (∗). If we continue trying to get more of these points allocated, the bridge plays the role of merging the two clusters (∗) and (∗).

Fig. 5. Clusters as discovered by DRAC

Figure 5 shows the clustering obtained using Density-Restricted Agglomerative Clustering (DRAC). As cluster (+) is highly dense, its cluster center has a very high Shapley value, resulting in a very high value of β, the similarity threshold. No point in cluster (∗) crosses the required similarity threshold with the points in cluster (+), thus ensuring that the two clusters are not merged. The points in the central part of the bridge have extremely low Shapley values as compared to the cluster center of cluster (+), and so they fail to cross the Shapley value threshold of having at least a γ-multiple of the Shapley value of the cluster center. This ensures that they are not added to the expansion queue of the cluster, thus preventing the cluster growth from extending to cluster (∗). Cluster (∗) extends to the relatively low density region because of points being added to the expansion queue owing to their sufficiently high Shapley value, at least a γ-multiple of the Shapley value of the cluster center. Cluster (×) is a low density cluster owing to the low Shapley value of the cluster center and so a low value of β, the similarity threshold, thus allowing more faraway points to be a part of the cluster. Cluster centers which fail to agglomerate at least one point with their respective values of β are marked as noise.

Like other clustering algorithms, Algorithm 1 faces some limitations. As it uses Equations 6 and 7 to compute the Shapley values, the Shapley value of a point changes even when a remote point is altered, which may change its cluster allocation. For the same reason, the Shapley values of the points close to the mean of the whole dataset are higher than those of other points even when the density around them is not as high. One solution to this problem is to take the positioning of the point into account while computing its Shapley value. There is no explicit noise detection. A point is marked as noise if it is the only point in its cluster. For instance, in Figure 5, the two points in the upper right corner are noise points, but owing to their low Shapley values, β is very low and so they are classified as a separate cluster (△) instead. The amortized time complexity of Algorithm 1 is O(n²).

V. COINCIDENCE OF NUCLEOLUS, SHAPLEY VALUE, GATELY POINT AND τ-VALUE IN THE CURRENT SETTING

In the game as defined in Section III, we show in this section that Nucleolus, Shapley value, Gately point and τ-value coincide. First, we discuss the structure of the core. The core is symmetric about a single point, which is the prime reason why the above solution concepts coincide with that very point.

Fig. 6. The game has a symmetric core. This figure shows the core for a 3-player game.

Figure 6 shows the core for a 3-player cooperative game, in our case, a 3-point clustering game. The STR plane corresponds to the collective rationality constraint, sides AF, BC, DE correspond to individual rationality constraints, while sides AB, CD, EF correspond to coalitional rationality constraints. The reader is referred to [4] for a detailed discussion on the imputation triangle of a 3-player cooperative game. By simple geometry and the theory on the imputation triangle, it can be seen that AB = DE = ν({2,3})√2. Similarly, all opposite sides of the core are equal and so the core is symmetric about its center P.

Clearly, any point other than P will have more distance from at least one side and so will be lexicographically greater than P, which means that P is the Nucleolus. Also, as the core is symmetric, it is intuitive that P is the fairest of all allocations, which means that it corresponds to the Shapley value imputation. We prove a general result for the n-player clustering game that all the relevant solution concepts coincide.

Proposition 1. For the transferable utility (TU) game defined by Equation 5, for each i ∈ N, the Shapley value is given by

φ_i = (1/2) Σ_{S⊆N, i∈S, |S|=2} ν(S)    (9)

Proof: From Equations 5 and 7,

φ_i = (1/2) Σ_{j∈N, j≠i} f(d(i,j))
    = (1/2) Σ_{S⊆N, |S|=2, i∈S} (1/2) Σ_{k,l∈S, k≠l} f(d(k,l))
    = (1/2) Σ_{S⊆N, |S|=2, i∈S} ν(S)

Lemma 1. [8] For the TU game satisfying Equation 9, for each S ⊆ N,

ν(S) − Σ_{i∈S} φ_i = ν(N∖S) − Σ_{i∈N∖S} φ_i

The reader is referred to [8] for the proof of Lemma 1.

Theorem 1. [8] For the TU game satisfying Equation 9,

φ(ν) = Nu(ν)

where Nu(ν) is the Nucleolus of the TU game (N, ν). The reader is referred to [8] for the proof of Theorem 1.
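Proposition 1 and Lemma 1 can be checked numerically on a random instance of the clustering game (the random points below are illustrative only):

```python
import math, random
from itertools import combinations

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(6)]
n = len(pts)
d = lambda i, j: math.dist(pts[i], pts[j])
d_M = max(d(i, j) for i, j in combinations(range(n), 2))
f = lambda i, j: 1.0 - d(i, j) / d_M
nu = lambda S: sum(f(i, j) for i, j in combinations(sorted(S), 2))        # Equation 5
phi = [0.5 * sum(f(i, j) for j in range(n) if j != i) for i in range(n)]  # Equation 7

# Proposition 1: phi_i is half the sum of nu(S) over two-element coalitions with i.
for i in range(n):
    rhs = 0.5 * sum(nu(S) for S in combinations(range(n), 2) if i in S)
    assert abs(phi[i] - rhs) < 1e-9

# Lemma 1: the excess of S at phi equals the excess of its complement.
for r in range(1, n):
    for S in combinations(range(n), r):
        comp = [i for i in range(n) if i not in S]
        assert abs((nu(S) - sum(phi[i] for i in S)) -
                   (nu(comp) - sum(phi[i] for i in comp))) < 1e-9
print("Proposition 1 and Lemma 1 hold on this random instance")
```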
Theorem 2. For the TU game defined by Equation 5,

φ(ν) = Gv(ν)

where Gv(ν) is the Gately point of the TU game (N, ν).

Proof: By Lemma 1, when S = {i}, we have

ν(i) − φ_i = ν(N − i) − Σ_{j≠i} φ_j

From Equation 1, the propensity to disrupt for player i when the imputation is the Shapley value is

d_i(φ) = (Σ_{j≠i} φ_j − ν(N − i)) / (φ_i − ν(i)) = 1

As the propensity to disrupt is 1 for every player i, it is equal for all the players and hence, from the theory in Section II-D, the Shapley value imputation is the Gately point:

φ(ν) = Gv(ν)
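The key step, that every propensity to disrupt equals 1 at the Shapley imputation, can be verified numerically on a random instance (illustrative points; ν(i) = 0 holds by Equation 5):

```python
import math, random
from itertools import combinations

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(5)]
n = len(pts)
d = lambda i, j: math.dist(pts[i], pts[j])
d_M = max(d(i, j) for i, j in combinations(range(n), 2))
f = lambda i, j: 1.0 - d(i, j) / d_M
nu = lambda S: sum(f(i, j) for i, j in combinations(sorted(S), 2))        # Equation 5
phi = [0.5 * sum(f(i, j) for j in range(n) if j != i) for i in range(n)]  # Equation 7

# Equation 1 with nu(i) = 0: every propensity to disrupt equals 1 at phi.
for i in range(n):
    others = [j for j in range(n) if j != i]
    d_i = (sum(phi[j] for j in others) - nu(others)) / phi[i]
    assert abs(d_i - 1.0) < 1e-9
print("every propensity to disrupt equals 1 at the Shapley value")
```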
Theorem 3. For the TU game defined by Equation 5,

φ(ν) = τ(ν)

where τ(ν) is the τ-value of the TU game (N, ν).

Proof: From Equations 2 and 8,

M_i(ν) = ν(N) − ν(N − i)
       = Σ_{S⊆N, |S|=2} ν(S) − Σ_{S⊆N∖{i}, |S|=2} ν(S)
       = Σ_{S⊆N, i∈S, |S|=2} ν(S)

This, with Equation 4 and the fact that for our (N, ν) game, m_i(ν) = ν(i) = 0 for all i, gives

ν(N) = λ Σ_{i∈N} M_i(ν)
     = λ Σ_{i∈N} Σ_{S⊆N, i∈S, |S|=2} ν(S)
     = 2λ Σ_{S⊆N, |S|=2} ν(S)

Using Equation 8, we get λ = 1/2. This, with Equation 3 and the fact that m_i(ν) = 0 for all i, gives

τ_i(ν) = (1/2) Σ_{S⊆N, i∈S, |S|=2} ν(S)

This, with Proposition 1, gives

φ(ν) = τ(ν)

From Theorem 1, Theorem 2 and Theorem 3, the Nucleolus, the Shapley value, the Gately point and the τ-value coincide in the clustering game with the chosen characteristic function. These results further vindicate our choice of characteristic function for the clustering game.

VI. CONCLUSION AND FUTURE WORK

We have explored game theoretic solution concepts as an alternative to the existing methods for the clustering problem. Also, Nucleolus, being both min-max fair and stable, is the most suitable solution concept for pattern clustering. We have also proved the coincidence of Nucleolus, Shapley value, Gately point and τ-value for the given characteristic function. We have proposed an algorithm, Density-Restricted Agglomerative Clustering (DRAC), and have provided a qualitative comparison with the existing algorithms along with its strengths and limitations.

As future work, it would be interesting to test our method using Evolutionary game theory and Bargaining concepts. It would also be worthwhile developing a characterization of games for which various game theoretic solution concepts coincide.

VII. ACKNOWLEDGEMENT

This work is an extension of a project as part of a Game Theory course. We thank Dr. Y. Narahari, the course instructor, for helping us strengthen our concepts in the subject and for guiding us throughout the making of this paper. We thank Avishek Chatterjee for mentoring our project, helping us get started with cooperative game theory, and for the useful and essential criticism which helped us improve our algorithm.

REFERENCES

[1] Garg V.K., Narahari Y. and Murthy N.M., Shapley Value Based Robust Pattern Clustering, Technical report, Department of Computer Science and Automation, Indian Institute of Science, 2011.
[2] Backer E. and Jain A., A clustering performance measure based on fuzzy set decomposition, IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), 3(1), 1981, pages 66-75.
[3] Pelillo M., What is a cluster? Perspectives from game theory, NIPS Workshop on Clustering: Science or Art, 2009.
[4] Straffin P.D., Game Theory and Strategy, The Mathematical Association of America, 1993, pages 202-207.
[5] Schmeidler D., The Nucleolus of a Characteristic Function Game, SIAM Journal on Applied Mathematics, 17(6), Nov. 1969, pages 1163-1170.
[6] Tijs S.H., An Axiomatization of the τ-value, Mathematical Social Sciences, 13(2), 1987, pages 177-181.
[7] Saad W., Han Z., Debbah M., Hjorungnes A. and Basar T., Coalitional Game Theory for Communication Networks: A Tutorial, IEEE Signal Processing Magazine, Special Issue on Game Theory, 2009.
[8] Chun Y. and Hokari T., On the Coincidence of the Shapley Value and the Nucleolus in Queueing Problems, Seoul Journal of Economics, 2007.
[9] Kohlberg E., On the nucleolus of a characteristic function game, SIAM Journal on Applied Mathematics, Vol. 20, 1971, pages 62-66.
[10] Bulo S.R. and Pelillo M., A game-theoretic approach to hypergraph clustering, Advances in Neural Information Processing Systems, 2009.
[11] Gupta U. and Ranganathan N., A microeconomic approach to multi-objective spatial clustering, 19th International Conference on Pattern Recognition, 2008, pages 1-4.
[12] Gupta U. and Ranganathan N., A Game Theoretic Approach for Simultaneous Compaction and Equipartitioning of Spatial Data Sets, IEEE Transactions on Knowledge and Data Engineering, 2009, pages 465-478.
[13] Ester M., Kriegel H.P., Sander J. and Xu X., A Density-Based Algorithm for Discovering Clusters in Large Spatial Databases with Noise, Proceedings of the 2nd International Conference on Knowledge Discovery and Data Mining, 1996, pages 226-231.
[14] Ankerst M., Breunig M.M., Kriegel H.P. and Sander J., OPTICS: Ordering Points To Identify the Clustering Structure, ACM SIGMOD Record, 28(2), 1999, pages 49-60.
