Starting cxg2 and twitter (5000, 21747) 5000 {'C': 1.0, 'loss': 'hinge'}

Unmasking run: fra twitter cxg2, rounds 1-100. Each round fits the classifier, prints a classification report on the held-out set, then reduces the feature vectors ("Reducing feature vectors"). Matrix shapes each round are (53526, F) for training and (12130, F) for testing, before and after reduction; the "features" column below gives F before -> after. Class supports are constant throughout: be 2578, fr 5000, ht 2375, lu 2177 (total 12130). In every round the avg/total row prints the same value for precision, recall, and f1-score; that value appears in the "avg" column.

round  be P/R/F1       fr P/R/F1       ht P/R/F1       lu P/R/F1       avg   features
    1  0.97/0.94/0.96  0.97/0.98/0.98  1.00/1.00/1.00  1.00/1.00/1.00  0.98  21747 -> 21741
    2  0.97/0.94/0.96  0.97/0.98/0.98  1.00/1.00/1.00  1.00/1.00/1.00  0.98  21741 -> 21735
    3  0.97/0.94/0.95  0.97/0.98/0.98  1.00/1.00/1.00  1.00/0.99/1.00  0.98  21735 -> 21727
    4  0.96/0.94/0.95  0.97/0.98/0.97  1.00/1.00/1.00  1.00/0.99/1.00  0.98  21727 -> 21719
    5  0.96/0.94/0.95  0.97/0.98/0.97  1.00/1.00/1.00  1.00/0.99/1.00  0.98  21719 -> 21712
    6  0.96/0.94/0.95  0.97/0.98/0.97  1.00/1.00/1.00  1.00/0.99/0.99  0.98  21712 -> 21707
    7  0.96/0.94/0.95  0.97/0.98/0.97  1.00/1.00/1.00  1.00/0.99/0.99  0.98  21707 -> 21701
    8  0.96/0.94/0.95  0.97/0.98/0.97  1.00/1.00/1.00  1.00/0.99/0.99  0.98  21701 -> 21695
    9  0.96/0.94/0.95  0.96/0.98/0.97  1.00/1.00/1.00  1.00/0.99/0.99  0.98  21695 -> 21688
   10  0.96/0.94/0.95  0.96/0.98/0.97  1.00/1.00/1.00  1.00/0.99/0.99  0.97  21688 -> 21680
   11  0.96/0.94/0.95  0.96/0.98/0.97  1.00/1.00/1.00  0.99/0.99/0.99  0.97  21680 -> 21672
   12  0.96/0.94/0.95  0.96/0.98/0.97  1.00/1.00/1.00  0.99/0.99/0.99  0.97  21672 -> 21665
   13  0.95/0.93/0.94  0.96/0.98/0.97  1.00/1.00/1.00  0.99/0.99/0.99  0.97  21665 -> 21657
   14  0.95/0.93/0.94  0.96/0.98/0.97  1.00/1.00/1.00  0.99/0.99/0.99  0.97  21657 -> 21649
   15  0.95/0.93/0.94  0.96/0.98/0.97  1.00/1.00/1.00  0.99/0.99/0.99  0.97  21649 -> 21641
   16  0.95/0.93/0.94  0.96/0.97/0.97  1.00/1.00/1.00  0.99/0.99/0.99  0.97  21641 -> 21633
   17  0.95/0.93/0.94  0.96/0.97/0.97  1.00/1.00/1.00  0.99/0.99/0.99  0.97  21633 -> 21625
   18  0.95/0.93/0.94  0.96/0.97/0.97  1.00/0.99/1.00  0.99/0.99/0.99  0.97  21625 -> 21618
   19  0.94/0.93/0.94  0.96/0.97/0.96  1.00/0.99/1.00  0.99/0.99/0.99  0.97  21618 -> 21611
   20  0.94/0.93/0.93  0.96/0.97/0.97  1.00/1.00/1.00  0.99/0.99/0.99  0.97  21611 -> 21605
   21  0.94/0.92/0.93  0.96/0.97/0.96  1.00/1.00/1.00  0.99/0.99/0.99  0.97  21605 -> 21597
   22  0.94/0.92/0.93  0.96/0.97/0.96  0.99/0.99/0.99  0.99/0.99/0.99  0.97  21597 -> 21590
   23  0.95/0.92/0.93  0.96/0.97/0.96  0.99/1.00/1.00  0.99/0.98/0.99  0.97  21590 -> 21582
   24  0.95/0.92/0.93  0.96/0.97/0.96  0.99/0.99/0.99  0.99/0.98/0.99  0.97  21582 -> 21575
   25  0.95/0.92/0.93  0.96/0.97/0.96  0.99/1.00/0.99  0.99/0.99/0.99  0.97  21575 -> 21569
   26  0.94/0.92/0.93  0.95/0.97/0.96  1.00/1.00/1.00  0.99/0.99/0.99  0.97  21569 -> 21562
   27  0.94/0.92/0.93  0.95/0.97/0.96  1.00/1.00/1.00  0.99/0.98/0.99  0.97  21562 -> 21554
   28  0.94/0.91/0.93  0.95/0.97/0.96  1.00/0.99/1.00  0.99/0.98/0.99  0.97  21554 -> 21546
   29  0.94/0.91/0.93  0.95/0.97/0.96  1.00/0.99/1.00  0.99/0.99/0.99  0.97  21546 -> 21539
   30  0.94/0.91/0.93  0.95/0.97/0.96  1.00/0.99/1.00  0.99/0.98/0.99  0.96  21539 -> 21531
   31  0.94/0.91/0.93  0.95/0.97/0.96  1.00/0.99/0.99  0.99/0.98/0.99  0.96  21531 -> 21523
   32  0.94/0.91/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.99/0.98/0.99  0.96  21523 -> 21516
   33  0.94/0.91/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.99/0.98/0.99  0.96  21516 -> 21508
   34  0.94/0.91/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.99/0.98/0.99  0.96  21508 -> 21501
   35  0.94/0.91/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.99/0.98/0.99  0.96  21501 -> 21493
   36  0.93/0.90/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.99/0.98/0.98  0.96  21493 -> 21486
   37  0.93/0.90/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.99/0.98/0.98  0.96  21486 -> 21478
   38  0.94/0.90/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.99/0.98/0.98  0.96  21478 -> 21470
   39  0.93/0.91/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.99/0.98/0.98  0.96  21470 -> 21463
   40  0.93/0.91/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.99/0.98/0.98  0.96  21463 -> 21455
   41  0.93/0.91/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21455 -> 21447
   42  0.93/0.91/0.92  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21447 -> 21439
   43  0.93/0.90/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21439 -> 21431
   44  0.93/0.90/0.92  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21431 -> 21423
   45  0.93/0.90/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21423 -> 21415
   46  0.93/0.90/0.92  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21415 -> 21408
   47  0.93/0.91/0.92  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.97/0.98  0.96  21408 -> 21402
   48  0.93/0.90/0.92  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21402 -> 21394
   49  0.93/0.90/0.92  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.97/0.98  0.96  21394 -> 21386
   50  0.93/0.90/0.91  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.97/0.98  0.96  21386 -> 21378
   51  0.93/0.90/0.91  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.97/0.98  0.96  21378 -> 21370
   52  0.93/0.90/0.91  0.95/0.97/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21370 -> 21362
   53  0.93/0.90/0.91  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21362 -> 21354
   54  0.93/0.90/0.91  0.95/0.97/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21354 -> 21346
   55  0.93/0.90/0.92  0.95/0.97/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21346 -> 21338
   56  0.93/0.90/0.91  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21338 -> 21330
   57  0.93/0.90/0.91  0.95/0.96/0.95  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21330 -> 21322
   58  0.93/0.90/0.91  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21322 -> 21314
   59  0.93/0.90/0.91  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21314 -> 21306
   60  0.93/0.90/0.91  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21306 -> 21298
   61  0.92/0.90/0.91  0.95/0.96/0.96  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21298 -> 21290
   62  0.92/0.90/0.91  0.95/0.96/0.95  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21290 -> 21282
   63  0.92/0.90/0.91  0.95/0.96/0.95  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21282 -> 21274
   64  0.92/0.90/0.91  0.95/0.96/0.95  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21274 -> 21267
   65  0.92/0.90/0.91  0.95/0.96/0.95  0.99/0.99/0.99  0.98/0.98/0.98  0.96  21267 -> 21259
   66  0.92/0.90/0.91  0.95/0.96/0.95  0.99/0.99/0.99  0.97/0.98/0.98  0.96  21259 -> 21251
   67  0.92/0.89/0.91  0.95/0.96/0.95  0.99/0.99/0.99  0.97/0.98/0.98  0.96  21251 -> 21243
   68  0.92/0.89/0.91  0.95/0.96/0.95  0.99/0.99/0.99  0.97/0.98/0.97  0.95  21243 -> 21235
   69  0.92/0.89/0.91  0.95/0.96/0.95  0.99/0.99/0.99  0.97/0.98/0.97  0.95  21235 -> 21228
   70  0.92/0.89/0.91  0.95/0.96/0.95  0.99/0.99/0.99  0.97/0.98/0.98  0.95  21228 -> 21220
   71  0.92/0.89/0.91  0.95/0.96/0.95  0.99/0.99/0.99  0.97/0.98/0.98  0.95  21220 -> 21212
   72  0.92/0.89/0.90  0.95/0.96/0.95  0.99/0.99/0.99  0.97/0.98/0.97  0.95  21212 -> 21204
   73  0.92/0.89/0.90  0.94/0.96/0.95  0.99/0.99/0.99  0.97/0.98/0.97  0.95  21204 -> 21196
   74  0.92/0.89/0.90  0.94/0.96/0.95  0.99/0.99/0.99  0.97/0.98/0.97  0.95  21196 -> 21188
   75  0.91/0.88/0.90  0.94/0.96/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21188 -> 21180
   76  0.92/0.88/0.90  0.94/0.95/0.95  0.99/0.99/0.99  0.97/0.98/0.97  0.95  21180 -> 21173
   77  0.91/0.88/0.90  0.94/0.95/0.95  0.99/0.99/0.99  0.97/0.98/0.97  0.95  21173 -> 21165
   78  0.91/0.89/0.90  0.94/0.95/0.95  0.99/0.99/0.99  0.97/0.98/0.97  0.95  21165 -> 21158
   79  0.92/0.89/0.90  0.94/0.96/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21158 -> 21151
   80  0.92/0.89/0.90  0.94/0.95/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21151 -> 21143
   81  0.92/0.89/0.90  0.94/0.96/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21143 -> 21135
   82  0.92/0.89/0.90  0.94/0.96/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21135 -> 21127
   83  0.92/0.89/0.90  0.94/0.96/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21127 -> 21121
   84  0.92/0.88/0.90  0.94/0.96/0.95  0.99/0.99/0.99  0.96/0.97/0.97  0.95  21121 -> 21113
   85  0.92/0.88/0.90  0.94/0.96/0.95  0.98/0.99/0.99  0.96/0.97/0.97  0.95  21113 -> 21105
   86  0.92/0.88/0.90  0.94/0.96/0.95  0.98/0.99/0.99  0.96/0.97/0.97  0.95  21105 -> 21097
   87  0.92/0.88/0.90  0.94/0.95/0.95  0.98/0.99/0.99  0.97/0.97/0.97  0.95  21097 -> 21089
   88  0.92/0.88/0.90  0.94/0.95/0.95  0.98/0.99/0.99  0.96/0.97/0.97  0.95  21089 -> 21081
   89  0.91/0.88/0.90  0.94/0.95/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21081 -> 21073
   90  0.92/0.88/0.90  0.94/0.96/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21073 -> 21065
   91  0.92/0.88/0.90  0.94/0.95/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21065 -> 21057
   92  0.92/0.88/0.90  0.94/0.95/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21057 -> 21049
   93  0.92/0.88/0.90  0.94/0.96/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21049 -> 21041
   94  0.92/0.88/0.90  0.94/0.96/0.95  0.98/0.99/0.99  0.97/0.97/0.97  0.95  21041 -> 21033
   95  0.92/0.88/0.90  0.94/0.96/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21033 -> 21025
   96  0.91/0.88/0.90  0.94/0.95/0.95  0.99/0.99/0.99  0.97/0.97/0.97  0.95  21025 -> 21017
   97  0.91/0.88/0.90  0.94/0.95/0.95  0.99/0.99/0.99  0.96/0.97/0.97  0.95  21017 -> 21009
   98  0.92/0.87/0.89  0.94/0.95/0.95  0.98/0.99/0.99  0.96/0.97/0.97  0.95  21009 -> 21001
   99  0.92/0.87/0.89  0.94/0.95/0.95  0.98/0.99/0.99  0.96/0.97/0.97  0.95  21001 -> 20993
  100  0.91/0.87/0.89  0.94/0.95/0.95  0.98/0.99/0.99  0.96/0.97/0.97  0.95  20993 -> 20985
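The log implies a standard unmasking loop: each round fits a linear SVM with the logged hyperparameters ({'C': 1.0, 'loss': 'hinge'}), scores the held-out set, then deletes the features the classifier weights most heavily before refitting. A minimal sketch of that loop in scikit-learn follows; it is not the original script. The function name, the round count, the six-features-per-round removal rate (the log removes roughly 6-8 per round), and the synthetic data are all illustrative assumptions.

```python
# Sketch of an unmasking loop, NOT the original experiment code.
# Hyperparameters C=1.0 and loss='hinge' come from the log; everything
# else (data, rounds, removal rate) is assumed for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import f1_score
from sklearn.svm import LinearSVC

def unmask(X_train, y_train, X_test, y_test, rounds=5, per_round=6):
    """Repeatedly fit, score, and drop the most strongly weighted features."""
    keep = np.arange(X_train.shape[1])  # indices of surviving features
    history = []
    for rnd in range(1, rounds + 1):
        clf = LinearSVC(C=1.0, loss="hinge", max_iter=10000)
        clf.fit(X_train[:, keep], y_train)
        f1 = f1_score(y_test, clf.predict(X_test[:, keep]), average="weighted")
        history.append((rnd, keep.size, round(f1, 2)))
        # Rank surviving features by their largest absolute weight across
        # the one-vs-rest class weight vectors; remove the top per_round.
        strength = np.abs(clf.coef_).max(axis=0)
        keep = np.delete(keep, np.argsort(strength)[-per_round:])
    return history

# Synthetic 4-class problem standing in for the be/fr/ht/lu varieties.
X, y = make_classification(n_samples=2000, n_features=300, n_informative=40,
                           n_classes=4, random_state=0)
hist = unmask(X[:1500], y[:1500], X[1500:], y[1500:])
for rnd, n_feats, f1 in hist:
    print(rnd, n_feats, f1)
```

The diagnostic signal of unmasking is the slope of this curve: a register that is easy to separate only through a handful of surface features degrades quickly once those features are masked, whereas the gradual decline in the log above (avg 0.98 to 0.95 over 100 rounds) indicates the distinction is spread across many constructions.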