Starting cxg2 and twitter: initial chunk (4000, 21170), 4000 samples; classifier parameters {'C': 0.01, 'loss': 'hinge'}.

Unmasking run: deu / twitter / cxg2, 100 rounds. Each round trains on a 33964-document matrix and evaluates on a 7722-document test set; after scoring, the feature vectors are reduced by two features ("Reducing feature vectors."), so the feature count falls from 21170 in round 1 to 20972 in round 100, and the final reduction leaves (33964, 20970) / (7722, 20970). Class support is constant across rounds: at = 2722, de = 5000, total = 7722. Cells below give precision / recall / F1.

Round | Features | at (P/R/F1)    | de (P/R/F1)    | avg/total (P/R/F1)
    1 |    21170 | 0.96/0.91/0.93 | 0.95/0.98/0.97 | 0.96/0.96/0.95
    2 |    21168 | 0.96/0.90/0.93 | 0.95/0.98/0.96 | 0.95/0.95/0.95
    3 |    21166 | 0.96/0.90/0.93 | 0.95/0.98/0.96 | 0.95/0.95/0.95
    4 |    21164 | 0.96/0.90/0.93 | 0.95/0.98/0.96 | 0.95/0.95/0.95
    5 |    21162 | 0.96/0.89/0.93 | 0.94/0.98/0.96 | 0.95/0.95/0.95
    6 |    21160 | 0.96/0.89/0.92 | 0.94/0.98/0.96 | 0.95/0.95/0.95
    7 |    21158 | 0.95/0.89/0.92 | 0.94/0.98/0.96 | 0.95/0.95/0.95
    8 |    21156 | 0.95/0.89/0.92 | 0.94/0.98/0.96 | 0.95/0.95/0.95
    9 |    21154 | 0.95/0.89/0.92 | 0.94/0.97/0.96 | 0.94/0.94/0.94
   10 |    21152 | 0.95/0.89/0.92 | 0.94/0.97/0.96 | 0.94/0.94/0.94
   11 |    21150 | 0.95/0.89/0.92 | 0.94/0.97/0.96 | 0.94/0.94/0.94
   12 |    21148 | 0.94/0.88/0.91 | 0.94/0.97/0.95 | 0.94/0.94/0.94
   13 |    21146 | 0.94/0.88/0.91 | 0.94/0.97/0.95 | 0.94/0.94/0.94
   14 |    21144 | 0.94/0.87/0.91 | 0.93/0.97/0.95 | 0.94/0.94/0.94
   15 |    21142 | 0.94/0.87/0.91 | 0.93/0.97/0.95 | 0.94/0.94/0.94
   16 |    21140 | 0.94/0.88/0.91 | 0.93/0.97/0.95 | 0.94/0.94/0.94
   17 |    21138 | 0.94/0.87/0.91 | 0.93/0.97/0.95 | 0.94/0.94/0.94
   18 |    21136 | 0.94/0.87/0.90 | 0.93/0.97/0.95 | 0.93/0.93/0.93
   19 |    21134 | 0.94/0.88/0.91 | 0.93/0.97/0.95 | 0.94/0.94/0.93
   20 |    21132 | 0.94/0.87/0.90 | 0.93/0.97/0.95 | 0.93/0.93/0.93
   21 |    21130 | 0.94/0.87/0.90 | 0.93/0.97/0.95 | 0.93/0.93/0.93
   22 |    21128 | 0.93/0.87/0.90 | 0.93/0.97/0.95 | 0.93/0.93/0.93
   23 |    21126 | 0.94/0.87/0.90 | 0.93/0.97/0.95 | 0.93/0.93/0.93
   24 |    21124 | 0.93/0.87/0.90 | 0.93/0.97/0.95 | 0.93/0.93/0.93
   25 |    21122 | 0.93/0.87/0.90 | 0.93/0.97/0.95 | 0.93/0.93/0.93
   26 |    21120 | 0.93/0.87/0.90 | 0.93/0.97/0.95 | 0.93/0.93/0.93
   27 |    21118 | 0.93/0.86/0.90 | 0.93/0.96/0.95 | 0.93/0.93/0.93
   28 |    21116 | 0.93/0.86/0.90 | 0.93/0.97/0.95 | 0.93/0.93/0.93
   29 |    21114 | 0.93/0.86/0.89 | 0.93/0.96/0.95 | 0.93/0.93/0.93
   30 |    21112 | 0.93/0.86/0.89 | 0.93/0.96/0.94 | 0.93/0.93/0.93
   31 |    21110 | 0.92/0.86/0.89 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   32 |    21108 | 0.92/0.85/0.89 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   33 |    21106 | 0.92/0.85/0.89 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   34 |    21104 | 0.92/0.85/0.89 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   35 |    21102 | 0.92/0.85/0.89 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   36 |    21100 | 0.92/0.85/0.89 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   37 |    21098 | 0.92/0.85/0.89 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   38 |    21096 | 0.92/0.85/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   39 |    21094 | 0.92/0.85/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   40 |    21092 | 0.92/0.85/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   41 |    21090 | 0.92/0.85/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   42 |    21088 | 0.92/0.85/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   43 |    21086 | 0.92/0.85/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   44 |    21084 | 0.92/0.84/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   45 |    21082 | 0.92/0.84/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   46 |    21080 | 0.92/0.84/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   47 |    21078 | 0.92/0.84/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   48 |    21076 | 0.92/0.84/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   49 |    21074 | 0.92/0.84/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   50 |    21072 | 0.92/0.84/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   51 |    21070 | 0.92/0.84/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   52 |    21068 | 0.92/0.84/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   53 |    21066 | 0.92/0.84/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.92
   54 |    21064 | 0.92/0.84/0.88 | 0.92/0.96/0.94 | 0.92/0.92/0.91
   55 |    21062 | 0.91/0.84/0.87 | 0.91/0.96/0.94 | 0.91/0.91/0.91
   56 |    21060 | 0.92/0.83/0.87 | 0.91/0.96/0.94 | 0.91/0.91/0.91
   57 |    21058 | 0.91/0.83/0.87 | 0.91/0.96/0.94 | 0.91/0.91/0.91
   58 |    21056 | 0.91/0.84/0.87 | 0.91/0.96/0.94 | 0.91/0.91/0.91
   59 |    21054 | 0.91/0.84/0.87 | 0.92/0.96/0.94 | 0.91/0.91/0.91
   60 |    21052 | 0.91/0.84/0.87 | 0.92/0.96/0.94 | 0.92/0.92/0.91
   61 |    21050 | 0.92/0.84/0.87 | 0.91/0.96/0.94 | 0.91/0.91/0.91
   62 |    21048 | 0.91/0.84/0.87 | 0.92/0.96/0.94 | 0.91/0.92/0.91
   63 |    21046 | 0.91/0.84/0.87 | 0.92/0.96/0.94 | 0.91/0.91/0.91
   64 |    21044 | 0.91/0.84/0.87 | 0.92/0.96/0.93 | 0.91/0.91/0.91
   65 |    21042 | 0.91/0.84/0.87 | 0.92/0.96/0.94 | 0.91/0.91/0.91
   66 |    21040 | 0.91/0.84/0.87 | 0.91/0.96/0.93 | 0.91/0.91/0.91
   67 |    21038 | 0.91/0.84/0.87 | 0.91/0.96/0.93 | 0.91/0.91/0.91
   68 |    21036 | 0.91/0.84/0.87 | 0.92/0.95/0.93 | 0.91/0.91/0.91
   69 |    21034 | 0.91/0.83/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   70 |    21032 | 0.91/0.84/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   71 |    21030 | 0.91/0.83/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   72 |    21028 | 0.91/0.83/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   73 |    21026 | 0.91/0.83/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   74 |    21024 | 0.91/0.83/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   75 |    21022 | 0.91/0.83/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   76 |    21020 | 0.91/0.84/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   77 |    21018 | 0.91/0.84/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   78 |    21016 | 0.91/0.83/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   79 |    21014 | 0.91/0.83/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   80 |    21012 | 0.91/0.83/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   81 |    21010 | 0.91/0.83/0.87 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   82 |    21008 | 0.91/0.83/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   83 |    21006 | 0.91/0.83/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   84 |    21004 | 0.91/0.82/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   85 |    21002 | 0.91/0.82/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   86 |    21000 | 0.91/0.82/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   87 |    20998 | 0.90/0.82/0.86 | 0.91/0.95/0.93 | 0.90/0.90/0.90
   88 |    20996 | 0.91/0.82/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   89 |    20994 | 0.91/0.82/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   90 |    20992 | 0.91/0.82/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.90
   91 |    20990 | 0.91/0.82/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.90
   92 |    20988 | 0.91/0.82/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   93 |    20986 | 0.91/0.82/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.91
   94 |    20984 | 0.90/0.82/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.90
   95 |    20982 | 0.90/0.82/0.86 | 0.91/0.95/0.93 | 0.91/0.91/0.90
   96 |    20980 | 0.90/0.82/0.86 | 0.90/0.95/0.93 | 0.90/0.90/0.90
   97 |    20978 | 0.90/0.82/0.86 | 0.90/0.95/0.93 | 0.90/0.90/0.90
   98 |    20976 | 0.90/0.81/0.86 | 0.90/0.95/0.93 | 0.90/0.90/0.90
   99 |    20974 | 0.90/0.81/0.85 | 0.90/0.95/0.93 | 0.90/0.90/0.90
  100 |    20972 | 0.90/0.81/0.85 | 0.90/0.95/0.93 | 0.90/0.90/0.90

Overall, accuracy degrades slowly and smoothly: average F1 falls from 0.95 (round 1) to 0.90 (round 100), with at recall dropping from 0.91 to 0.81 while de recall stays between 0.95 and 0.98.
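The log records a standard unmasking loop: fit a linear SVM, score the held-out set, strip the strongest features, repeat. Below is a minimal sketch of that loop under stated assumptions: synthetic random data stands in for the real cxg2 matrices, the binary labels are stand-ins for the at/de classes, and only the C and loss hyperparameters are taken from the log; the data loader, chunking, and feature names are not shown there.

```python
# Sketch of one unmasking run: train LinearSVC, score, then remove the two
# most indicative features per round ("Reducing feature vectors.").
# Synthetic data replaces the real cxg2 matrices, which the log does not show.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
X_train = rng.random((400, 50))
y_train = rng.integers(0, 2, 400)   # 0/1 stand in for the at/de labels
X_test = rng.random((100, 50))
y_test = rng.integers(0, 2, 100)

keep = np.arange(X_train.shape[1])  # feature columns still in play
curve = []                          # weighted F1 per unmasking round
for round_no in range(1, 11):
    clf = LinearSVC(C=0.01, loss="hinge", dual=True, max_iter=10000)
    clf.fit(X_train[:, keep], y_train)
    pred = clf.predict(X_test[:, keep])
    curve.append(f1_score(y_test, pred, average="weighted"))
    # Drop the most positive and most negative coefficient: the feature
    # most indicative of each class, i.e. two features per round.
    w = clf.coef_[0]
    keep = np.delete(keep, [int(np.argmax(w)), int(np.argmin(w))])

print(len(curve), keep.size)        # 10 rounds; 50 - 2*10 = 30 features left
```

In unmasking, the shape of the resulting degradation curve is the quantity of interest: classes that differ in many features stay separable as the strongest features are stripped, which matches the slow decline (average F1 0.95 to 0.90 over 100 rounds) recorded above.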