Starting cxg2 and twitter (32000, 14846) 32000 {'C': 0.01, 'loss': 'squared_hinge'}

Per-class test support, identical in every round below: ar 5000, cl 5000, co 4853,
cr 4967, cu 3503, ec 5000, es 5000, gt 5000, hn 3780, mx 5000, ni 4550, pa 5000,
py 5000, sv 5000, uy 5000, ve 5000; total 76653. The train matrix always has
356449 rows and the test matrix 76653 rows; only the feature count changes, so
each "Reducing feature vectors" step is reported below as "F -> F' features".

STARTING UNMASKING ROUND 1 (spa twitter cxg2 1), precision/recall/F1 per class:
  ar 0.85/0.90/0.87  cl 0.97/0.98/0.97  co 0.95/0.93/0.94  cr 0.91/0.87/0.89
  cu 0.98/0.97/0.98  ec 0.98/0.98/0.98  es 0.94/0.96/0.95  gt 0.94/0.95/0.95
  hn 0.94/0.92/0.93  mx 0.92/0.93/0.93  ni 0.98/0.98/0.98  pa 0.95/0.95/0.95
  py 0.93/0.94/0.93  sv 0.93/0.94/0.93  uy 0.88/0.85/0.86  ve 0.94/0.93/0.93
  avg/total 0.94/0.94/0.94
Reducing feature vectors: 14846 -> 14821 features.

STARTING UNMASKING ROUND 2 (spa twitter cxg2 2), precision/recall/F1 per class:
  ar 0.85/0.89/0.87  cl 0.96/0.97/0.97  co 0.95/0.92/0.94  cr 0.90/0.87/0.89
  cu 0.98/0.97/0.98  ec 0.98/0.98/0.98  es 0.93/0.95/0.94  gt 0.93/0.94/0.94
  hn 0.94/0.92/0.93  mx 0.91/0.93/0.92  ni 0.98/0.97/0.98  pa 0.94/0.94/0.94
  py 0.92/0.93/0.92  sv 0.92/0.92/0.92  uy 0.86/0.84/0.85  ve 0.93/0.92/0.92
  avg/total 0.93/0.93/0.93
Reducing feature vectors: 14821 -> 14791 features.

STARTING UNMASKING ROUND 3 (spa twitter cxg2 3), precision/recall/F1 per class:
  ar 0.84/0.88/0.86  cl 0.96/0.97/0.97  co 0.95/0.92/0.93  cr 0.89/0.86/0.88
  cu 0.98/0.97/0.97  ec 0.97/0.98/0.97  es 0.92/0.95/0.93  gt 0.92/0.93/0.93
  hn 0.93/0.91/0.92  mx 0.91/0.92/0.91  ni 0.97/0.97/0.97  pa 0.93/0.93/0.93
  py 0.91/0.93/0.92  sv 0.92/0.92/0.92  uy 0.85/0.83/0.84  ve 0.92/0.90/0.91
  avg/total 0.92/0.92/0.92
Reducing feature vectors.
Features reduced: 14791 -> 14763.

STARTING UNMASKING ROUND 4 (spa twitter cxg2 4), precision/recall/F1 per class:
  ar 0.84/0.88/0.86  cl 0.96/0.97/0.96  co 0.94/0.90/0.92  cr 0.89/0.86/0.88
  cu 0.98/0.97/0.97  ec 0.97/0.98/0.97  es 0.92/0.94/0.93  gt 0.92/0.93/0.92
  hn 0.93/0.91/0.92  mx 0.91/0.92/0.91  ni 0.97/0.97/0.97  pa 0.93/0.93/0.93
  py 0.91/0.92/0.92  sv 0.91/0.91/0.91  uy 0.85/0.83/0.84  ve 0.91/0.90/0.91
  avg/total 0.92/0.92/0.92
Reducing feature vectors: 14763 -> 14734 features.

STARTING UNMASKING ROUND 5 (spa twitter cxg2 5), precision/recall/F1 per class:
  ar 0.84/0.87/0.86  cl 0.95/0.97/0.96  co 0.93/0.89/0.91  cr 0.89/0.86/0.87
  cu 0.98/0.96/0.97  ec 0.97/0.98/0.97  es 0.92/0.94/0.93  gt 0.92/0.93/0.92
  hn 0.93/0.90/0.92  mx 0.91/0.91/0.91  ni 0.96/0.97/0.96  pa 0.92/0.93/0.93
  py 0.91/0.92/0.91  sv 0.91/0.91/0.91  uy 0.84/0.82/0.83  ve 0.91/0.89/0.90
  avg/total 0.92/0.92/0.92
Reducing feature vectors: 14734 -> 14704 features.

STARTING UNMASKING ROUND 6 (spa twitter cxg2 6), precision/recall/F1 per class:
  ar 0.84/0.87/0.86  cl 0.95/0.97/0.96  co 0.93/0.88/0.91  cr 0.89/0.85/0.87
  cu 0.98/0.96/0.97  ec 0.96/0.98/0.97  es 0.91/0.94/0.93  gt 0.91/0.92/0.92
  hn 0.93/0.90/0.91  mx 0.90/0.91/0.90  ni 0.96/0.97/0.96  pa 0.92/0.93/0.92
  py 0.91/0.92/0.91  sv 0.91/0.91/0.91  uy 0.84/0.83/0.83  ve 0.91/0.89/0.90
  avg/total 0.91/0.91/0.91
Reducing feature vectors.
Features reduced: 14704 -> 14674.

STARTING UNMASKING ROUND 7 (spa twitter cxg2 7), precision/recall/F1 per class:
  ar 0.84/0.87/0.85  cl 0.95/0.97/0.96  co 0.92/0.88/0.90  cr 0.88/0.85/0.87
  cu 0.98/0.96/0.97  ec 0.96/0.97/0.97  es 0.91/0.94/0.93  gt 0.91/0.92/0.92
  hn 0.93/0.90/0.91  mx 0.90/0.91/0.90  ni 0.96/0.97/0.97  pa 0.91/0.92/0.92
  py 0.90/0.92/0.91  sv 0.90/0.90/0.90  uy 0.83/0.82/0.83  ve 0.91/0.89/0.90
  avg/total 0.91/0.91/0.91
Reducing feature vectors: 14674 -> 14645 features.

STARTING UNMASKING ROUND 8 (spa twitter cxg2 8), precision/recall/F1 per class:
  ar 0.84/0.87/0.85  cl 0.95/0.96/0.96  co 0.92/0.88/0.90  cr 0.89/0.85/0.87
  cu 0.98/0.96/0.97  ec 0.96/0.97/0.97  es 0.91/0.94/0.92  gt 0.91/0.92/0.91
  hn 0.93/0.90/0.91  mx 0.90/0.91/0.90  ni 0.96/0.97/0.97  pa 0.91/0.92/0.92
  py 0.90/0.91/0.91  sv 0.90/0.90/0.90  uy 0.83/0.82/0.83  ve 0.90/0.89/0.90
  avg/total 0.91/0.91/0.91
Reducing feature vectors: 14645 -> 14614 features.

STARTING UNMASKING ROUND 9 (spa twitter cxg2 9), precision/recall/F1 per class:
  ar 0.83/0.86/0.85  cl 0.94/0.96/0.95  co 0.92/0.87/0.90  cr 0.89/0.85/0.87
  cu 0.97/0.96/0.97  ec 0.96/0.97/0.97  es 0.91/0.94/0.92  gt 0.91/0.92/0.91
  hn 0.92/0.89/0.91  mx 0.89/0.91/0.90  ni 0.96/0.96/0.96  pa 0.91/0.92/0.91
  py 0.90/0.91/0.91  sv 0.89/0.90/0.90  uy 0.83/0.82/0.82  ve 0.90/0.88/0.89
  avg/total 0.91/0.91/0.91
Reducing feature vectors.
Features reduced: 14614 -> 14585.

STARTING UNMASKING ROUND 10 (spa twitter cxg2 10), precision/recall/F1 per class:
  ar 0.83/0.86/0.84  cl 0.94/0.96/0.95  co 0.91/0.87/0.89  cr 0.88/0.85/0.86
  cu 0.97/0.96/0.96  ec 0.96/0.97/0.97  es 0.91/0.93/0.92  gt 0.90/0.92/0.91
  hn 0.92/0.89/0.91  mx 0.89/0.90/0.90  ni 0.96/0.96/0.96  pa 0.91/0.91/0.91
  py 0.90/0.91/0.90  sv 0.90/0.90/0.90  uy 0.82/0.81/0.82  ve 0.90/0.88/0.89
  avg/total 0.90/0.90/0.90
Reducing feature vectors: 14585 -> 14553 features.

STARTING UNMASKING ROUND 11 (spa twitter cxg2 11), precision/recall/F1 per class:
  ar 0.83/0.86/0.84  cl 0.93/0.96/0.95  co 0.91/0.87/0.89  cr 0.88/0.85/0.86
  cu 0.97/0.96/0.96  ec 0.96/0.97/0.96  es 0.90/0.93/0.91  gt 0.90/0.91/0.91
  hn 0.92/0.89/0.91  mx 0.88/0.90/0.89  ni 0.96/0.96/0.96  pa 0.90/0.91/0.91
  py 0.90/0.91/0.90  sv 0.90/0.90/0.90  uy 0.82/0.81/0.82  ve 0.90/0.88/0.89
  avg/total 0.90/0.90/0.90
Reducing feature vectors: 14553 -> 14525 features.

STARTING UNMASKING ROUND 12 (spa twitter cxg2 12), precision/recall/F1 per class:
  ar 0.82/0.86/0.84  cl 0.93/0.95/0.94  co 0.91/0.86/0.89  cr 0.88/0.84/0.86
  cu 0.97/0.96/0.96  ec 0.96/0.97/0.96  es 0.90/0.93/0.92  gt 0.90/0.91/0.90
  hn 0.92/0.89/0.90  mx 0.88/0.89/0.89  ni 0.96/0.96/0.96  pa 0.90/0.91/0.91
  py 0.90/0.91/0.90  sv 0.90/0.90/0.90  uy 0.82/0.80/0.81  ve 0.90/0.87/0.88
  avg/total 0.90/0.90/0.90
Reducing feature vectors.
Features reduced: 14525 -> 14496.

STARTING UNMASKING ROUND 13 (spa twitter cxg2 13), precision/recall/F1 per class:
  ar 0.82/0.85/0.84  cl 0.93/0.95/0.94  co 0.90/0.86/0.88  cr 0.88/0.84/0.86
  cu 0.97/0.96/0.96  ec 0.96/0.97/0.96  es 0.90/0.93/0.92  gt 0.89/0.91/0.90
  hn 0.91/0.89/0.90  mx 0.88/0.89/0.88  ni 0.96/0.96/0.96  pa 0.90/0.91/0.90
  py 0.90/0.91/0.90  sv 0.89/0.90/0.89  uy 0.81/0.80/0.81  ve 0.89/0.87/0.88
  avg/total 0.90/0.90/0.90
Reducing feature vectors: 14496 -> 14465 features.

STARTING UNMASKING ROUND 14 (spa twitter cxg2 14), precision/recall/F1 per class:
  ar 0.81/0.85/0.83  cl 0.93/0.95/0.94  co 0.90/0.86/0.88  cr 0.88/0.84/0.86
  cu 0.97/0.96/0.96  ec 0.95/0.97/0.96  es 0.89/0.93/0.91  gt 0.88/0.90/0.89
  hn 0.91/0.88/0.90  mx 0.87/0.88/0.88  ni 0.96/0.96/0.96  pa 0.90/0.91/0.90
  py 0.89/0.90/0.90  sv 0.89/0.89/0.89  uy 0.80/0.79/0.80  ve 0.89/0.86/0.88
  avg/total 0.89/0.89/0.89
Reducing feature vectors: 14465 -> 14435 features.

STARTING UNMASKING ROUND 15 (spa twitter cxg2 15), precision/recall/F1 per class:
  ar 0.81/0.84/0.83  cl 0.93/0.95/0.94  co 0.90/0.85/0.88  cr 0.87/0.83/0.85
  cu 0.97/0.96/0.96  ec 0.96/0.97/0.96  es 0.89/0.93/0.91  gt 0.88/0.90/0.89
  hn 0.91/0.88/0.89  mx 0.87/0.88/0.87  ni 0.96/0.95/0.96  pa 0.89/0.90/0.90
  py 0.89/0.90/0.89  sv 0.89/0.89/0.89  uy 0.80/0.79/0.80  ve 0.89/0.86/0.87
  avg/total 0.89/0.89/0.89
Reducing feature vectors.
Features reduced: 14435 -> 14403.

STARTING UNMASKING ROUND 16 (spa twitter cxg2 16), precision/recall/F1 per class:
  ar 0.81/0.84/0.83  cl 0.93/0.95/0.94  co 0.89/0.85/0.87  cr 0.87/0.83/0.85
  cu 0.97/0.96/0.96  ec 0.95/0.97/0.96  es 0.89/0.92/0.91  gt 0.88/0.90/0.89
  hn 0.91/0.88/0.89  mx 0.87/0.87/0.87  ni 0.96/0.95/0.96  pa 0.89/0.90/0.89
  py 0.89/0.90/0.89  sv 0.89/0.89/0.89  uy 0.80/0.79/0.80  ve 0.89/0.86/0.87
  avg/total 0.89/0.89/0.89
Reducing feature vectors: 14403 -> 14373 features.

STARTING UNMASKING ROUND 17 (spa twitter cxg2 17), precision/recall/F1 per class:
  ar 0.81/0.84/0.82  cl 0.93/0.95/0.94  co 0.89/0.85/0.87  cr 0.87/0.84/0.85
  cu 0.97/0.95/0.96  ec 0.95/0.96/0.96  es 0.89/0.92/0.91  gt 0.88/0.89/0.88
  hn 0.91/0.88/0.89  mx 0.87/0.87/0.87  ni 0.96/0.95/0.96  pa 0.89/0.90/0.89
  py 0.88/0.90/0.89  sv 0.88/0.89/0.88  uy 0.80/0.79/0.79  ve 0.88/0.85/0.87
  avg/total 0.89/0.89/0.89
Reducing feature vectors: 14373 -> 14342 features.

STARTING UNMASKING ROUND 18 (spa twitter cxg2 18), precision/recall/F1 per class:
  ar 0.81/0.84/0.82  cl 0.92/0.95/0.94  co 0.89/0.85/0.87  cr 0.87/0.83/0.85
  cu 0.97/0.95/0.96  ec 0.95/0.96/0.96  es 0.89/0.92/0.91  gt 0.87/0.89/0.88
  hn 0.91/0.87/0.89  mx 0.86/0.87/0.87  ni 0.96/0.95/0.96  pa 0.89/0.90/0.89
  py 0.88/0.90/0.89  sv 0.88/0.88/0.88  uy 0.80/0.79/0.79  ve 0.88/0.85/0.86
  avg/total 0.89/0.89/0.89
Reducing feature vectors.
Features reduced: 14342 -> 14315.

STARTING UNMASKING ROUND 19 (spa twitter cxg2 19), precision/recall/F1 per class:
  ar 0.81/0.84/0.82  cl 0.92/0.94/0.93  co 0.89/0.84/0.87  cr 0.87/0.83/0.85
  cu 0.97/0.95/0.96  ec 0.95/0.96/0.96  es 0.89/0.92/0.90  gt 0.87/0.89/0.88
  hn 0.91/0.87/0.89  mx 0.86/0.87/0.86  ni 0.95/0.95/0.95  pa 0.89/0.90/0.89
  py 0.88/0.89/0.89  sv 0.88/0.88/0.88  uy 0.80/0.78/0.79  ve 0.87/0.85/0.86
  avg/total 0.89/0.89/0.89
Reducing feature vectors: 14315 -> 14284 features.

STARTING UNMASKING ROUND 20 (spa twitter cxg2 20), precision/recall/F1 per class:
  ar 0.80/0.84/0.82  cl 0.92/0.94/0.93  co 0.89/0.84/0.87  cr 0.87/0.82/0.85
  cu 0.97/0.95/0.96  ec 0.95/0.96/0.96  es 0.88/0.92/0.90  gt 0.87/0.89/0.88
  hn 0.90/0.87/0.89  mx 0.86/0.86/0.86  ni 0.95/0.95/0.95  pa 0.89/0.90/0.89
  py 0.88/0.89/0.88  sv 0.88/0.88/0.88  uy 0.79/0.78/0.79  ve 0.88/0.84/0.86
  avg/total 0.88/0.88/0.88
Reducing feature vectors: 14284 -> 14253 features.

STARTING UNMASKING ROUND 21 (spa twitter cxg2 21), precision/recall/F1 per class:
  ar 0.80/0.84/0.82  cl 0.92/0.94/0.93  co 0.88/0.84/0.86  cr 0.87/0.82/0.84
  cu 0.97/0.95/0.96  ec 0.95/0.96/0.95  es 0.88/0.92/0.90  gt 0.86/0.89/0.87
  hn 0.90/0.87/0.89  mx 0.85/0.86/0.86  ni 0.95/0.95/0.95  pa 0.88/0.89/0.89
  py 0.88/0.89/0.88  sv 0.87/0.88/0.88  uy 0.79/0.78/0.79  ve 0.87/0.84/0.86
  avg/total 0.88/0.88/0.88
Reducing feature vectors.
Features reduced: 14253 -> 14221.

STARTING UNMASKING ROUND 22 (spa twitter cxg2 22), precision/recall/F1 per class:
  ar 0.80/0.84/0.82  cl 0.92/0.94/0.93  co 0.88/0.84/0.86  cr 0.87/0.82/0.84
  cu 0.96/0.95/0.95  ec 0.95/0.96/0.96  es 0.88/0.92/0.90  gt 0.87/0.88/0.88
  hn 0.90/0.87/0.88  mx 0.85/0.86/0.86  ni 0.95/0.95/0.95  pa 0.88/0.89/0.89
  py 0.87/0.89/0.88  sv 0.87/0.88/0.88  uy 0.79/0.78/0.79  ve 0.87/0.84/0.86
  avg/total 0.88/0.88/0.88
Reducing feature vectors: 14221 -> 14191 features.

STARTING UNMASKING ROUND 23 (spa twitter cxg2 23), precision/recall/F1 per class:
  ar 0.81/0.84/0.82  cl 0.92/0.94/0.93  co 0.88/0.83/0.86  cr 0.87/0.82/0.84
  cu 0.96/0.95/0.95  ec 0.95/0.96/0.95  es 0.88/0.92/0.90  gt 0.87/0.88/0.87
  hn 0.90/0.87/0.88  mx 0.85/0.86/0.85  ni 0.95/0.95/0.95  pa 0.88/0.89/0.88
  py 0.87/0.89/0.88  sv 0.87/0.88/0.87  uy 0.79/0.78/0.79  ve 0.87/0.84/0.86
  avg/total 0.88/0.88/0.88
Reducing feature vectors: 14191 -> 14160 features.

STARTING UNMASKING ROUND 24 (spa twitter cxg2 24), precision/recall/F1 per class:
  ar 0.80/0.84/0.82  cl 0.91/0.94/0.93  co 0.88/0.83/0.85  cr 0.87/0.82/0.84
  cu 0.96/0.94/0.95  ec 0.95/0.96/0.95  es 0.88/0.92/0.90  gt 0.86/0.88/0.87
  hn 0.89/0.86/0.88  mx 0.85/0.86/0.85  ni 0.95/0.95/0.95  pa 0.88/0.88/0.88
  py 0.87/0.89/0.88  sv 0.87/0.88/0.87  uy 0.79/0.78/0.78  ve 0.87/0.84/0.85
  avg/total 0.88/0.88/0.88
Reducing feature vectors.
Features reduced: 14160 -> 14130.

STARTING UNMASKING ROUND 25 (spa twitter cxg2 25), precision/recall/F1 per class:
  ar 0.80/0.83/0.82  cl 0.91/0.94/0.93  co 0.88/0.83/0.85  cr 0.87/0.82/0.84
  cu 0.96/0.94/0.95  ec 0.94/0.96/0.95  es 0.88/0.92/0.90  gt 0.86/0.88/0.87
  hn 0.89/0.86/0.87  mx 0.85/0.86/0.85  ni 0.95/0.95/0.95  pa 0.88/0.88/0.88
  py 0.87/0.89/0.88  sv 0.87/0.88/0.87  uy 0.79/0.78/0.78  ve 0.87/0.84/0.85
  avg/total 0.88/0.88/0.88
Reducing feature vectors: 14130 -> 14100 features.

STARTING UNMASKING ROUND 26 (spa twitter cxg2 26), precision/recall/F1 per class:
  ar 0.80/0.83/0.82  cl 0.91/0.94/0.93  co 0.88/0.82/0.85  cr 0.86/0.82/0.84
  cu 0.96/0.94/0.95  ec 0.94/0.96/0.95  es 0.88/0.92/0.90  gt 0.86/0.88/0.87
  hn 0.88/0.85/0.87  mx 0.84/0.85/0.85  ni 0.95/0.95/0.95  pa 0.88/0.88/0.88
  py 0.87/0.89/0.88  sv 0.87/0.87/0.87  uy 0.79/0.77/0.78  ve 0.86/0.84/0.85
  avg/total 0.88/0.88/0.87
Reducing feature vectors: 14100 -> 14070 features.

STARTING UNMASKING ROUND 27 (spa twitter cxg2 27), precision/recall/F1 per class:
  ar 0.80/0.83/0.82  cl 0.91/0.94/0.93  co 0.87/0.82/0.85  cr 0.86/0.82/0.84
  cu 0.96/0.94/0.95  ec 0.94/0.95/0.95  es 0.88/0.92/0.89  gt 0.86/0.88/0.87
  hn 0.89/0.85/0.87  mx 0.84/0.85/0.85  ni 0.95/0.95/0.95  pa 0.88/0.88/0.88
  py 0.87/0.89/0.88  sv 0.87/0.87/0.87  uy 0.79/0.77/0.78  ve 0.86/0.83/0.85
  avg/total 0.87/0.87/0.87
Reducing feature vectors.
Features reduced: 14070 -> 14042.

STARTING UNMASKING ROUND 28 (spa twitter cxg2 28), precision/recall/F1 per class:
  ar 0.80/0.83/0.81  cl 0.91/0.94/0.93  co 0.87/0.82/0.84  cr 0.86/0.82/0.84
  cu 0.96/0.94/0.95  ec 0.94/0.95/0.95  es 0.88/0.91/0.89  gt 0.85/0.88/0.87
  hn 0.88/0.85/0.87  mx 0.84/0.85/0.85  ni 0.95/0.95/0.95  pa 0.87/0.88/0.88
  py 0.87/0.89/0.88  sv 0.86/0.87/0.87  uy 0.79/0.77/0.78  ve 0.86/0.83/0.85
  avg/total 0.87/0.87/0.87
Reducing feature vectors: 14042 -> 14013 features.

STARTING UNMASKING ROUND 29 (spa twitter cxg2 29), precision/recall/F1 per class:
  ar 0.80/0.83/0.81  cl 0.91/0.94/0.92  co 0.87/0.82/0.84  cr 0.86/0.81/0.84
  cu 0.96/0.94/0.95  ec 0.94/0.95/0.95  es 0.87/0.91/0.89  gt 0.85/0.88/0.86
  hn 0.88/0.85/0.87  mx 0.84/0.85/0.85  ni 0.94/0.95/0.95  pa 0.87/0.88/0.87
  py 0.87/0.88/0.88  sv 0.86/0.87/0.86  uy 0.78/0.77/0.78  ve 0.86/0.83/0.84
  avg/total 0.87/0.87/0.87
Reducing feature vectors: 14013 -> 13981 features.

STARTING UNMASKING ROUND 30 (spa twitter cxg2 30), precision/recall/F1 per class:
  ar 0.80/0.83/0.81  cl 0.91/0.94/0.92  co 0.87/0.82/0.84  cr 0.86/0.81/0.84
  cu 0.96/0.94/0.95  ec 0.94/0.95/0.95  es 0.87/0.91/0.89  gt 0.85/0.88/0.86
  hn 0.88/0.85/0.86  mx 0.84/0.85/0.84  ni 0.95/0.95/0.95  pa 0.87/0.88/0.87
  py 0.87/0.88/0.87  sv 0.86/0.87/0.86  uy 0.78/0.77/0.77  ve 0.86/0.83/0.84
  avg/total 0.87/0.87/0.87
Reducing feature vectors.
Features reduced: 13981 -> 13949.

STARTING UNMASKING ROUND 31 (spa twitter cxg2 31), precision/recall/F1 per class:
  ar 0.80/0.83/0.81  cl 0.91/0.94/0.92  co 0.87/0.81/0.84  cr 0.86/0.81/0.83
  cu 0.96/0.94/0.95  ec 0.94/0.95/0.95  es 0.87/0.91/0.89  gt 0.85/0.88/0.86
  hn 0.88/0.85/0.86  mx 0.84/0.85/0.84  ni 0.94/0.95/0.95  pa 0.87/0.88/0.87
  py 0.87/0.88/0.87  sv 0.86/0.87/0.86  uy 0.78/0.77/0.77  ve 0.85/0.83/0.84
  avg/total 0.87/0.87/0.87
Reducing feature vectors: 13949 -> 13918 features.

STARTING UNMASKING ROUND 32 (spa twitter cxg2 32), precision/recall/F1 per class:
  ar 0.79/0.83/0.81  cl 0.91/0.94/0.92  co 0.86/0.81/0.84  cr 0.86/0.81/0.83
  cu 0.95/0.94/0.94  ec 0.94/0.95/0.94  es 0.87/0.91/0.89  gt 0.85/0.87/0.86
  hn 0.88/0.85/0.86  mx 0.83/0.85/0.84  ni 0.94/0.95/0.95  pa 0.87/0.88/0.87
  py 0.87/0.88/0.87  sv 0.86/0.86/0.86  uy 0.78/0.76/0.77  ve 0.85/0.83/0.84
  avg/total 0.87/0.87/0.87
Reducing feature vectors: 13918 -> 13887 features.

STARTING UNMASKING ROUND 33 (spa twitter cxg2 33), precision/recall/F1 per class:
  ar 0.79/0.83/0.81  cl 0.90/0.93/0.92  co 0.86/0.81/0.84  cr 0.86/0.81/0.83
  cu 0.95/0.93/0.94  ec 0.94/0.95/0.94  es 0.87/0.91/0.89  gt 0.85/0.87/0.86
  hn 0.88/0.85/0.86  mx 0.83/0.84/0.84  ni 0.94/0.95/0.94  pa 0.86/0.87/0.87
  py 0.87/0.88/0.87  sv 0.86/0.86/0.86  uy 0.78/0.76/0.77  ve 0.85/0.82/0.84
  avg/total 0.87/0.87/0.87
Reducing feature vectors.
Features reduced: 13887 -> 13855.

STARTING UNMASKING ROUND 34 (spa twitter cxg2 34), precision/recall/F1 per class:
  ar 0.79/0.83/0.81  cl 0.90/0.93/0.92  co 0.86/0.81/0.83  cr 0.86/0.81/0.83
  cu 0.95/0.93/0.94  ec 0.94/0.95/0.94  es 0.87/0.91/0.89  gt 0.85/0.87/0.86
  hn 0.88/0.85/0.86  mx 0.83/0.84/0.84  ni 0.95/0.95/0.95  pa 0.86/0.87/0.86
  py 0.87/0.88/0.87  sv 0.86/0.86/0.86  uy 0.78/0.76/0.77  ve 0.85/0.82/0.84
  avg/total 0.87/0.87/0.86
Reducing feature vectors: 13855 -> 13823 features.

STARTING UNMASKING ROUND 35 (spa twitter cxg2 35), precision/recall/F1 per class:
  ar 0.79/0.82/0.80  cl 0.90/0.93/0.92  co 0.86/0.81/0.83  cr 0.86/0.80/0.83
  cu 0.95/0.93/0.94  ec 0.93/0.95/0.94  es 0.87/0.90/0.89  gt 0.84/0.87/0.86
  hn 0.88/0.84/0.86  mx 0.83/0.84/0.84  ni 0.94/0.95/0.94  pa 0.86/0.87/0.87
  py 0.86/0.88/0.87  sv 0.86/0.86/0.86  uy 0.77/0.75/0.76  ve 0.85/0.82/0.84
  avg/total 0.86/0.86/0.86
Reducing feature vectors: 13823 -> 13792 features.

STARTING UNMASKING ROUND 36 (spa twitter cxg2 36), precision/recall/F1 per class:
  ar 0.78/0.82/0.80  cl 0.90/0.93/0.92  co 0.86/0.81/0.83  cr 0.85/0.80/0.83
  cu 0.95/0.93/0.94  ec 0.93/0.95/0.94  es 0.87/0.90/0.88  gt 0.84/0.87/0.85
  hn 0.88/0.84/0.86  mx 0.83/0.84/0.84  ni 0.94/0.94/0.94  pa 0.86/0.87/0.87
  py 0.86/0.88/0.87  sv 0.85/0.86/0.86  uy 0.77/0.75/0.76  ve 0.85/0.82/0.84
  avg/total 0.86/0.86/0.86
Reducing feature vectors.
Features reduced: 13792 -> 13760.

STARTING UNMASKING ROUND 37 (spa twitter cxg2 37), precision/recall/F1 per class:
  ar 0.78/0.82/0.80  cl 0.90/0.93/0.92  co 0.86/0.81/0.83  cr 0.85/0.80/0.82
  cu 0.95/0.93/0.94  ec 0.93/0.95/0.94  es 0.86/0.90/0.88  gt 0.84/0.86/0.85
  hn 0.88/0.84/0.86  mx 0.83/0.84/0.84  ni 0.94/0.94/0.94  pa 0.86/0.87/0.86
  py 0.86/0.87/0.87  sv 0.85/0.86/0.85  uy 0.76/0.75/0.76  ve 0.85/0.82/0.83
  avg/total 0.86/0.86/0.86
Reducing feature vectors: 13760 -> 13729 features.

STARTING UNMASKING ROUND 38 (spa twitter cxg2 38), precision/recall/F1 per class:
  ar 0.78/0.82/0.80  cl 0.90/0.93/0.92  co 0.86/0.81/0.83  cr 0.85/0.80/0.82
  cu 0.95/0.93/0.94  ec 0.93/0.95/0.94  es 0.86/0.90/0.88  gt 0.84/0.86/0.85
  hn 0.88/0.84/0.86  mx 0.83/0.84/0.84  ni 0.94/0.94/0.94  pa 0.86/0.87/0.86
  py 0.86/0.87/0.87  sv 0.85/0.86/0.85  uy 0.76/0.75/0.75  ve 0.84/0.82/0.83
  avg/total 0.86/0.86/0.86
Reducing feature vectors: 13729 -> 13699 features.

STARTING UNMASKING ROUND 39 (spa twitter cxg2 39), precision/recall/F1 per class:
  ar 0.78/0.82/0.80  cl 0.90/0.93/0.92  co 0.86/0.81/0.83  cr 0.85/0.80/0.82
  cu 0.95/0.93/0.94  ec 0.93/0.95/0.94  es 0.86/0.90/0.88  gt 0.84/0.86/0.85
  hn 0.88/0.84/0.86  mx 0.83/0.84/0.83  ni 0.94/0.94/0.94  pa 0.86/0.87/0.86
  py 0.86/0.87/0.86  sv 0.85/0.86/0.85  uy 0.76/0.75/0.75  ve 0.84/0.82/0.83
  avg/total 0.86/0.86/0.86
Reducing feature vectors.
Features reduced: 13699 -> 13667.

STARTING UNMASKING ROUND 40 (spa twitter cxg2 40), precision/recall/F1 per class:
  ar 0.78/0.82/0.80  cl 0.90/0.93/0.91  co 0.85/0.80/0.83  cr 0.84/0.80/0.82
  cu 0.95/0.93/0.94  ec 0.93/0.95/0.94  es 0.86/0.90/0.88  gt 0.84/0.86/0.85
  hn 0.87/0.84/0.85  mx 0.83/0.84/0.83  ni 0.94/0.94/0.94  pa 0.85/0.86/0.86
  py 0.86/0.87/0.86  sv 0.85/0.86/0.85  uy 0.76/0.74/0.75  ve 0.84/0.81/0.83
  avg/total 0.86/0.86/0.86
Reducing feature vectors: 13667 -> 13639 features.

STARTING UNMASKING ROUND 41 (spa twitter cxg2 41), precision/recall/F1 per class:
  ar 0.78/0.82/0.80  cl 0.90/0.92/0.91  co 0.85/0.80/0.83  cr 0.84/0.80/0.82
  cu 0.95/0.93/0.94  ec 0.93/0.95/0.94  es 0.86/0.90/0.88  gt 0.84/0.85/0.85
  hn 0.87/0.84/0.85  mx 0.83/0.84/0.83  ni 0.94/0.94/0.94  pa 0.85/0.86/0.86
  py 0.86/0.87/0.86  sv 0.85/0.86/0.85  uy 0.76/0.74/0.75  ve 0.84/0.81/0.83
  avg/total 0.86/0.86/0.86
Reducing feature vectors: 13639 -> 13607 features.

STARTING UNMASKING ROUND 42 (spa twitter cxg2 42), precision/recall/F1 per class:
  ar 0.78/0.81/0.79  cl 0.90/0.92/0.91  co 0.85/0.80/0.83  cr 0.84/0.80/0.82
  cu 0.95/0.93/0.94  ec 0.93/0.95/0.94  es 0.86/0.90/0.88  gt 0.84/0.85/0.84
  hn 0.87/0.84/0.85  mx 0.82/0.84/0.83  ni 0.93/0.94/0.94  pa 0.85/0.86/0.86
  py 0.86/0.87/0.86  sv 0.84/0.85/0.85  uy 0.76/0.74/0.75  ve 0.84/0.81/0.83
  avg/total 0.86/0.86/0.86
Reducing feature vectors.
Features reduced: 13607 -> 13575.

STARTING UNMASKING ROUND 43 (spa twitter cxg2 43), precision/recall/F1 per class:
  ar 0.77/0.81/0.79  cl 0.90/0.92/0.91  co 0.85/0.80/0.83  cr 0.84/0.79/0.82
  cu 0.95/0.93/0.94  ec 0.93/0.95/0.94  es 0.86/0.90/0.88  gt 0.84/0.85/0.84
  hn 0.87/0.83/0.85  mx 0.82/0.84/0.83  ni 0.93/0.94/0.94  pa 0.85/0.86/0.86
  py 0.86/0.87/0.86  sv 0.84/0.85/0.85  uy 0.76/0.74/0.75  ve 0.84/0.81/0.82
  avg/total 0.85/0.85/0.85
Reducing feature vectors: 13575 -> 13544 features.

STARTING UNMASKING ROUND 44 (spa twitter cxg2 44), precision/recall/F1 per class:
  ar 0.78/0.81/0.79  cl 0.90/0.92/0.91  co 0.85/0.80/0.82  cr 0.84/0.79/0.82
  cu 0.95/0.93/0.94  ec 0.93/0.94/0.94  es 0.86/0.90/0.88  gt 0.83/0.85/0.84
  hn 0.87/0.83/0.85  mx 0.82/0.83/0.83  ni 0.93/0.94/0.94  pa 0.85/0.86/0.86
  py 0.85/0.87/0.86  sv 0.84/0.85/0.85  uy 0.76/0.74/0.75  ve 0.83/0.81/0.82
  avg/total 0.85/0.85/0.85
Reducing feature vectors: 13544 -> 13513 features.

STARTING UNMASKING ROUND 45 (spa twitter cxg2 45), precision/recall/F1 per class:
  ar 0.77/0.81/0.79  cl 0.90/0.92/0.91  co 0.85/0.79/0.82  cr 0.84/0.79/0.82
  cu 0.95/0.92/0.94  ec 0.92/0.94/0.93  es 0.85/0.89/0.87  gt 0.83/0.85/0.84
  hn 0.87/0.83/0.85  mx 0.82/0.83/0.82  ni 0.93/0.94/0.93  pa 0.85/0.86/0.85
  py 0.85/0.87/0.86  sv 0.84/0.85/0.84  uy 0.76/0.74/0.75  ve 0.83/0.80/0.82
  avg/total 0.85/0.85/0.85
Reducing feature vectors.
Features reduced: 13513 -> 13484.

STARTING UNMASKING ROUND 46 (spa twitter cxg2 46), precision/recall/F1 per class:
  ar 0.77/0.81/0.79  cl 0.90/0.92/0.91  co 0.84/0.79/0.82  cr 0.84/0.79/0.81
  cu 0.95/0.92/0.93  ec 0.92/0.94/0.93  es 0.85/0.89/0.87  gt 0.83/0.85/0.84
  hn 0.86/0.82/0.84  mx 0.81/0.83/0.82  ni 0.93/0.94/0.93  pa 0.85/0.85/0.85
  py 0.85/0.87/0.86  sv 0.83/0.85/0.84  uy 0.76/0.74/0.75  ve 0.82/0.80/0.81
  avg/total 0.85/0.85/0.85
Reducing feature vectors: 13484 -> 13455 features.

STARTING UNMASKING ROUND 47 (spa twitter cxg2 47), precision/recall/F1 per class:
  ar 0.77/0.81/0.79  cl 0.89/0.92/0.90  co 0.84/0.79/0.82  cr 0.84/0.79/0.81
  cu 0.95/0.92/0.93  ec 0.92/0.94/0.93  es 0.85/0.89/0.87  gt 0.83/0.85/0.84
  hn 0.86/0.81/0.83  mx 0.82/0.83/0.82  ni 0.93/0.94/0.93  pa 0.84/0.85/0.85
  py 0.85/0.87/0.86  sv 0.84/0.85/0.84  uy 0.76/0.74/0.75  ve 0.83/0.80/0.81
  avg/total 0.85/0.85/0.85
Reducing feature vectors: 13455 -> 13424 features.

STARTING UNMASKING ROUND 48 (spa twitter cxg2 48), precision/recall/F1 per class:
  ar 0.77/0.81/0.79  cl 0.89/0.92/0.90  co 0.84/0.79/0.81  cr 0.84/0.79/0.81
  cu 0.94/0.92/0.93  ec 0.92/0.94/0.93  es 0.85/0.89/0.87  gt 0.83/0.85/0.84
  hn 0.85/0.81/0.83  mx 0.82/0.83/0.82  ni 0.93/0.94/0.93  pa 0.85/0.85/0.85
  py 0.85/0.87/0.86  sv 0.83/0.85/0.84  uy 0.75/0.74/0.74  ve 0.82/0.80/0.81
  avg/total 0.85/0.85/0.85
Reducing feature vectors.
Features reduced: 13424 -> 13392.

STARTING UNMASKING ROUND 49 (spa twitter cxg2 49), precision/recall/F1 per class:
  ar 0.77/0.80/0.79  cl 0.89/0.92/0.90  co 0.84/0.78/0.81  cr 0.84/0.79/0.81
  cu 0.94/0.92/0.93  ec 0.92/0.94/0.93  es 0.85/0.89/0.87  gt 0.83/0.85/0.84
  hn 0.85/0.81/0.83  mx 0.82/0.83/0.82  ni 0.93/0.94/0.93  pa 0.85/0.85/0.85
  py 0.85/0.87/0.86  sv 0.83/0.84/0.84  uy 0.75/0.74/0.74  ve 0.82/0.80/0.81
  avg/total 0.85/0.85/0.85
Reducing feature vectors: 13392 -> 13360 features.

STARTING UNMASKING ROUND 50 (spa twitter cxg2 50), precision/recall/F1 per class:
  ar 0.77/0.80/0.79  cl 0.89/0.92/0.90  co 0.84/0.78/0.81  cr 0.83/0.78/0.81
  cu 0.94/0.92/0.93  ec 0.92/0.94/0.93  es 0.85/0.89/0.87  gt 0.83/0.84/0.84
  hn 0.85/0.81/0.83  mx 0.81/0.83/0.82  ni 0.92/0.94/0.93  pa 0.85/0.85/0.85
  py 0.85/0.87/0.86  sv 0.83/0.84/0.84  uy 0.75/0.73/0.74  ve 0.83/0.79/0.81
  avg/total 0.85/0.85/0.84
Reducing feature vectors: 13360 -> 13328 features.

STARTING UNMASKING ROUND 51 (spa twitter cxg2 51), precision/recall/F1 per class:
  ar 0.77/0.80/0.78  cl 0.89/0.92/0.90  co 0.83/0.78/0.81  cr 0.83/0.79/0.81
  cu 0.94/0.92/0.93  ec 0.91/0.94/0.93  es 0.85/0.89/0.87  gt 0.83/0.84/0.84
  hn 0.85/0.81/0.83  mx 0.81/0.83/0.82  ni 0.92/0.93/0.93  pa 0.85/0.85/0.85
  py 0.85/0.86/0.86  sv 0.83/0.84/0.83  uy 0.75/0.73/0.74  ve 0.82/0.79/0.81
  avg/total 0.84/0.84/0.84
Reducing feature vectors.
Features reduced: 13328 -> 13296.

STARTING UNMASKING ROUND 52 (spa twitter cxg2 52), precision/recall/F1 per class:
  ar 0.77/0.80/0.78  cl 0.89/0.91/0.90  co 0.83/0.78/0.81  cr 0.83/0.79/0.81
  cu 0.94/0.92/0.93  ec 0.91/0.94/0.93  es 0.85/0.89/0.87  gt 0.83/0.84/0.83
  hn 0.85/0.80/0.82  mx 0.81/0.82/0.82  ni 0.92/0.93/0.93  pa 0.85/0.85/0.85
  py 0.85/0.86/0.85  sv 0.83/0.84/0.83  uy 0.75/0.73/0.74  ve 0.82/0.79/0.81
  avg/total 0.84/0.84/0.84
Reducing feature vectors: 13296 -> 13266 features.

STARTING UNMASKING ROUND 53 (spa twitter cxg2 53), precision/recall/F1 per class:
  ar 0.77/0.80/0.78  cl 0.89/0.91/0.90  co 0.83/0.78/0.80  cr 0.83/0.78/0.81
  cu 0.94/0.92/0.93  ec 0.91/0.94/0.93  es 0.85/0.88/0.86  gt 0.82/0.84/0.83
  hn 0.84/0.80/0.82  mx 0.80/0.82/0.81  ni 0.92/0.93/0.93  pa 0.84/0.85/0.85
  py 0.85/0.86/0.85  sv 0.83/0.84/0.83  uy 0.75/0.73/0.74  ve 0.82/0.79/0.81
  avg/total 0.84/0.84/0.84
Reducing feature vectors: 13266 -> 13234 features.

STARTING UNMASKING ROUND 54 (spa twitter cxg2 54), precision/recall/F1 per class:
  ar 0.76/0.80/0.78  cl 0.88/0.91/0.90  co 0.83/0.77/0.80  cr 0.83/0.78/0.80
  cu 0.94/0.92/0.93  ec 0.91/0.94/0.93  es 0.84/0.88/0.86  gt 0.82/0.84/0.83
  hn 0.84/0.79/0.82  mx 0.80/0.81/0.81  ni 0.92/0.93/0.93  pa 0.84/0.84/0.84
  py 0.84/0.86/0.85  sv 0.83/0.84/0.83  uy 0.74/0.72/0.73  ve 0.82/0.79/0.80
  avg/total 0.84/0.84/0.84
Reducing feature vectors.
Features reduced: 13234 -> 13202.

STARTING UNMASKING ROUND 55 (spa twitter cxg2 55), precision/recall/F1 per class:
  ar 0.76/0.80/0.78  cl 0.88/0.91/0.89  co 0.82/0.77/0.80  cr 0.83/0.78/0.80
  cu 0.94/0.92/0.93  ec 0.91/0.94/0.92  es 0.84/0.88/0.86  gt 0.81/0.84/0.83
  hn 0.84/0.79/0.81  mx 0.78/0.79/0.79  ni 0.92/0.93/0.93  pa 0.84/0.84/0.84
  py 0.84/0.86/0.85  sv 0.83/0.84/0.83  uy 0.74/0.72/0.73  ve 0.82/0.79/0.80
  avg/total 0.84/0.84/0.84
Reducing feature vectors: 13202 -> 13172 features.

STARTING UNMASKING ROUND 56 (spa twitter cxg2 56), precision/recall/F1 per class:
  ar 0.76/0.80/0.78  cl 0.88/0.90/0.89  co 0.82/0.77/0.80  cr 0.83/0.78/0.80
  cu 0.94/0.92/0.93  ec 0.91/0.94/0.92  es 0.84/0.88/0.86  gt 0.81/0.83/0.82
  hn 0.84/0.79/0.81  mx 0.78/0.79/0.79  ni 0.92/0.93/0.93  pa 0.83/0.84/0.84
  py 0.84/0.86/0.85  sv 0.82/0.83/0.83  uy 0.74/0.72/0.73  ve 0.82/0.79/0.80
  avg/total 0.83/0.83/0.83
Reducing feature vectors: 13172 -> 13141 features.

STARTING UNMASKING ROUND 57 (spa twitter cxg2 57), precision/recall/F1 per class:
  ar 0.76/0.80/0.78  cl 0.88/0.90/0.89  co 0.82/0.77/0.79  cr 0.83/0.78/0.80
  cu 0.94/0.92/0.93  ec 0.91/0.94/0.92  es 0.84/0.88/0.86  gt 0.81/0.83/0.82
  hn 0.84/0.79/0.81  mx 0.78/0.79/0.78  ni 0.92/0.93/0.93  pa 0.83/0.84/0.83
  py 0.84/0.85/0.85  sv 0.82/0.83/0.83  uy 0.74/0.72/0.73  ve 0.81/0.78/0.80
  avg/total 0.83/0.83/0.83
Reducing feature vectors.
(356449, 13141) (76653, 13141) (356449, 13109) (76653, 13109)

STARTING UNMASKING ROUND 58
spa twitter cxg2 58

class  precision  recall  f1-score  support
ar  0.76  0.80  0.78  5000
cl  0.88  0.90  0.89  5000
co  0.82  0.77  0.79  4853
cr  0.83  0.77  0.80  4967
cu  0.94  0.92  0.93  3503
ec  0.91  0.94  0.92  5000
es  0.83  0.88  0.86  5000
gt  0.81  0.83  0.82  5000
hn  0.84  0.79  0.81  3780
mx  0.78  0.79  0.78  5000
ni  0.92  0.93  0.92  4550
pa  0.83  0.84  0.83  5000
py  0.84  0.85  0.84  5000
sv  0.82  0.83  0.83  5000
uy  0.74  0.72  0.73  5000
ve  0.81  0.78  0.80  5000
avg / total  0.83  0.83  0.83  76653

Reducing feature vectors.

(356449, 13109) (76653, 13109) (356449, 13077) (76653, 13077)

STARTING UNMASKING ROUND 59
spa twitter cxg2 59

class  precision  recall  f1-score  support
ar  0.76  0.79  0.78  5000
cl  0.87  0.90  0.89  5000
co  0.82  0.77  0.79  4853
cr  0.83  0.77  0.80  4967
cu  0.94  0.91  0.93  3503
ec  0.91  0.94  0.92  5000
es  0.83  0.88  0.85  5000
gt  0.81  0.83  0.82  5000
hn  0.84  0.79  0.81  3780
mx  0.77  0.79  0.78  5000
ni  0.92  0.93  0.92  4550
pa  0.83  0.84  0.83  5000
py  0.84  0.85  0.84  5000
sv  0.82  0.83  0.82  5000
uy  0.74  0.72  0.73  5000
ve  0.81  0.78  0.80  5000
avg / total  0.83  0.83  0.83  76653

Reducing feature vectors.

(356449, 13077) (76653, 13077) (356449, 13045) (76653, 13045)

STARTING UNMASKING ROUND 60
spa twitter cxg2 60

class  precision  recall  f1-score  support
ar  0.76  0.79  0.78  5000
cl  0.87  0.90  0.89  5000
co  0.82  0.76  0.79  4853
cr  0.82  0.77  0.80  4967
cu  0.94  0.91  0.93  3503
ec  0.91  0.94  0.92  5000
es  0.83  0.87  0.85  5000
gt  0.81  0.83  0.82  5000
hn  0.84  0.79  0.81  3780
mx  0.77  0.79  0.78  5000
ni  0.92  0.93  0.92  4550
pa  0.82  0.83  0.83  5000
py  0.84  0.85  0.84  5000
sv  0.82  0.83  0.82  5000
uy  0.74  0.72  0.73  5000
ve  0.81  0.78  0.79  5000
avg / total  0.83  0.83  0.83  76653

Reducing feature vectors.
(356449, 13045) (76653, 13045) (356449, 13013) (76653, 13013)

STARTING UNMASKING ROUND 61
spa twitter cxg2 61

class  precision  recall  f1-score  support
ar  0.76  0.79  0.78  5000
cl  0.87  0.90  0.89  5000
co  0.82  0.76  0.79  4853
cr  0.82  0.77  0.79  4967
cu  0.94  0.91  0.93  3503
ec  0.90  0.93  0.92  5000
es  0.83  0.87  0.85  5000
gt  0.81  0.83  0.82  5000
hn  0.84  0.79  0.81  3780
mx  0.77  0.78  0.78  5000
ni  0.92  0.93  0.92  4550
pa  0.82  0.83  0.83  5000
py  0.83  0.85  0.84  5000
sv  0.82  0.82  0.82  5000
uy  0.74  0.72  0.73  5000
ve  0.81  0.78  0.79  5000
avg / total  0.83  0.83  0.83  76653

Reducing feature vectors.

(356449, 13013) (76653, 13013) (356449, 12984) (76653, 12984)

STARTING UNMASKING ROUND 62
spa twitter cxg2 62

class  precision  recall  f1-score  support
ar  0.76  0.79  0.77  5000
cl  0.87  0.90  0.88  5000
co  0.82  0.76  0.79  4853
cr  0.83  0.77  0.80  4967
cu  0.94  0.91  0.92  3503
ec  0.90  0.93  0.92  5000
es  0.83  0.87  0.85  5000
gt  0.80  0.83  0.82  5000
hn  0.83  0.78  0.81  3780
mx  0.77  0.78  0.78  5000
ni  0.91  0.93  0.92  4550
pa  0.82  0.83  0.83  5000
py  0.83  0.85  0.84  5000
sv  0.81  0.82  0.82  5000
uy  0.73  0.72  0.72  5000
ve  0.81  0.77  0.79  5000
avg / total  0.83  0.83  0.83  76653

Reducing feature vectors.

(356449, 12984) (76653, 12984) (356449, 12952) (76653, 12952)

STARTING UNMASKING ROUND 63
spa twitter cxg2 63

class  precision  recall  f1-score  support
ar  0.76  0.79  0.77  5000
cl  0.87  0.90  0.88  5000
co  0.82  0.76  0.79  4853
cr  0.82  0.77  0.79  4967
cu  0.94  0.91  0.93  3503
ec  0.90  0.94  0.92  5000
es  0.83  0.87  0.85  5000
gt  0.80  0.83  0.82  5000
hn  0.83  0.78  0.81  3780
mx  0.77  0.78  0.77  5000
ni  0.91  0.93  0.92  4550
pa  0.82  0.83  0.83  5000
py  0.83  0.85  0.84  5000
sv  0.81  0.82  0.82  5000
uy  0.73  0.71  0.72  5000
ve  0.81  0.77  0.79  5000
avg / total  0.83  0.83  0.83  76653

Reducing feature vectors.
(356449, 12952) (76653, 12952) (356449, 12922) (76653, 12922)

STARTING UNMASKING ROUND 64
spa twitter cxg2 64

class  precision  recall  f1-score  support
ar  0.76  0.79  0.77  5000
cl  0.87  0.90  0.88  5000
co  0.81  0.76  0.78  4853
cr  0.82  0.77  0.79  4967
cu  0.94  0.91  0.92  3503
ec  0.90  0.93  0.92  5000
es  0.83  0.87  0.85  5000
gt  0.80  0.82  0.81  5000
hn  0.83  0.78  0.80  3780
mx  0.76  0.78  0.77  5000
ni  0.91  0.93  0.92  4550
pa  0.82  0.83  0.83  5000
py  0.83  0.85  0.84  5000
sv  0.81  0.82  0.82  5000
uy  0.73  0.71  0.72  5000
ve  0.81  0.77  0.79  5000
avg / total  0.82  0.82  0.82  76653

Reducing feature vectors.

(356449, 12922) (76653, 12922) (356449, 12890) (76653, 12890)

STARTING UNMASKING ROUND 65
spa twitter cxg2 65

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.87  0.90  0.88  5000
co  0.81  0.76  0.78  4853
cr  0.82  0.76  0.79  4967
cu  0.94  0.91  0.92  3503
ec  0.90  0.93  0.92  5000
es  0.83  0.87  0.85  5000
gt  0.80  0.82  0.81  5000
hn  0.83  0.78  0.80  3780
mx  0.76  0.78  0.77  5000
ni  0.91  0.93  0.92  4550
pa  0.82  0.83  0.82  5000
py  0.83  0.85  0.84  5000
sv  0.81  0.82  0.82  5000
uy  0.73  0.70  0.72  5000
ve  0.81  0.77  0.79  5000
avg / total  0.82  0.82  0.82  76653

Reducing feature vectors.

(356449, 12890) (76653, 12890) (356449, 12860) (76653, 12860)

STARTING UNMASKING ROUND 66
spa twitter cxg2 66

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.87  0.90  0.88  5000
co  0.81  0.76  0.78  4853
cr  0.82  0.76  0.79  4967
cu  0.94  0.91  0.92  3503
ec  0.90  0.93  0.91  5000
es  0.83  0.87  0.85  5000
gt  0.80  0.82  0.81  5000
hn  0.83  0.78  0.80  3780
mx  0.76  0.78  0.77  5000
ni  0.91  0.93  0.92  4550
pa  0.82  0.83  0.82  5000
py  0.83  0.85  0.84  5000
sv  0.81  0.82  0.82  5000
uy  0.72  0.70  0.71  5000
ve  0.81  0.77  0.79  5000
avg / total  0.82  0.82  0.82  76653

Reducing feature vectors.
(356449, 12860) (76653, 12860) (356449, 12831) (76653, 12831)

STARTING UNMASKING ROUND 67
spa twitter cxg2 67

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.87  0.90  0.88  5000
co  0.81  0.75  0.78  4853
cr  0.82  0.76  0.79  4967
cu  0.93  0.91  0.92  3503
ec  0.90  0.93  0.91  5000
es  0.83  0.87  0.85  5000
gt  0.80  0.82  0.81  5000
hn  0.83  0.78  0.80  3780
mx  0.76  0.78  0.77  5000
ni  0.91  0.92  0.92  4550
pa  0.82  0.83  0.82  5000
py  0.83  0.85  0.84  5000
sv  0.81  0.82  0.82  5000
uy  0.72  0.71  0.71  5000
ve  0.80  0.77  0.79  5000
avg / total  0.82  0.82  0.82  76653

Reducing feature vectors.

(356449, 12831) (76653, 12831) (356449, 12800) (76653, 12800)

STARTING UNMASKING ROUND 68
spa twitter cxg2 68

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.87  0.89  0.88  5000
co  0.81  0.75  0.78  4853
cr  0.82  0.76  0.79  4967
cu  0.93  0.91  0.92  3503
ec  0.90  0.93  0.91  5000
es  0.83  0.87  0.85  5000
gt  0.80  0.82  0.81  5000
hn  0.82  0.77  0.80  3780
mx  0.76  0.78  0.77  5000
ni  0.91  0.92  0.92  4550
pa  0.82  0.83  0.82  5000
py  0.82  0.84  0.83  5000
sv  0.81  0.82  0.81  5000
uy  0.72  0.70  0.71  5000
ve  0.80  0.76  0.78  5000
avg / total  0.82  0.82  0.82  76653

Reducing feature vectors.

(356449, 12800) (76653, 12800) (356449, 12768) (76653, 12768)

STARTING UNMASKING ROUND 69
spa twitter cxg2 69

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.87  0.89  0.88  5000
co  0.81  0.75  0.78  4853
cr  0.82  0.76  0.79  4967
cu  0.93  0.91  0.92  3503
ec  0.90  0.93  0.91  5000
es  0.83  0.87  0.85  5000
gt  0.80  0.82  0.81  5000
hn  0.82  0.77  0.80  3780
mx  0.76  0.78  0.77  5000
ni  0.91  0.92  0.92  4550
pa  0.81  0.82  0.82  5000
py  0.83  0.84  0.83  5000
sv  0.81  0.81  0.81  5000
uy  0.72  0.70  0.71  5000
ve  0.80  0.76  0.78  5000
avg / total  0.82  0.82  0.82  76653

Reducing feature vectors.
(356449, 12768) (76653, 12768) (356449, 12737) (76653, 12737)

STARTING UNMASKING ROUND 70
spa twitter cxg2 70

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.87  0.89  0.88  5000
co  0.81  0.75  0.78  4853
cr  0.82  0.76  0.79  4967
cu  0.93  0.91  0.92  3503
ec  0.90  0.93  0.91  5000
es  0.83  0.87  0.85  5000
gt  0.79  0.82  0.81  5000
hn  0.82  0.77  0.79  3780
mx  0.76  0.78  0.77  5000
ni  0.91  0.92  0.92  4550
pa  0.81  0.82  0.82  5000
py  0.82  0.84  0.83  5000
sv  0.81  0.81  0.81  5000
uy  0.72  0.70  0.71  5000
ve  0.80  0.76  0.78  5000
avg / total  0.82  0.82  0.82  76653

Reducing feature vectors.

(356449, 12737) (76653, 12737) (356449, 12705) (76653, 12705)

STARTING UNMASKING ROUND 71
spa twitter cxg2 71

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.87  0.89  0.88  5000
co  0.81  0.75  0.78  4853
cr  0.81  0.76  0.78  4967
cu  0.93  0.91  0.92  3503
ec  0.90  0.93  0.91  5000
es  0.82  0.87  0.84  5000
gt  0.79  0.81  0.80  5000
hn  0.82  0.77  0.79  3780
mx  0.75  0.77  0.76  5000
ni  0.91  0.92  0.92  4550
pa  0.81  0.82  0.82  5000
py  0.83  0.84  0.83  5000
sv  0.81  0.81  0.81  5000
uy  0.72  0.70  0.71  5000
ve  0.80  0.76  0.78  5000
avg / total  0.82  0.82  0.82  76653

Reducing feature vectors.

(356449, 12705) (76653, 12705) (356449, 12675) (76653, 12675)

STARTING UNMASKING ROUND 72
spa twitter cxg2 72

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.87  0.89  0.88  5000
co  0.80  0.75  0.77  4853
cr  0.81  0.75  0.78  4967
cu  0.93  0.91  0.92  3503
ec  0.90  0.93  0.91  5000
es  0.82  0.87  0.84  5000
gt  0.79  0.81  0.80  5000
hn  0.82  0.77  0.79  3780
mx  0.76  0.77  0.76  5000
ni  0.91  0.92  0.92  4550
pa  0.81  0.82  0.82  5000
py  0.82  0.84  0.83  5000
sv  0.81  0.81  0.81  5000
uy  0.72  0.70  0.71  5000
ve  0.80  0.76  0.78  5000
avg / total  0.82  0.82  0.82  76653

Reducing feature vectors.
(356449, 12675) (76653, 12675) (356449, 12646) (76653, 12646)

STARTING UNMASKING ROUND 73
spa twitter cxg2 73

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.87  0.89  0.88  5000
co  0.81  0.75  0.77  4853
cr  0.81  0.75  0.78  4967
cu  0.93  0.91  0.92  3503
ec  0.89  0.93  0.91  5000
es  0.82  0.87  0.84  5000
gt  0.79  0.81  0.80  5000
hn  0.82  0.77  0.79  3780
mx  0.75  0.77  0.76  5000
ni  0.91  0.92  0.92  4550
pa  0.81  0.82  0.82  5000
py  0.82  0.84  0.83  5000
sv  0.81  0.81  0.81  5000
uy  0.72  0.70  0.71  5000
ve  0.79  0.76  0.77  5000
avg / total  0.82  0.82  0.82  76653

Reducing feature vectors.

(356449, 12646) (76653, 12646) (356449, 12614) (76653, 12614)

STARTING UNMASKING ROUND 74
spa twitter cxg2 74

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.87  0.89  0.88  5000
co  0.80  0.74  0.77  4853
cr  0.81  0.75  0.78  4967
cu  0.93  0.91  0.92  3503
ec  0.90  0.93  0.91  5000
es  0.82  0.87  0.84  5000
gt  0.79  0.81  0.80  5000
hn  0.82  0.77  0.79  3780
mx  0.75  0.77  0.76  5000
ni  0.91  0.92  0.92  4550
pa  0.81  0.82  0.81  5000
py  0.82  0.84  0.83  5000
sv  0.80  0.81  0.81  5000
uy  0.72  0.70  0.71  5000
ve  0.79  0.76  0.78  5000
avg / total  0.82  0.82  0.82  76653

Reducing feature vectors.

(356449, 12614) (76653, 12614) (356449, 12582) (76653, 12582)

STARTING UNMASKING ROUND 75
spa twitter cxg2 75

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.86  0.89  0.88  5000
co  0.80  0.74  0.77  4853
cr  0.81  0.76  0.78  4967
cu  0.93  0.91  0.92  3503
ec  0.89  0.93  0.91  5000
es  0.82  0.87  0.84  5000
gt  0.79  0.81  0.80  5000
hn  0.81  0.76  0.79  3780
mx  0.75  0.77  0.76  5000
ni  0.91  0.92  0.91  4550
pa  0.81  0.82  0.81  5000
py  0.83  0.84  0.83  5000
sv  0.80  0.81  0.81  5000
uy  0.72  0.70  0.71  5000
ve  0.79  0.76  0.77  5000
avg / total  0.82  0.82  0.81  76653

Reducing feature vectors.
(356449, 12582) (76653, 12582) (356449, 12551) (76653, 12551)

STARTING UNMASKING ROUND 76
spa twitter cxg2 76

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.86  0.89  0.87  5000
co  0.80  0.75  0.77  4853
cr  0.81  0.76  0.79  4967
cu  0.92  0.90  0.91  3503
ec  0.89  0.93  0.91  5000
es  0.82  0.87  0.84  5000
gt  0.79  0.81  0.80  5000
hn  0.81  0.76  0.79  3780
mx  0.75  0.77  0.76  5000
ni  0.91  0.92  0.91  4550
pa  0.81  0.82  0.81  5000
py  0.82  0.84  0.83  5000
sv  0.80  0.81  0.80  5000
uy  0.72  0.70  0.71  5000
ve  0.79  0.75  0.77  5000
avg / total  0.81  0.81  0.81  76653

Reducing feature vectors.

(356449, 12551) (76653, 12551) (356449, 12521) (76653, 12521)

STARTING UNMASKING ROUND 77
spa twitter cxg2 77

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.86  0.88  0.87  5000
co  0.80  0.75  0.77  4853
cr  0.81  0.76  0.78  4967
cu  0.92  0.90  0.91  3503
ec  0.89  0.93  0.91  5000
es  0.82  0.87  0.84  5000
gt  0.79  0.81  0.80  5000
hn  0.81  0.76  0.78  3780
mx  0.75  0.76  0.76  5000
ni  0.90  0.92  0.91  4550
pa  0.81  0.82  0.81  5000
py  0.82  0.84  0.83  5000
sv  0.80  0.80  0.80  5000
uy  0.72  0.70  0.71  5000
ve  0.79  0.75  0.77  5000
avg / total  0.81  0.81  0.81  76653

Reducing feature vectors.

(356449, 12521) (76653, 12521) (356449, 12491) (76653, 12491)

STARTING UNMASKING ROUND 78
spa twitter cxg2 78

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.86  0.89  0.87  5000
co  0.80  0.74  0.77  4853
cr  0.81  0.76  0.78  4967
cu  0.93  0.90  0.91  3503
ec  0.89  0.93  0.91  5000
es  0.82  0.87  0.84  5000
gt  0.78  0.80  0.79  5000
hn  0.81  0.76  0.78  3780
mx  0.75  0.76  0.75  5000
ni  0.90  0.92  0.91  4550
pa  0.80  0.81  0.81  5000
py  0.82  0.83  0.83  5000
sv  0.80  0.80  0.80  5000
uy  0.72  0.70  0.71  5000
ve  0.79  0.75  0.77  5000
avg / total  0.81  0.81  0.81  76653

Reducing feature vectors.
(356449, 12491) (76653, 12491) (356449, 12459) (76653, 12459)

STARTING UNMASKING ROUND 79
spa twitter cxg2 79

class  precision  recall  f1-score  support
ar  0.75  0.79  0.77  5000
cl  0.86  0.88  0.87  5000
co  0.80  0.74  0.77  4853
cr  0.81  0.76  0.78  4967
cu  0.93  0.90  0.91  3503
ec  0.89  0.93  0.91  5000
es  0.82  0.87  0.84  5000
gt  0.79  0.80  0.79  5000
hn  0.81  0.76  0.78  3780
mx  0.75  0.76  0.76  5000
ni  0.90  0.92  0.91  4550
pa  0.80  0.81  0.81  5000
py  0.83  0.83  0.83  5000
sv  0.80  0.80  0.80  5000
uy  0.72  0.70  0.71  5000
ve  0.79  0.75  0.77  5000
avg / total  0.81  0.81  0.81  76653

Reducing feature vectors.

(356449, 12459) (76653, 12459) (356449, 12433) (76653, 12433)

STARTING UNMASKING ROUND 80
spa twitter cxg2 80

class  precision  recall  f1-score  support
ar  0.74  0.78  0.76  5000
cl  0.86  0.88  0.87  5000
co  0.80  0.74  0.77  4853
cr  0.81  0.76  0.78  4967
cu  0.92  0.90  0.91  3503
ec  0.89  0.92  0.91  5000
es  0.82  0.86  0.84  5000
gt  0.78  0.80  0.79  5000
hn  0.81  0.76  0.78  3780
mx  0.75  0.76  0.76  5000
ni  0.90  0.92  0.91  4550
pa  0.80  0.81  0.81  5000
py  0.82  0.83  0.83  5000
sv  0.80  0.80  0.80  5000
uy  0.71  0.70  0.71  5000
ve  0.79  0.75  0.77  5000
avg / total  0.81  0.81  0.81  76653

Reducing feature vectors.

(356449, 12433) (76653, 12433) (356449, 12402) (76653, 12402)

STARTING UNMASKING ROUND 81
spa twitter cxg2 81

class  precision  recall  f1-score  support
ar  0.74  0.78  0.76  5000
cl  0.86  0.88  0.87  5000
co  0.80  0.74  0.77  4853
cr  0.81  0.76  0.78  4967
cu  0.92  0.90  0.91  3503
ec  0.89  0.92  0.90  5000
es  0.81  0.86  0.84  5000
gt  0.78  0.80  0.79  5000
hn  0.81  0.76  0.78  3780
mx  0.74  0.76  0.75  5000
ni  0.90  0.92  0.91  4550
pa  0.80  0.81  0.81  5000
py  0.82  0.83  0.83  5000
sv  0.79  0.80  0.80  5000
uy  0.72  0.70  0.71  5000
ve  0.79  0.75  0.77  5000
avg / total  0.81  0.81  0.81  76653

Reducing feature vectors.
(356449, 12402) (76653, 12402) (356449, 12372) (76653, 12372)

STARTING UNMASKING ROUND 82
spa twitter cxg2 82

class  precision  recall  f1-score  support
ar  0.74  0.78  0.76  5000
cl  0.85  0.88  0.87  5000
co  0.79  0.74  0.76  4853
cr  0.81  0.75  0.78  4967
cu  0.92  0.90  0.91  3503
ec  0.89  0.92  0.90  5000
es  0.81  0.86  0.83  5000
gt  0.78  0.80  0.79  5000
hn  0.81  0.75  0.78  3780
mx  0.74  0.75  0.75  5000
ni  0.90  0.92  0.91  4550
pa  0.80  0.81  0.80  5000
py  0.82  0.83  0.83  5000
sv  0.79  0.80  0.79  5000
uy  0.72  0.70  0.71  5000
ve  0.78  0.75  0.77  5000
avg / total  0.81  0.81  0.81  76653

Reducing feature vectors.

(356449, 12372) (76653, 12372) (356449, 12340) (76653, 12340)

STARTING UNMASKING ROUND 83
spa twitter cxg2 83

class  precision  recall  f1-score  support
ar  0.74  0.78  0.76  5000
cl  0.85  0.88  0.87  5000
co  0.79  0.73  0.76  4853
cr  0.81  0.75  0.78  4967
cu  0.92  0.90  0.91  3503
ec  0.89  0.92  0.90  5000
es  0.81  0.86  0.83  5000
gt  0.78  0.80  0.79  5000
hn  0.81  0.75  0.78  3780
mx  0.74  0.75  0.75  5000
ni  0.90  0.92  0.91  4550
pa  0.80  0.81  0.80  5000
py  0.82  0.83  0.83  5000
sv  0.79  0.80  0.79  5000
uy  0.72  0.70  0.71  5000
ve  0.78  0.75  0.76  5000
avg / total  0.81  0.81  0.81  76653

Reducing feature vectors.

(356449, 12340) (76653, 12340) (356449, 12309) (76653, 12309)

STARTING UNMASKING ROUND 84
spa twitter cxg2 84

class  precision  recall  f1-score  support
ar  0.74  0.78  0.76  5000
cl  0.85  0.88  0.87  5000
co  0.79  0.73  0.76  4853
cr  0.81  0.75  0.78  4967
cu  0.92  0.90  0.91  3503
ec  0.89  0.92  0.90  5000
es  0.81  0.86  0.83  5000
gt  0.78  0.80  0.79  5000
hn  0.80  0.74  0.77  3780
mx  0.74  0.75  0.75  5000
ni  0.90  0.92  0.91  4550
pa  0.80  0.81  0.80  5000
py  0.82  0.83  0.82  5000
sv  0.79  0.80  0.79  5000
uy  0.72  0.70  0.71  5000
ve  0.78  0.74  0.76  5000
avg / total  0.81  0.81  0.81  76653

Reducing feature vectors.
(356449, 12309) (76653, 12309) (356449, 12277) (76653, 12277)

STARTING UNMASKING ROUND 85
spa twitter cxg2 85

class  precision  recall  f1-score  support
ar  0.74  0.78  0.76  5000
cl  0.85  0.88  0.86  5000
co  0.79  0.73  0.76  4853
cr  0.81  0.75  0.78  4967
cu  0.92  0.90  0.91  3503
ec  0.89  0.92  0.90  5000
es  0.81  0.85  0.83  5000
gt  0.78  0.80  0.79  5000
hn  0.80  0.74  0.77  3780
mx  0.74  0.75  0.75  5000
ni  0.90  0.92  0.91  4550
pa  0.79  0.81  0.80  5000
py  0.82  0.83  0.82  5000
sv  0.79  0.79  0.79  5000
uy  0.72  0.70  0.71  5000
ve  0.78  0.74  0.76  5000
avg / total  0.80  0.81  0.80  76653

Reducing feature vectors.

(356449, 12277) (76653, 12277) (356449, 12248) (76653, 12248)

STARTING UNMASKING ROUND 86
spa twitter cxg2 86

class  precision  recall  f1-score  support
ar  0.74  0.78  0.76  5000
cl  0.85  0.88  0.86  5000
co  0.79  0.73  0.76  4853
cr  0.80  0.75  0.78  4967
cu  0.92  0.90  0.91  3503
ec  0.88  0.92  0.90  5000
es  0.81  0.86  0.83  5000
gt  0.78  0.80  0.79  5000
hn  0.80  0.74  0.77  3780
mx  0.74  0.75  0.74  5000
ni  0.90  0.92  0.91  4550
pa  0.79  0.81  0.80  5000
py  0.82  0.83  0.82  5000
sv  0.79  0.79  0.79  5000
uy  0.72  0.70  0.71  5000
ve  0.78  0.74  0.76  5000
avg / total  0.80  0.80  0.80  76653

Reducing feature vectors.

(356449, 12248) (76653, 12248) (356449, 12217) (76653, 12217)

STARTING UNMASKING ROUND 87
spa twitter cxg2 87

class  precision  recall  f1-score  support
ar  0.74  0.78  0.76  5000
cl  0.85  0.88  0.86  5000
co  0.79  0.73  0.76  4853
cr  0.81  0.75  0.78  4967
cu  0.92  0.90  0.91  3503
ec  0.88  0.92  0.90  5000
es  0.80  0.85  0.83  5000
gt  0.77  0.80  0.78  5000
hn  0.80  0.74  0.77  3780
mx  0.74  0.75  0.74  5000
ni  0.90  0.92  0.91  4550
pa  0.79  0.80  0.80  5000
py  0.82  0.83  0.82  5000
sv  0.79  0.79  0.79  5000
uy  0.71  0.69  0.70  5000
ve  0.78  0.74  0.76  5000
avg / total  0.80  0.80  0.80  76653

Reducing feature vectors.
(356449, 12217) (76653, 12217) (356449, 12186) (76653, 12186)

STARTING UNMASKING ROUND 88
spa twitter cxg2 88

class  precision  recall  f1-score  support
ar  0.74  0.78  0.76  5000
cl  0.85  0.88  0.86  5000
co  0.79  0.73  0.76  4853
cr  0.80  0.75  0.77  4967
cu  0.92  0.90  0.91  3503
ec  0.88  0.92  0.90  5000
es  0.80  0.85  0.83  5000
gt  0.77  0.79  0.78  5000
hn  0.80  0.74  0.77  3780
mx  0.74  0.75  0.74  5000
ni  0.89  0.92  0.91  4550
pa  0.79  0.80  0.80  5000
py  0.81  0.83  0.82  5000
sv  0.78  0.79  0.79  5000
uy  0.72  0.69  0.70  5000
ve  0.78  0.74  0.76  5000
avg / total  0.80  0.80  0.80  76653

Reducing feature vectors.

(356449, 12186) (76653, 12186) (356449, 12155) (76653, 12155)

STARTING UNMASKING ROUND 89
spa twitter cxg2 89

class  precision  recall  f1-score  support
ar  0.74  0.78  0.76  5000
cl  0.85  0.88  0.86  5000
co  0.79  0.72  0.75  4853
cr  0.80  0.75  0.77  4967
cu  0.92  0.90  0.91  3503
ec  0.88  0.91  0.90  5000
es  0.80  0.85  0.83  5000
gt  0.77  0.79  0.78  5000
hn  0.80  0.73  0.76  3780
mx  0.74  0.75  0.74  5000
ni  0.89  0.92  0.90  4550
pa  0.79  0.80  0.80  5000
py  0.81  0.83  0.82  5000
sv  0.78  0.79  0.79  5000
uy  0.71  0.69  0.70  5000
ve  0.78  0.74  0.76  5000
avg / total  0.80  0.80  0.80  76653

Reducing feature vectors.

(356449, 12155) (76653, 12155) (356449, 12123) (76653, 12123)

STARTING UNMASKING ROUND 90
spa twitter cxg2 90

class  precision  recall  f1-score  support
ar  0.74  0.78  0.76  5000
cl  0.85  0.88  0.86  5000
co  0.79  0.72  0.75  4853
cr  0.80  0.74  0.77  4967
cu  0.92  0.90  0.91  3503
ec  0.88  0.91  0.90  5000
es  0.80  0.85  0.83  5000
gt  0.77  0.79  0.78  5000
hn  0.80  0.73  0.76  3780
mx  0.74  0.75  0.74  5000
ni  0.89  0.92  0.90  4550
pa  0.79  0.80  0.79  5000
py  0.81  0.83  0.82  5000
sv  0.78  0.79  0.78  5000
uy  0.71  0.69  0.70  5000
ve  0.78  0.74  0.76  5000
avg / total  0.80  0.80  0.80  76653

Reducing feature vectors.
(356449, 12123) (76653, 12123) (356449, 12091) (76653, 12091)

STARTING UNMASKING ROUND 91
spa twitter cxg2 91

class  precision  recall  f1-score  support
ar  0.73  0.78  0.76  5000
cl  0.84  0.88  0.86  5000
co  0.79  0.72  0.75  4853
cr  0.80  0.74  0.77  4967
cu  0.92  0.89  0.91  3503
ec  0.88  0.92  0.90  5000
es  0.80  0.86  0.83  5000
gt  0.77  0.79  0.78  5000
hn  0.80  0.73  0.76  3780
mx  0.74  0.74  0.74  5000
ni  0.89  0.92  0.90  4550
pa  0.79  0.80  0.79  5000
py  0.81  0.83  0.82  5000
sv  0.78  0.79  0.78  5000
uy  0.71  0.69  0.70  5000
ve  0.78  0.74  0.76  5000
avg / total  0.80  0.80  0.80  76653

Reducing feature vectors.

(356449, 12091) (76653, 12091) (356449, 12059) (76653, 12059)

STARTING UNMASKING ROUND 92
spa twitter cxg2 92

class  precision  recall  f1-score  support
ar  0.74  0.78  0.75  5000
cl  0.84  0.88  0.86  5000
co  0.79  0.72  0.75  4853
cr  0.80  0.74  0.77  4967
cu  0.92  0.89  0.91  3503
ec  0.88  0.91  0.90  5000
es  0.80  0.85  0.83  5000
gt  0.77  0.79  0.78  5000
hn  0.80  0.73  0.76  3780
mx  0.74  0.75  0.74  5000
ni  0.89  0.92  0.90  4550
pa  0.79  0.80  0.80  5000
py  0.81  0.82  0.82  5000
sv  0.78  0.79  0.78  5000
uy  0.71  0.69  0.70  5000
ve  0.77  0.74  0.76  5000
avg / total  0.80  0.80  0.80  76653

Reducing feature vectors.

(356449, 12059) (76653, 12059) (356449, 12027) (76653, 12027)

STARTING UNMASKING ROUND 93
spa twitter cxg2 93

class  precision  recall  f1-score  support
ar  0.73  0.78  0.75  5000
cl  0.84  0.88  0.86  5000
co  0.78  0.72  0.75  4853
cr  0.80  0.74  0.77  4967
cu  0.92  0.89  0.90  3503
ec  0.88  0.92  0.90  5000
es  0.80  0.85  0.82  5000
gt  0.77  0.79  0.78  5000
hn  0.79  0.73  0.76  3780
mx  0.74  0.75  0.74  5000
ni  0.89  0.91  0.90  4550
pa  0.79  0.80  0.79  5000
py  0.81  0.82  0.82  5000
sv  0.78  0.79  0.78  5000
uy  0.71  0.69  0.70  5000
ve  0.77  0.74  0.76  5000
avg / total  0.80  0.80  0.80  76653

Reducing feature vectors.
(356449, 12027) (76653, 12027) (356449, 11996) (76653, 11996)

STARTING UNMASKING ROUND 94
spa twitter cxg2 94

class  precision  recall  f1-score  support
ar  0.74  0.77  0.75  5000
cl  0.84  0.87  0.86  5000
co  0.78  0.72  0.75  4853
cr  0.80  0.74  0.77  4967
cu  0.92  0.89  0.90  3503
ec  0.88  0.92  0.90  5000
es  0.80  0.85  0.82  5000
gt  0.77  0.79  0.78  5000
hn  0.79  0.73  0.76  3780
mx  0.73  0.75  0.74  5000
ni  0.89  0.91  0.90  4550
pa  0.79  0.80  0.79  5000
py  0.81  0.82  0.82  5000
sv  0.78  0.78  0.78  5000
uy  0.71  0.69  0.70  5000
ve  0.77  0.74  0.76  5000
avg / total  0.80  0.80  0.80  76653

Reducing feature vectors.

(356449, 11996) (76653, 11996) (356449, 11964) (76653, 11964)

STARTING UNMASKING ROUND 95
spa twitter cxg2 95

class  precision  recall  f1-score  support
ar  0.74  0.77  0.75  5000
cl  0.84  0.88  0.86  5000
co  0.78  0.71  0.75  4853
cr  0.80  0.74  0.77  4967
cu  0.92  0.89  0.90  3503
ec  0.88  0.91  0.89  5000
es  0.80  0.85  0.82  5000
gt  0.77  0.78  0.78  5000
hn  0.79  0.73  0.76  3780
mx  0.73  0.74  0.74  5000
ni  0.89  0.91  0.90  4550
pa  0.78  0.79  0.79  5000
py  0.81  0.82  0.82  5000
sv  0.77  0.79  0.78  5000
uy  0.71  0.69  0.70  5000
ve  0.77  0.74  0.76  5000
avg / total  0.80  0.80  0.80  76653

Reducing feature vectors.

(356449, 11964) (76653, 11964) (356449, 11932) (76653, 11932)

STARTING UNMASKING ROUND 96
spa twitter cxg2 96

class  precision  recall  f1-score  support
ar  0.73  0.78  0.75  5000
cl  0.84  0.87  0.86  5000
co  0.78  0.71  0.75  4853
cr  0.80  0.74  0.77  4967
cu  0.91  0.89  0.90  3503
ec  0.87  0.91  0.89  5000
es  0.80  0.85  0.82  5000
gt  0.76  0.78  0.77  5000
hn  0.79  0.73  0.76  3780
mx  0.73  0.74  0.74  5000
ni  0.89  0.91  0.90  4550
pa  0.78  0.79  0.79  5000
py  0.81  0.82  0.82  5000
sv  0.77  0.79  0.78  5000
uy  0.71  0.68  0.70  5000
ve  0.77  0.74  0.76  5000
avg / total  0.80  0.80  0.79  76653

Reducing feature vectors.
(356449, 11932) (76653, 11932) (356449, 11901) (76653, 11901)

STARTING UNMASKING ROUND 97
spa twitter cxg2 97

class  precision  recall  f1-score  support
ar  0.73  0.77  0.75  5000
cl  0.84  0.87  0.86  5000
co  0.78  0.71  0.75  4853
cr  0.80  0.74  0.77  4967
cu  0.91  0.89  0.90  3503
ec  0.87  0.91  0.89  5000
es  0.80  0.85  0.82  5000
gt  0.76  0.78  0.77  5000
hn  0.79  0.73  0.76  3780
mx  0.73  0.74  0.74  5000
ni  0.89  0.91  0.90  4550
pa  0.79  0.80  0.79  5000
py  0.81  0.82  0.81  5000
sv  0.77  0.78  0.78  5000
uy  0.71  0.69  0.70  5000
ve  0.78  0.74  0.76  5000
avg / total  0.79  0.79  0.79  76653

Reducing feature vectors.

(356449, 11901) (76653, 11901) (356449, 11869) (76653, 11869)

STARTING UNMASKING ROUND 98
spa twitter cxg2 98

class  precision  recall  f1-score  support
ar  0.73  0.78  0.75  5000
cl  0.84  0.87  0.85  5000
co  0.78  0.71  0.74  4853
cr  0.80  0.74  0.77  4967
cu  0.91  0.89  0.90  3503
ec  0.87  0.91  0.89  5000
es  0.79  0.85  0.82  5000
gt  0.76  0.78  0.77  5000
hn  0.79  0.72  0.76  3780
mx  0.73  0.74  0.74  5000
ni  0.89  0.91  0.90  4550
pa  0.78  0.79  0.79  5000
py  0.81  0.82  0.81  5000
sv  0.77  0.78  0.78  5000
uy  0.71  0.69  0.70  5000
ve  0.77  0.74  0.75  5000
avg / total  0.79  0.79  0.79  76653

Reducing feature vectors.

(356449, 11869) (76653, 11869) (356449, 11837) (76653, 11837)

STARTING UNMASKING ROUND 99
spa twitter cxg2 99

class  precision  recall  f1-score  support
ar  0.73  0.77  0.75  5000
cl  0.84  0.88  0.86  5000
co  0.78  0.71  0.74  4853
cr  0.79  0.73  0.76  4967
cu  0.92  0.89  0.90  3503
ec  0.87  0.91  0.89  5000
es  0.79  0.85  0.82  5000
gt  0.76  0.78  0.77  5000
hn  0.79  0.72  0.76  3780
mx  0.73  0.74  0.73  5000
ni  0.89  0.91  0.90  4550
pa  0.78  0.79  0.79  5000
py  0.81  0.82  0.81  5000
sv  0.77  0.78  0.78  5000
uy  0.70  0.69  0.70  5000
ve  0.77  0.74  0.75  5000
avg / total  0.79  0.79  0.79  76653

Reducing feature vectors.
(356449, 11837) (76653, 11837) (356449, 11805) (76653, 11805)

STARTING UNMASKING ROUND 100
spa twitter cxg2 100

class  precision  recall  f1-score  support
ar  0.73  0.77  0.75  5000
cl  0.84  0.87  0.86  5000
co  0.78  0.71  0.74  4853
cr  0.79  0.73  0.76  4967
cu  0.91  0.89  0.90  3503
ec  0.87  0.91  0.89  5000
es  0.79  0.85  0.82  5000
gt  0.76  0.78  0.77  5000
hn  0.79  0.72  0.75  3780
mx  0.73  0.74  0.73  5000
ni  0.89  0.91  0.90  4550
pa  0.78  0.79  0.78  5000
py  0.81  0.82  0.81  5000
sv  0.77  0.78  0.78  5000
uy  0.70  0.68  0.69  5000
ve  0.77  0.74  0.75  5000
avg / total  0.79  0.79  0.79  76653

Reducing feature vectors.

(356449, 11805) (76653, 11805) (356449, 11773) (76653, 11773)
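The loop recorded in this log alternates two steps: train and evaluate a linear SVM with the settings shown at the top ({'C': 0.01, 'loss': 'squared_hinge'}), then mask the most predictive features before the next round (the shape tuples show roughly 30 columns dropped per round). Below is a minimal sketch of one such round, assuming scikit-learn; the function name `unmasking_round`, the removal rule (highest-magnitude weights per class), and `n_remove_per_class` are assumptions for illustration, not taken from the log.

```python
# Hypothetical sketch of one unmasking round; the removal rule and
# helper names are assumptions, only the SVM settings come from the log.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import classification_report

def unmasking_round(train_x, train_y, test_x, test_y, n_remove_per_class=1):
    # Train with the hyperparameters printed at the start of the log.
    clf = LinearSVC(C=0.01, loss="squared_hinge")
    clf.fit(train_x, train_y)
    print(classification_report(test_y, clf.predict(test_x)))

    # Mask the most predictive features: for each class (row of coef_),
    # drop the columns with the largest absolute weights.
    to_remove = set()
    for weights in clf.coef_:
        top = np.argsort(np.abs(weights))[-n_remove_per_class:]
        to_remove.update(int(i) for i in top)
    keep = [i for i in range(train_x.shape[1]) if i not in to_remove]

    print("Reducing feature vectors.")
    print(train_x.shape, test_x.shape,
          train_x[:, keep].shape, test_x[:, keep].shape)
    return train_x[:, keep], test_x[:, keep]
```

Repeating this round by round degrades accuracy gradually, which is the point of unmasking: the slower the curve drops, the deeper the signal separating the classes.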