UNMASKING LOG: spa / cc / cxg2
Starting cxg2 and cc. (34000, 14846) 34000 {'C': 0.01, 'loss': 'squared_hinge'}

Each ROUND line below gives per-class precision/recall/F1 on the held-out set (support: 5000 per class, except ni with 4093; 84093 total), followed by the support-weighted average ("avg / total"). F is the number of features in that round; after every round the log prints "Reducing feature vectors." and F shrinks (14846 at round 1 down to 12870 at round 63). Train matrix: (416372, F); test matrix: (84093, F).

ROUND 1 (F=14846)  ar 0.94/0.94/0.94  cl 0.99/0.98/0.98  co 0.95/0.94/0.95  cr 1.00/1.00/1.00  cu 0.96/0.97/0.97  ec 0.96/0.96/0.96  es 0.94/0.95/0.94  gt 0.96/0.96/0.96  hn 0.93/0.94/0.94  mx 0.94/0.93/0.93  ni 0.92/0.86/0.89  pa 0.98/0.98/0.98  pe 0.94/0.92/0.93  py 0.94/0.96/0.95  sv 0.95/0.94/0.95  uy 0.91/0.93/0.92  ve 0.97/0.98/0.98  | avg 0.95/0.95/0.95
ROUND 2 (F=14823)  ar 0.93/0.93/0.93  cl 0.99/0.98/0.98  co 0.95/0.94/0.94  cr 1.00/1.00/1.00  cu 0.96/0.97/0.96  ec 0.95/0.95/0.95  es 0.94/0.95/0.94  gt 0.95/0.95/0.95  hn 0.92/0.93/0.93  mx 0.94/0.92/0.93  ni 0.89/0.82/0.85  pa 0.97/0.97/0.97  pe 0.92/0.91/0.92  py 0.92/0.95/0.93  sv 0.94/0.94/0.94  uy 0.89/0.91/0.90  ve 0.96/0.98/0.97  | avg 0.94/0.94/0.94
ROUND 3 (F=14795)  ar 0.92/0.93/0.92  cl 0.98/0.97/0.98  co 0.94/0.92/0.93  cr 1.00/1.00/1.00  cu 0.95/0.97/0.96  ec 0.94/0.94/0.94  es 0.94/0.95/0.94  gt 0.94/0.94/0.94  hn 0.90/0.91/0.90  mx 0.93/0.92/0.92  ni 0.88/0.80/0.84  pa 0.96/0.97/0.96  pe 0.92/0.91/0.91  py 0.92/0.94/0.93  sv 0.93/0.93/0.93  uy 0.87/0.90/0.89  ve 0.96/0.97/0.97  | avg 0.93/0.93/0.93
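The loop recorded above (train, report, prune, repeat) can be sketched as follows. This is a hypothetical reconstruction, not the original pipeline: the classifier is assumed to be scikit-learn's LinearSVC (the logged params C=0.01, loss='squared_hinge' match its signature), and the pruning criterion (drop the highest-weighted features, roughly 23 to 34 per round in the log) is an assumption.

```python
# Hypothetical sketch of one unmasking round. Assumed, not from the original
# code: LinearSVC as the classifier, and max-absolute-weight as the pruning
# criterion that shrinks the feature count each round (e.g. 14846 -> 14823).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import classification_report

def unmasking_round(train_x, train_y, test_x, test_y, n_remove=30):
    """Train, print the report, then mask the most informative features."""
    clf = LinearSVC(C=0.01, loss="squared_hinge")
    clf.fit(train_x, train_y)
    print(classification_report(test_y, clf.predict(test_x)))
    # Drop the n_remove features with the largest absolute weight in any class.
    importance = np.abs(clf.coef_).max(axis=0)
    keep = np.argsort(importance)[:-n_remove]
    keep.sort()  # preserve original column order
    return train_x[:, keep], test_x[:, keep]
```

Calling this in a loop, feeding each round's reduced matrices back in, reproduces the shrinking column counts seen in the log.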
ROUND 4 (F=14766)  ar 0.92/0.92/0.92  cl 0.98/0.97/0.98  co 0.93/0.92/0.93  cr 1.00/1.00/1.00  cu 0.95/0.97/0.96  ec 0.92/0.93/0.93  es 0.93/0.95/0.94  gt 0.94/0.94/0.94  hn 0.89/0.90/0.90  mx 0.93/0.91/0.92  ni 0.88/0.80/0.83  pa 0.95/0.96/0.96  pe 0.90/0.90/0.90  py 0.91/0.94/0.92  sv 0.92/0.93/0.93  uy 0.87/0.89/0.88  ve 0.96/0.97/0.96  | avg 0.93/0.93/0.93
ROUND 5 (F=14737)  ar 0.92/0.91/0.92  cl 0.98/0.97/0.98  co 0.93/0.92/0.92  cr 1.00/1.00/1.00  cu 0.94/0.96/0.95  ec 0.92/0.92/0.92  es 0.93/0.94/0.94  gt 0.93/0.93/0.93  hn 0.89/0.90/0.90  mx 0.93/0.91/0.92  ni 0.87/0.79/0.83  pa 0.95/0.96/0.95  pe 0.90/0.89/0.90  py 0.91/0.93/0.92  sv 0.92/0.93/0.93  uy 0.86/0.89/0.88  ve 0.96/0.97/0.96  | avg 0.93/0.93/0.93
ROUND 6 (F=14707)  ar 0.91/0.91/0.91  cl 0.98/0.97/0.98  co 0.93/0.91/0.92  cr 1.00/1.00/1.00  cu 0.94/0.96/0.95  ec 0.92/0.92/0.92  es 0.93/0.94/0.94  gt 0.93/0.93/0.93  hn 0.89/0.89/0.89  mx 0.92/0.90/0.91  ni 0.87/0.79/0.82  pa 0.95/0.96/0.95  pe 0.90/0.89/0.89  py 0.91/0.93/0.92  sv 0.92/0.93/0.92  uy 0.86/0.89/0.88  ve 0.95/0.96/0.96  | avg 0.92/0.92/0.92
ROUND 7 (F=14674)  ar 0.91/0.91/0.91  cl 0.98/0.97/0.97  co 0.92/0.91/0.92  cr 1.00/1.00/1.00  cu 0.94/0.96/0.95  ec 0.91/0.92/0.92  es 0.93/0.94/0.94  gt 0.93/0.93/0.93  hn 0.89/0.89/0.89  mx 0.92/0.90/0.91  ni 0.87/0.78/0.82  pa 0.95/0.96/0.95  pe 0.89/0.89/0.89  py 0.91/0.93/0.92  sv 0.92/0.93/0.92  uy 0.86/0.88/0.87  ve 0.95/0.96/0.96  | avg 0.92/0.92/0.92
ROUND 8 (F=14642)  ar 0.91/0.91/0.91  cl 0.98/0.96/0.97  co 0.92/0.90/0.91  cr 1.00/1.00/1.00  cu 0.93/0.96/0.94  ec 0.91/0.91/0.91  es 0.93/0.94/0.93  gt 0.93/0.93/0.93  hn 0.88/0.89/0.89  mx 0.92/0.90/0.91  ni 0.86/0.77/0.81  pa 0.94/0.95/0.95  pe 0.89/0.88/0.89  py 0.90/0.93/0.92  sv 0.92/0.93/0.92  uy 0.86/0.88/0.87  ve 0.95/0.96/0.95  | avg 0.92/0.92/0.92
ROUND 9 (F=14612)  ar 0.91/0.91/0.91  cl 0.98/0.96/0.97  co 0.91/0.90/0.91  cr 1.00/1.00/1.00  cu 0.93/0.96/0.94  ec 0.91/0.91/0.91  es 0.93/0.94/0.93  gt 0.92/0.93/0.93  hn 0.88/0.89/0.88  mx 0.91/0.90/0.91  ni 0.86/0.76/0.81  pa 0.94/0.95/0.94  pe 0.89/0.88/0.88  py 0.90/0.93/0.91  sv 0.91/0.92/0.92  uy 0.85/0.88/0.87  ve 0.95/0.96/0.95  | avg 0.92/0.92/0.92
ROUND 10 (F=14583)  ar 0.90/0.91/0.90  cl 0.98/0.96/0.97  co 0.90/0.89/0.90  cr 1.00/1.00/1.00  cu 0.93/0.95/0.94  ec 0.90/0.91/0.90  es 0.93/0.94/0.93  gt 0.92/0.93/0.92  hn 0.87/0.88/0.88  mx 0.91/0.89/0.90  ni 0.85/0.75/0.80  pa 0.94/0.95/0.94  pe 0.88/0.88/0.88  py 0.90/0.92/0.91  sv 0.91/0.92/0.91  uy 0.85/0.88/0.86  ve 0.94/0.95/0.95  | avg 0.91/0.91/0.91
ROUND 11 (F=14552)  ar 0.90/0.90/0.90  cl 0.98/0.96/0.97  co 0.90/0.88/0.89  cr 1.00/1.00/1.00  cu 0.93/0.95/0.94  ec 0.90/0.90/0.90  es 0.93/0.93/0.93  gt 0.92/0.92/0.92  hn 0.87/0.88/0.88  mx 0.91/0.89/0.90  ni 0.84/0.74/0.79  pa 0.94/0.95/0.94  pe 0.88/0.87/0.88  py 0.90/0.92/0.91  sv 0.90/0.92/0.91  uy 0.85/0.87/0.86  ve 0.94/0.95/0.94  | avg 0.91/0.91/0.91
ROUND 12 (F=14523)  ar 0.90/0.90/0.90  cl 0.97/0.96/0.97  co 0.90/0.88/0.89  cr 1.00/1.00/1.00  cu 0.93/0.95/0.94  ec 0.90/0.90/0.90  es 0.92/0.93/0.93  gt 0.90/0.92/0.91  hn 0.88/0.87/0.87  mx 0.91/0.89/0.90  ni 0.84/0.74/0.79  pa 0.93/0.95/0.94  pe 0.87/0.87/0.87  py 0.89/0.92/0.91  sv 0.90/0.92/0.91  uy 0.84/0.87/0.86  ve 0.93/0.95/0.94  | avg 0.91/0.91/0.91
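The "avg / total" row in each report is the support-weighted mean of the per-class scores, so the weak ni class (support 4093) pulls the average down slightly less than a full 5000-document class would. A quick check using the round-10 per-class F1 scores from the log:

```python
# Recompute the "avg / total" F1 as a support-weighted mean.
# Per-class F1 values below are the round-10 scores from the log.
f1 = {"ar": 0.90, "cl": 0.97, "co": 0.90, "cr": 1.00, "cu": 0.94, "ec": 0.90,
      "es": 0.93, "gt": 0.92, "hn": 0.88, "mx": 0.90, "ni": 0.80, "pa": 0.94,
      "pe": 0.88, "py": 0.91, "sv": 0.91, "uy": 0.86, "ve": 0.95}
support = {c: 5000 for c in f1}
support["ni"] = 4093  # ni has fewer test documents than the other classes

weighted_f1 = sum(f1[c] * support[c] for c in f1) / sum(support.values())
print(round(weighted_f1, 2))  # matches the logged round-10 average of 0.91
```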
ROUND 13 (F=14493)  ar 0.89/0.90/0.90  cl 0.97/0.96/0.97  co 0.89/0.87/0.88  cr 1.00/1.00/1.00  cu 0.93/0.95/0.94  ec 0.89/0.90/0.89  es 0.92/0.93/0.93  gt 0.90/0.91/0.91  hn 0.87/0.87/0.87  mx 0.90/0.88/0.89  ni 0.83/0.73/0.78  pa 0.93/0.94/0.94  pe 0.87/0.86/0.87  py 0.89/0.92/0.90  sv 0.89/0.91/0.90  uy 0.84/0.86/0.85  ve 0.93/0.94/0.94  | avg 0.90/0.90/0.90
ROUND 14 (F=14459)  ar 0.89/0.90/0.89  cl 0.97/0.96/0.96  co 0.89/0.87/0.88  cr 1.00/1.00/1.00  cu 0.93/0.95/0.94  ec 0.89/0.90/0.89  es 0.92/0.93/0.93  gt 0.89/0.91/0.90  hn 0.87/0.86/0.86  mx 0.90/0.87/0.88  ni 0.83/0.73/0.78  pa 0.93/0.94/0.94  pe 0.87/0.86/0.86  py 0.89/0.91/0.90  sv 0.89/0.91/0.90  uy 0.84/0.86/0.85  ve 0.93/0.94/0.94  | avg 0.90/0.90/0.90
ROUND 15 (F=14428)  ar 0.89/0.89/0.89  cl 0.97/0.96/0.97  co 0.88/0.87/0.87  cr 1.00/1.00/1.00  cu 0.92/0.94/0.93  ec 0.89/0.89/0.89  es 0.92/0.93/0.92  gt 0.89/0.91/0.90  hn 0.86/0.86/0.86  mx 0.89/0.87/0.88  ni 0.83/0.72/0.77  pa 0.93/0.94/0.93  pe 0.87/0.86/0.86  py 0.89/0.91/0.90  sv 0.89/0.91/0.90  uy 0.84/0.86/0.85  ve 0.93/0.94/0.93  | avg 0.90/0.90/0.90
ROUND 16 (F=14396)  ar 0.89/0.89/0.89  cl 0.97/0.96/0.96  co 0.88/0.86/0.87  cr 1.00/1.00/1.00  cu 0.92/0.94/0.93  ec 0.88/0.89/0.89  es 0.92/0.93/0.92  gt 0.89/0.91/0.90  hn 0.85/0.85/0.85  mx 0.89/0.86/0.88  ni 0.82/0.71/0.76  pa 0.93/0.94/0.93  pe 0.86/0.86/0.86  py 0.88/0.91/0.90  sv 0.89/0.91/0.90  uy 0.84/0.85/0.84  ve 0.93/0.94/0.93  | avg 0.90/0.90/0.90
ROUND 17 (F=14366)  ar 0.88/0.89/0.89  cl 0.97/0.96/0.96  co 0.88/0.85/0.87  cr 1.00/1.00/1.00  cu 0.92/0.94/0.93  ec 0.88/0.89/0.89  es 0.91/0.93/0.92  gt 0.89/0.91/0.90  hn 0.85/0.85/0.85  mx 0.89/0.86/0.87  ni 0.82/0.71/0.76  pa 0.92/0.93/0.92  pe 0.86/0.85/0.86  py 0.88/0.91/0.89  sv 0.89/0.90/0.89  uy 0.83/0.85/0.84  ve 0.93/0.94/0.93  | avg 0.89/0.89/0.89
ROUND 18 (F=14333)  ar 0.88/0.89/0.89  cl 0.97/0.96/0.96  co 0.88/0.85/0.86  cr 1.00/1.00/1.00  cu 0.92/0.94/0.93  ec 0.88/0.89/0.88  es 0.91/0.93/0.92  gt 0.89/0.91/0.90  hn 0.84/0.85/0.85  mx 0.89/0.86/0.87  ni 0.81/0.71/0.76  pa 0.92/0.93/0.92  pe 0.86/0.85/0.85  py 0.88/0.91/0.89  sv 0.88/0.90/0.89  uy 0.83/0.85/0.84  ve 0.92/0.94/0.93  | avg 0.89/0.89/0.89
ROUND 19 (F=14301)  ar 0.88/0.89/0.88  cl 0.97/0.95/0.96  co 0.87/0.85/0.86  cr 1.00/1.00/1.00  cu 0.92/0.94/0.93  ec 0.88/0.89/0.88  es 0.91/0.93/0.92  gt 0.89/0.91/0.90  hn 0.84/0.85/0.84  mx 0.88/0.86/0.87  ni 0.81/0.70/0.75  pa 0.92/0.93/0.92  pe 0.86/0.85/0.85  py 0.88/0.90/0.89  sv 0.88/0.90/0.89  uy 0.83/0.85/0.84  ve 0.92/0.93/0.93  | avg 0.89/0.89/0.89
ROUND 20 (F=14269)  ar 0.88/0.89/0.88  cl 0.97/0.95/0.96  co 0.87/0.84/0.86  cr 1.00/1.00/1.00  cu 0.92/0.94/0.93  ec 0.88/0.89/0.88  es 0.91/0.93/0.92  gt 0.88/0.90/0.89  hn 0.84/0.85/0.85  mx 0.88/0.86/0.87  ni 0.81/0.70/0.75  pa 0.92/0.92/0.92  pe 0.85/0.85/0.85  py 0.87/0.90/0.89  sv 0.88/0.90/0.89  uy 0.83/0.84/0.84  ve 0.92/0.93/0.93  | avg 0.89/0.89/0.89
ROUND 21 (F=14237)  ar 0.88/0.88/0.88  cl 0.97/0.95/0.96  co 0.87/0.84/0.85  cr 1.00/1.00/1.00  cu 0.92/0.94/0.93  ec 0.88/0.89/0.88  es 0.91/0.93/0.92  gt 0.88/0.90/0.89  hn 0.84/0.84/0.84  mx 0.88/0.85/0.87  ni 0.81/0.69/0.75  pa 0.92/0.92/0.92  pe 0.85/0.85/0.85  py 0.87/0.90/0.89  sv 0.87/0.90/0.89  uy 0.83/0.84/0.83  ve 0.92/0.93/0.93  | avg 0.89/0.89/0.89
ROUND 22 (F=14203)  ar 0.87/0.88/0.88  cl 0.97/0.95/0.96  co 0.87/0.84/0.85  cr 1.00/1.00/1.00  cu 0.92/0.94/0.93  ec 0.87/0.88/0.88  es 0.91/0.93/0.92  gt 0.88/0.90/0.89  hn 0.84/0.84/0.84  mx 0.88/0.85/0.86  ni 0.81/0.69/0.75  pa 0.91/0.92/0.92  pe 0.85/0.84/0.84  py 0.87/0.90/0.88  sv 0.87/0.90/0.89  uy 0.83/0.84/0.83  ve 0.92/0.93/0.92  | avg 0.89/0.89/0.89
ROUND 23 (F=14172)  ar 0.87/0.88/0.88  cl 0.97/0.95/0.96  co 0.87/0.84/0.85  cr 1.00/1.00/1.00  cu 0.92/0.94/0.93  ec 0.87/0.88/0.88  es 0.91/0.93/0.92  gt 0.88/0.90/0.89  hn 0.84/0.84/0.84  mx 0.88/0.85/0.86  ni 0.81/0.69/0.74  pa 0.91/0.92/0.91  pe 0.85/0.84/0.84  py 0.87/0.90/0.88  sv 0.86/0.90/0.88  uy 0.82/0.84/0.83  ve 0.91/0.93/0.92  | avg 0.88/0.89/0.88
ROUND 24 (F=14142)  ar 0.87/0.88/0.88  cl 0.97/0.95/0.96  co 0.87/0.83/0.85  cr 1.00/1.00/1.00  cu 0.92/0.94/0.93  ec 0.87/0.88/0.88  es 0.91/0.93/0.92  gt 0.88/0.90/0.89  hn 0.84/0.84/0.84  mx 0.87/0.85/0.86  ni 0.81/0.69/0.74  pa 0.91/0.92/0.91  pe 0.84/0.84/0.84  py 0.87/0.90/0.88  sv 0.86/0.89/0.88  uy 0.82/0.84/0.83  ve 0.91/0.93/0.92  | avg 0.88/0.88/0.88
ROUND 25 (F=14108)  ar 0.87/0.88/0.88  cl 0.97/0.95/0.96  co 0.86/0.83/0.84  cr 1.00/1.00/1.00  cu 0.92/0.94/0.93  ec 0.87/0.88/0.87  es 0.91/0.92/0.91  gt 0.87/0.89/0.88  hn 0.83/0.83/0.83  mx 0.87/0.85/0.86  ni 0.80/0.68/0.74  pa 0.90/0.92/0.91  pe 0.84/0.84/0.84  py 0.86/0.89/0.88  sv 0.86/0.89/0.88  uy 0.82/0.84/0.83  ve 0.91/0.93/0.92  | avg 0.88/0.88/0.88
ROUND 26 (F=14080)  ar 0.87/0.88/0.88  cl 0.96/0.95/0.96  co 0.86/0.82/0.84  cr 1.00/1.00/1.00  cu 0.91/0.94/0.93  ec 0.86/0.88/0.87  es 0.91/0.92/0.91  gt 0.88/0.89/0.88  hn 0.83/0.83/0.83  mx 0.87/0.85/0.86  ni 0.80/0.68/0.74  pa 0.90/0.91/0.91  pe 0.84/0.84/0.84  py 0.86/0.89/0.88  sv 0.86/0.89/0.87  uy 0.82/0.84/0.83  ve 0.91/0.92/0.92  | avg 0.88/0.88/0.88
ROUND 27 (F=14050)  ar 0.87/0.88/0.87  cl 0.96/0.95/0.95  co 0.86/0.82/0.84  cr 1.00/1.00/1.00  cu 0.91/0.94/0.92  ec 0.86/0.88/0.87  es 0.91/0.92/0.91  gt 0.87/0.89/0.88  hn 0.83/0.83/0.83  mx 0.87/0.84/0.86  ni 0.80/0.68/0.74  pa 0.90/0.91/0.90  pe 0.84/0.84/0.84  py 0.86/0.89/0.88  sv 0.86/0.89/0.87  uy 0.82/0.83/0.83  ve 0.91/0.92/0.92  | avg 0.88/0.88/0.88
ROUND 28 (F=14019)  ar 0.87/0.88/0.87  cl 0.96/0.95/0.95  co 0.86/0.82/0.84  cr 1.00/1.00/1.00  cu 0.91/0.94/0.93  ec 0.86/0.88/0.87  es 0.91/0.92/0.91  gt 0.87/0.89/0.88  hn 0.83/0.83/0.83  mx 0.87/0.84/0.85  ni 0.80/0.68/0.73  pa 0.89/0.91/0.90  pe 0.84/0.84/0.84  py 0.86/0.89/0.88  sv 0.85/0.89/0.87  uy 0.82/0.83/0.82  ve 0.91/0.92/0.92  | avg 0.88/0.88/0.88
ROUND 29 (F=13987)  ar 0.87/0.87/0.87  cl 0.96/0.95/0.95  co 0.86/0.81/0.84  cr 1.00/1.00/1.00  cu 0.91/0.94/0.92  ec 0.86/0.87/0.87  es 0.90/0.92/0.91  gt 0.87/0.89/0.88  hn 0.82/0.83/0.83  mx 0.87/0.84/0.85  ni 0.80/0.68/0.73  pa 0.89/0.91/0.90  pe 0.84/0.83/0.84  py 0.86/0.89/0.87  sv 0.85/0.89/0.87  uy 0.81/0.83/0.82  ve 0.91/0.92/0.91  | avg 0.88/0.88/0.88
ROUND 30 (F=13954)  ar 0.87/0.87/0.87  cl 0.96/0.95/0.95  co 0.85/0.82/0.83  cr 1.00/1.00/1.00  cu 0.91/0.94/0.92  ec 0.86/0.87/0.87  es 0.90/0.92/0.91  gt 0.87/0.89/0.88  hn 0.82/0.83/0.82  mx 0.87/0.83/0.85  ni 0.80/0.68/0.73  pa 0.89/0.91/0.90  pe 0.84/0.83/0.83  py 0.86/0.89/0.87  sv 0.85/0.88/0.87  uy 0.81/0.83/0.82  ve 0.91/0.92/0.91  | avg 0.87/0.88/0.87
ROUND 31 (F=13921)  ar 0.86/0.87/0.87  cl 0.96/0.95/0.95  co 0.85/0.81/0.83  cr 1.00/1.00/1.00  cu 0.91/0.94/0.92  ec 0.86/0.87/0.87  es 0.90/0.92/0.91  gt 0.87/0.89/0.88  hn 0.82/0.82/0.82  mx 0.87/0.83/0.85  ni 0.79/0.67/0.72  pa 0.89/0.91/0.90  pe 0.83/0.83/0.83  py 0.86/0.89/0.87  sv 0.85/0.88/0.87  uy 0.81/0.83/0.82  ve 0.91/0.92/0.91  | avg 0.87/0.87/0.87
ROUND 32 (F=13888)  ar 0.86/0.86/0.86  cl 0.95/0.94/0.95  co 0.85/0.81/0.83  cr 1.00/1.00/1.00  cu 0.91/0.93/0.92  ec 0.86/0.87/0.86  es 0.90/0.92/0.91  gt 0.87/0.89/0.88  hn 0.82/0.82/0.82  mx 0.86/0.83/0.85  ni 0.79/0.67/0.72  pa 0.89/0.91/0.90  pe 0.83/0.83/0.83  py 0.86/0.88/0.87  sv 0.85/0.88/0.87  uy 0.81/0.82/0.81  ve 0.90/0.92/0.91  | avg 0.87/0.87/0.87
ROUND 33 (F=13854)  ar 0.86/0.86/0.86  cl 0.95/0.94/0.95  co 0.84/0.81/0.82  cr 1.00/1.00/1.00  cu 0.91/0.93/0.92  ec 0.85/0.87/0.86  es 0.90/0.92/0.91  gt 0.87/0.89/0.88  hn 0.82/0.82/0.82  mx 0.86/0.83/0.84  ni 0.79/0.67/0.72  pa 0.89/0.90/0.89  pe 0.83/0.83/0.83  py 0.86/0.88/0.87  sv 0.85/0.88/0.86  uy 0.80/0.82/0.81  ve 0.90/0.92/0.91  | avg 0.87/0.87/0.87
ROUND 34 (F=13820)  ar 0.85/0.86/0.86  cl 0.95/0.94/0.95  co 0.84/0.80/0.82  cr 1.00/1.00/1.00  cu 0.91/0.93/0.92  ec 0.85/0.87/0.86  es 0.90/0.91/0.90  gt 0.87/0.89/0.88  hn 0.81/0.82/0.82  mx 0.86/0.83/0.84  ni 0.78/0.66/0.72  pa 0.88/0.90/0.89  pe 0.83/0.82/0.82  py 0.85/0.87/0.86  sv 0.85/0.88/0.86  uy 0.80/0.82/0.81  ve 0.90/0.91/0.90  | avg 0.87/0.87/0.87
ROUND 35 (F=13786)  ar 0.86/0.86/0.86  cl 0.95/0.94/0.94  co 0.84/0.79/0.81  cr 1.00/1.00/1.00  cu 0.91/0.93/0.92  ec 0.85/0.87/0.86  es 0.90/0.91/0.90  gt 0.86/0.88/0.87  hn 0.81/0.81/0.81  mx 0.86/0.83/0.84  ni 0.78/0.65/0.71  pa 0.88/0.90/0.89  pe 0.82/0.82/0.82  py 0.85/0.88/0.86  sv 0.84/0.87/0.85  uy 0.80/0.82/0.81  ve 0.89/0.91/0.90  | avg 0.86/0.87/0.86
ROUND 36 (F=13754)  ar 0.85/0.86/0.85  cl 0.95/0.94/0.94  co 0.83/0.79/0.81  cr 1.00/1.00/1.00  cu 0.91/0.93/0.92  ec 0.85/0.86/0.86  es 0.89/0.91/0.90  gt 0.86/0.88/0.87  hn 0.80/0.81/0.81  mx 0.86/0.83/0.84  ni 0.78/0.65/0.71  pa 0.88/0.90/0.89  pe 0.82/0.82/0.82  py 0.85/0.87/0.86  sv 0.84/0.87/0.85  uy 0.79/0.82/0.80  ve 0.89/0.91/0.90  | avg 0.86/0.86/0.86
ROUND 37 (F=13721)  ar 0.85/0.86/0.86  cl 0.94/0.94/0.94  co 0.84/0.79/0.81  cr 1.00/1.00/1.00  cu 0.91/0.93/0.92  ec 0.85/0.86/0.86  es 0.89/0.91/0.90  gt 0.86/0.88/0.87  hn 0.80/0.81/0.81  mx 0.86/0.82/0.84  ni 0.78/0.65/0.71  pa 0.88/0.89/0.89  pe 0.82/0.82/0.82  py 0.85/0.88/0.86  sv 0.84/0.87/0.85  uy 0.80/0.82/0.81  ve 0.89/0.91/0.90  | avg 0.86/0.86/0.86
ROUND 38 (F=13691)  ar 0.85/0.86/0.85  cl 0.94/0.94/0.94  co 0.83/0.79/0.81  cr 1.00/1.00/1.00  cu 0.91/0.93/0.92  ec 0.85/0.86/0.86  es 0.89/0.91/0.90  gt 0.86/0.88/0.87  hn 0.80/0.81/0.81  mx 0.85/0.82/0.84  ni 0.78/0.64/0.70  pa 0.88/0.89/0.89  pe 0.82/0.82/0.82  py 0.84/0.87/0.86  sv 0.84/0.87/0.85  uy 0.79/0.81/0.80  ve 0.89/0.91/0.90  | avg 0.86/0.86/0.86
ROUND 39 (F=13660)  ar 0.85/0.86/0.85  cl 0.95/0.94/0.94  co 0.83/0.79/0.81  cr 1.00/1.00/1.00  cu 0.91/0.93/0.92  ec 0.85/0.86/0.85  es 0.89/0.91/0.90  gt 0.86/0.88/0.87  hn 0.80/0.81/0.80  mx 0.85/0.82/0.84  ni 0.77/0.64/0.70  pa 0.88/0.89/0.89  pe 0.82/0.82/0.82  py 0.84/0.87/0.86  sv 0.84/0.87/0.85  uy 0.80/0.81/0.80  ve 0.88/0.91/0.90  | avg 0.86/0.86/0.86
ROUND 40 (F=13628)  ar 0.85/0.85/0.85  cl 0.94/0.94/0.94  co 0.83/0.78/0.80  cr 1.00/1.00/1.00  cu 0.90/0.93/0.92  ec 0.84/0.86/0.85  es 0.89/0.90/0.90  gt 0.85/0.88/0.87  hn 0.80/0.80/0.80  mx 0.85/0.82/0.83  ni 0.77/0.64/0.70  pa 0.88/0.89/0.89  pe 0.82/0.82/0.82  py 0.84/0.87/0.86  sv 0.84/0.87/0.85  uy 0.79/0.81/0.80  ve 0.88/0.91/0.90  | avg 0.86/0.86/0.86
ROUND 41 (F=13594)  ar 0.85/0.85/0.85  cl 0.94/0.94/0.94  co 0.82/0.78/0.80  cr 1.00/1.00/1.00  cu 0.90/0.93/0.92  ec 0.84/0.86/0.85  es 0.89/0.90/0.90  gt 0.85/0.87/0.86  hn 0.80/0.80/0.80  mx 0.85/0.81/0.83  ni 0.76/0.64/0.70  pa 0.87/0.89/0.88  pe 0.82/0.82/0.82  py 0.84/0.87/0.85  sv 0.83/0.87/0.85  uy 0.79/0.81/0.80  ve 0.88/0.91/0.89  | avg 0.86/0.86/0.86
ROUND 42 (F=13561)  ar 0.84/0.85/0.85  cl 0.94/0.93/0.94  co 0.82/0.78/0.80  cr 1.00/1.00/1.00  cu 0.90/0.93/0.92  ec 0.84/0.85/0.85  es 0.89/0.90/0.89  gt 0.85/0.87/0.86  hn 0.80/0.80/0.80  mx 0.85/0.81/0.83  ni 0.76/0.63/0.69  pa 0.87/0.89/0.88  pe 0.82/0.82/0.82  py 0.84/0.86/0.85  sv 0.83/0.86/0.85  uy 0.79/0.81/0.80  ve 0.88/0.91/0.89  | avg 0.86/0.86/0.86
ROUND 43 (F=13527)  ar 0.84/0.85/0.85  cl 0.94/0.93/0.94  co 0.82/0.78/0.80  cr 1.00/1.00/1.00  cu 0.90/0.93/0.92  ec 0.84/0.85/0.85  es 0.89/0.90/0.89  gt 0.85/0.87/0.86  hn 0.80/0.80/0.80  mx 0.84/0.81/0.83  ni 0.76/0.63/0.69  pa 0.87/0.89/0.88  pe 0.81/0.81/0.81  py 0.84/0.86/0.85  sv 0.83/0.86/0.85  uy 0.78/0.81/0.80  ve 0.88/0.90/0.89  | avg 0.85/0.85/0.85
ROUND 44 (F=13496)  ar 0.84/0.85/0.85  cl 0.94/0.93/0.94  co 0.82/0.78/0.79  cr 1.00/1.00/1.00  cu 0.90/0.93/0.91  ec 0.84/0.85/0.84  es 0.89/0.90/0.89  gt 0.85/0.87/0.86  hn 0.79/0.80/0.80  mx 0.85/0.81/0.83  ni 0.76/0.63/0.69  pa 0.87/0.88/0.87  pe 0.81/0.81/0.81  py 0.84/0.86/0.85  sv 0.83/0.86/0.84  uy 0.79/0.81/0.80  ve 0.88/0.90/0.89  | avg 0.85/0.85/0.85
ROUND 45 (F=13463)  ar 0.84/0.85/0.84  cl 0.94/0.93/0.94  co 0.82/0.77/0.79  cr 1.00/1.00/1.00  cu 0.90/0.92/0.91  ec 0.84/0.84/0.84  es 0.88/0.90/0.89  gt 0.85/0.87/0.86  hn 0.80/0.80/0.80  mx 0.84/0.81/0.83  ni 0.76/0.63/0.69  pa 0.87/0.88/0.87  pe 0.80/0.81/0.81  py 0.83/0.86/0.85  sv 0.82/0.86/0.84  uy 0.78/0.80/0.79  ve 0.88/0.90/0.89  | avg 0.85/0.85/0.85
ROUND 46 (F=13429)  ar 0.84/0.85/0.84  cl 0.94/0.93/0.93  co 0.82/0.77/0.79  cr 1.00/1.00/1.00  cu 0.90/0.92/0.91  ec 0.84/0.84/0.84  es 0.88/0.90/0.89  gt 0.85/0.87/0.86  hn 0.79/0.79/0.79  mx 0.84/0.81/0.83  ni 0.76/0.63/0.69  pa 0.86/0.88/0.87  pe 0.80/0.81/0.80  py 0.83/0.86/0.85  sv 0.82/0.86/0.84  uy 0.78/0.80/0.79  ve 0.88/0.90/0.89  | avg 0.85/0.85/0.85
ROUND 47 (F=13396)  ar 0.83/0.84/0.84  cl 0.94/0.93/0.93  co 0.81/0.77/0.79  cr 1.00/1.00/1.00  cu 0.90/0.92/0.91  ec 0.83/0.84/0.84  es 0.88/0.90/0.89  gt 0.85/0.87/0.86  hn 0.79/0.79/0.79  mx 0.84/0.81/0.82  ni 0.76/0.62/0.69  pa 0.86/0.87/0.87  pe 0.80/0.80/0.80  py 0.83/0.86/0.85  sv 0.82/0.85/0.84  uy 0.78/0.80/0.79  ve 0.88/0.90/0.89  | avg 0.85/0.85/0.85
ROUND 48 (F=13363)  ar 0.83/0.84/0.84  cl 0.94/0.93/0.93  co 0.82/0.77/0.79  cr 1.00/1.00/1.00  cu 0.90/0.92/0.91  ec 0.83/0.84/0.84  es 0.88/0.90/0.89  gt 0.84/0.87/0.86  hn 0.78/0.79/0.79  mx 0.84/0.81/0.82  ni 0.76/0.62/0.68  pa 0.86/0.87/0.87  pe 0.80/0.80/0.80  py 0.83/0.86/0.84  sv 0.82/0.85/0.84  uy 0.78/0.80/0.79  ve 0.88/0.90/0.89  | avg 0.85/0.85/0.85
ROUND 49 (F=13330)  ar 0.83/0.84/0.84  cl 0.93/0.93/0.93  co 0.81/0.77/0.79  cr 1.00/1.00/1.00  cu 0.90/0.92/0.91  ec 0.83/0.84/0.84  es 0.88/0.90/0.89  gt 0.84/0.87/0.86  hn 0.78/0.79/0.79  mx 0.84/0.80/0.82  ni 0.76/0.62/0.68  pa 0.86/0.87/0.87  pe 0.79/0.80/0.80  py 0.83/0.86/0.84  sv 0.82/0.85/0.83  uy 0.78/0.80/0.79  ve 0.88/0.90/0.89  | avg 0.85/0.85/0.85
ROUND 50 (F=13297)  ar 0.83/0.84/0.83  cl 0.94/0.93/0.93  co 0.81/0.77/0.79  cr 1.00/1.00/1.00  cu 0.90/0.92/0.91  ec 0.83/0.84/0.83  es 0.88/0.90/0.89  gt 0.84/0.87/0.86  hn 0.79/0.79/0.79  mx 0.84/0.80/0.82  ni 0.76/0.61/0.68  pa 0.86/0.87/0.87  pe 0.79/0.80/0.80  py 0.83/0.86/0.84  sv 0.82/0.85/0.83  uy 0.78/0.79/0.78  ve 0.87/0.89/0.88  | avg 0.84/0.85/0.84
ROUND 51 (F=13263)  ar 0.82/0.84/0.83  cl 0.93/0.93/0.93  co 0.81/0.77/0.79  cr 1.00/1.00/1.00  cu 0.90/0.92/0.91  ec 0.83/0.84/0.83  es 0.88/0.90/0.89  gt 0.84/0.87/0.85  hn 0.78/0.78/0.78  mx 0.83/0.80/0.82  ni 0.76/0.62/0.68  pa 0.86/0.87/0.86  pe 0.79/0.80/0.80  py 0.82/0.85/0.84  sv 0.82/0.85/0.83  uy 0.77/0.79/0.78  ve 0.87/0.89/0.88  | avg 0.84/0.84/0.84
ROUND 52 (F=13230)  ar 0.82/0.84/0.83  cl 0.93/0.92/0.93  co 0.81/0.76/0.78  cr 1.00/1.00/1.00  cu 0.90/0.92/0.91  ec 0.83/0.84/0.83  es 0.88/0.89/0.89  gt 0.84/0.87/0.85  hn 0.78/0.78/0.78  mx 0.83/0.80/0.82  ni 0.75/0.61/0.68  pa 0.86/0.87/0.86  pe 0.79/0.80/0.79  py 0.82/0.85/0.84  sv 0.82/0.85/0.83  uy 0.77/0.79/0.78  ve 0.87/0.89/0.88  | avg 0.84/0.84/0.84
ROUND 53 (F=13197)  ar 0.82/0.84/0.83  cl 0.93/0.92/0.93  co 0.80/0.76/0.78  cr 1.00/1.00/1.00  cu 0.89/0.92/0.91  ec 0.83/0.83/0.83  es 0.88/0.89/0.89  gt 0.84/0.87/0.85  hn 0.78/0.78/0.78  mx 0.83/0.80/0.81  ni 0.76/0.61/0.67  pa 0.86/0.87/0.86  pe 0.79/0.80/0.79  py 0.82/0.85/0.84  sv 0.81/0.85/0.83  uy 0.77/0.79/0.78  ve 0.87/0.89/0.88  | avg 0.84/0.84/0.84
ROUND 54 (F=13163)  ar 0.82/0.83/0.83  cl 0.93/0.92/0.93  co 0.80/0.75/0.78  cr 1.00/1.00/1.00  cu 0.89/0.92/0.90  ec 0.82/0.83/0.83  es 0.88/0.89/0.88  gt 0.84/0.87/0.85  hn 0.78/0.78/0.78  mx 0.83/0.80/0.81  ni 0.76/0.61/0.67  pa 0.86/0.87/0.86  pe 0.79/0.80/0.79  py 0.82/0.85/0.83  sv 0.81/0.84/0.83  uy 0.76/0.79/0.77  ve 0.87/0.89/0.88  | avg 0.84/0.84/0.84
ROUND 55 (F=13131)  ar 0.82/0.83/0.83  cl 0.93/0.92/0.93  co 0.80/0.75/0.77  cr 0.99/1.00/1.00  cu 0.89/0.92/0.90  ec 0.82/0.83/0.83  es 0.88/0.89/0.88  gt 0.84/0.87/0.85  hn 0.78/0.78/0.78  mx 0.83/0.79/0.81  ni 0.75/0.60/0.67  pa 0.85/0.87/0.86  pe 0.79/0.79/0.79  py 0.82/0.85/0.83  sv 0.81/0.84/0.83  uy 0.76/0.79/0.77  ve 0.87/0.89/0.88  | avg 0.84/0.84/0.84
ROUND 56 (F=13097)  ar 0.82/0.83/0.83  cl 0.93/0.92/0.92  co 0.80/0.75/0.77  cr 0.99/1.00/1.00  cu 0.89/0.91/0.90  ec 0.82/0.83/0.83  es 0.87/0.89/0.88  gt 0.84/0.87/0.85  hn 0.78/0.78/0.78  mx 0.82/0.79/0.81  ni 0.75/0.60/0.67  pa 0.85/0.87/0.86  pe 0.79/0.79/0.79  py 0.82/0.85/0.83  sv 0.80/0.84/0.82  uy 0.76/0.78/0.77  ve 0.87/0.89/0.88  | avg 0.84/0.84/0.84
ROUND 57 (F=13063)  ar 0.81/0.83/0.82  cl 0.93/0.92/0.93  co 0.79/0.74/0.77  cr 1.00/1.00/1.00  cu 0.89/0.91/0.90  ec 0.82/0.83/0.83  es 0.87/0.89/0.88  gt 0.83/0.87/0.85  hn 0.78/0.78/0.78  mx 0.82/0.79/0.81  ni 0.75/0.60/0.67  pa 0.84/0.86/0.85  pe 0.79/0.79/0.79  py 0.81/0.84/0.83  sv 0.80/0.84/0.82  uy 0.76/0.78/0.77  ve 0.87/0.88/0.87  | avg 0.83/0.84/0.83
train (416372, 13063) -> (416372, 13030), test (84093, 13063) -> (84093, 13030)

STARTING UNMASKING ROUNDS 58-60 (spa cc cxg2); features used per round: 13030 / 12998 / 12966

           round 58          round 59          round 60
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.81 0.83 0.82   0.81 0.83 0.82   0.82 0.83 0.82    5000
cl         0.93 0.92 0.92   0.93 0.92 0.92   0.92 0.92 0.92    5000
co         0.79 0.74 0.77   0.79 0.74 0.76   0.79 0.74 0.76    5000
cr         1.00 1.00 1.00   1.00 1.00 1.00   1.00 1.00 1.00    5000
cu         0.88 0.91 0.90   0.88 0.91 0.90   0.88 0.91 0.89    5000
ec         0.82 0.83 0.82   0.82 0.83 0.82   0.82 0.82 0.82    5000
es         0.87 0.88 0.88   0.87 0.88 0.88   0.87 0.88 0.88    5000
gt         0.83 0.87 0.85   0.82 0.86 0.84   0.83 0.86 0.85    5000
hn         0.77 0.78 0.78   0.77 0.77 0.77   0.77 0.77 0.77    5000
mx         0.82 0.79 0.81   0.83 0.79 0.81   0.82 0.79 0.80    5000
ni         0.75 0.60 0.67   0.75 0.60 0.67   0.75 0.60 0.66    4093
pa         0.84 0.86 0.85   0.84 0.86 0.85   0.84 0.86 0.85    5000
pe         0.79 0.79 0.79   0.78 0.79 0.79   0.78 0.79 0.78    5000
py         0.81 0.84 0.83   0.81 0.84 0.82   0.81 0.84 0.82    5000
sv         0.80 0.84 0.82   0.80 0.84 0.82   0.80 0.83 0.82    5000
uy         0.76 0.78 0.77   0.76 0.77 0.77   0.76 0.77 0.77    5000
ve         0.86 0.88 0.87   0.86 0.88 0.87   0.86 0.88 0.87    5000
avg/total  0.83 0.83 0.83   0.83 0.83 0.83   0.83 0.83 0.83   84093

Reducing feature vectors.
train (416372, 12966) -> (416372, 12933), test (84093, 12966) -> (84093, 12933)

STARTING UNMASKING ROUNDS 61-63 (spa cc cxg2); features used per round: 12933 / 12902 / 12870

           round 61          round 62          round 63
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.81 0.83 0.82   0.81 0.83 0.82   0.81 0.82 0.82    5000
cl         0.93 0.92 0.92   0.92 0.92 0.92   0.92 0.91 0.92    5000
co         0.79 0.73 0.76   0.79 0.74 0.76   0.78 0.73 0.75    5000
cr         0.99 1.00 1.00   1.00 1.00 1.00   0.99 1.00 1.00    5000
cu         0.88 0.91 0.89   0.88 0.91 0.89   0.88 0.91 0.89    5000
ec         0.81 0.82 0.82   0.81 0.82 0.82   0.80 0.82 0.81    5000
es         0.87 0.88 0.87   0.87 0.88 0.87   0.87 0.88 0.88    5000
gt         0.83 0.86 0.84   0.83 0.86 0.84   0.83 0.86 0.84    5000
hn         0.77 0.77 0.77   0.77 0.77 0.77   0.77 0.77 0.77    5000
mx         0.82 0.79 0.80   0.82 0.78 0.80   0.81 0.78 0.79    5000
ni         0.75 0.60 0.66   0.74 0.60 0.66   0.74 0.59 0.66    4093
pa         0.84 0.86 0.85   0.83 0.86 0.84   0.83 0.85 0.84    5000
pe         0.78 0.79 0.78   0.78 0.78 0.78   0.77 0.78 0.78    5000
py         0.81 0.84 0.82   0.81 0.84 0.82   0.80 0.84 0.82    5000
sv         0.80 0.83 0.82   0.80 0.83 0.82   0.80 0.83 0.81    5000
uy         0.76 0.77 0.76   0.76 0.77 0.77   0.76 0.77 0.76    5000
ve         0.86 0.88 0.87   0.85 0.87 0.86   0.86 0.87 0.86    5000
avg/total  0.83 0.83 0.83   0.83 0.83 0.83   0.83 0.83 0.83   84093

Reducing feature vectors.
train (416372, 12870) -> (416372, 12837), test (84093, 12870) -> (84093, 12837)

STARTING UNMASKING ROUNDS 64-66 (spa cc cxg2); features used per round: 12837 / 12804 / 12771

           round 64          round 65          round 66
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.81 0.82 0.82   0.81 0.82 0.81   0.80 0.82 0.81    5000
cl         0.92 0.91 0.91   0.92 0.91 0.91   0.92 0.91 0.91    5000
co         0.78 0.72 0.75   0.77 0.72 0.75   0.77 0.72 0.74    5000
cr         0.99 1.00 1.00   0.99 1.00 1.00   0.99 1.00 1.00    5000
cu         0.87 0.91 0.89   0.87 0.91 0.89   0.87 0.91 0.89    5000
ec         0.80 0.81 0.80   0.80 0.81 0.80   0.80 0.81 0.80    5000
es         0.87 0.89 0.88   0.86 0.88 0.87   0.86 0.88 0.87    5000
gt         0.82 0.86 0.84   0.82 0.86 0.84   0.82 0.85 0.84    5000
hn         0.76 0.77 0.76   0.76 0.77 0.76   0.76 0.76 0.76    5000
mx         0.80 0.75 0.78   0.80 0.75 0.78   0.80 0.75 0.77    5000
ni         0.73 0.58 0.65   0.73 0.58 0.65   0.73 0.58 0.65    4093
pa         0.83 0.84 0.84   0.83 0.84 0.83   0.82 0.84 0.83    5000
pe         0.77 0.78 0.77   0.77 0.78 0.77   0.76 0.77 0.77    5000
py         0.80 0.83 0.81   0.79 0.83 0.81   0.79 0.83 0.81    5000
sv         0.79 0.83 0.81   0.79 0.83 0.81   0.79 0.82 0.81    5000
uy         0.75 0.76 0.76   0.75 0.76 0.75   0.75 0.76 0.75    5000
ve         0.85 0.87 0.86   0.85 0.87 0.86   0.85 0.87 0.86    5000
avg/total  0.82 0.82 0.82   0.82 0.82 0.82   0.82 0.82 0.82   84093

Reducing feature vectors.
train (416372, 12771) -> (416372, 12738), test (84093, 12771) -> (84093, 12738)

STARTING UNMASKING ROUNDS 67-69 (spa cc cxg2); features used per round: 12738 / 12704 / 12671

           round 67          round 68          round 69
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.80 0.82 0.81   0.80 0.82 0.81   0.80 0.82 0.81    5000
cl         0.92 0.90 0.91   0.92 0.90 0.91   0.92 0.90 0.91    5000
co         0.77 0.72 0.74   0.77 0.71 0.74   0.77 0.71 0.74    5000
cr         1.00 1.00 1.00   1.00 1.00 1.00   1.00 1.00 1.00    5000
cu         0.87 0.91 0.89   0.87 0.91 0.89   0.87 0.91 0.89    5000
ec         0.80 0.81 0.80   0.79 0.80 0.80   0.79 0.80 0.80    5000
es         0.86 0.88 0.87   0.86 0.88 0.87   0.86 0.88 0.87    5000
gt         0.82 0.85 0.84   0.82 0.85 0.84   0.82 0.85 0.84    5000
hn         0.76 0.76 0.76   0.75 0.76 0.76   0.75 0.76 0.76    5000
mx         0.80 0.75 0.77   0.80 0.75 0.77   0.80 0.75 0.77    5000
ni         0.73 0.58 0.64   0.73 0.57 0.64   0.73 0.57 0.64    4093
pa         0.82 0.84 0.83   0.82 0.84 0.83   0.82 0.83 0.83    5000
pe         0.76 0.77 0.77   0.76 0.77 0.76   0.76 0.77 0.76    5000
py         0.79 0.83 0.81   0.79 0.83 0.81   0.79 0.83 0.81    5000
sv         0.79 0.82 0.80   0.79 0.82 0.80   0.78 0.82 0.80    5000
uy         0.75 0.76 0.75   0.74 0.75 0.75   0.74 0.75 0.75    5000
ve         0.85 0.87 0.86   0.85 0.87 0.86   0.85 0.87 0.86    5000
avg/total  0.82 0.82 0.82   0.82 0.82 0.82   0.81 0.82 0.81   84093

Reducing feature vectors.
train (416372, 12671) -> (416372, 12637), test (84093, 12671) -> (84093, 12637)

STARTING UNMASKING ROUNDS 70-72 (spa cc cxg2); features used per round: 12637 / 12604 / 12570

           round 70          round 71          round 72
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.80 0.81 0.81   0.80 0.81 0.81   0.80 0.81 0.81    5000
cl         0.92 0.90 0.91   0.92 0.90 0.91   0.91 0.90 0.91    5000
co         0.76 0.71 0.74   0.76 0.71 0.74   0.76 0.71 0.73    5000
cr         0.99 1.00 1.00   1.00 1.00 1.00   1.00 1.00 1.00    5000
cu         0.87 0.91 0.89   0.87 0.91 0.89   0.87 0.91 0.89    5000
ec         0.79 0.80 0.80   0.79 0.80 0.80   0.79 0.80 0.79    5000
es         0.85 0.88 0.87   0.85 0.88 0.86   0.85 0.87 0.86    5000
gt         0.82 0.85 0.83   0.82 0.85 0.83   0.82 0.85 0.83    5000
hn         0.75 0.76 0.76   0.75 0.76 0.76   0.75 0.76 0.76    5000
mx         0.79 0.75 0.77   0.80 0.75 0.77   0.79 0.75 0.77    5000
ni         0.72 0.57 0.64   0.73 0.57 0.64   0.72 0.57 0.64    4093
pa         0.82 0.83 0.83   0.82 0.83 0.83   0.82 0.83 0.83    5000
pe         0.76 0.77 0.76   0.76 0.77 0.76   0.75 0.77 0.76    5000
py         0.79 0.83 0.81   0.79 0.83 0.81   0.79 0.82 0.80    5000
sv         0.79 0.82 0.80   0.78 0.82 0.80   0.79 0.82 0.80    5000
uy         0.74 0.75 0.75   0.74 0.75 0.75   0.74 0.75 0.75    5000
ve         0.84 0.87 0.86   0.84 0.87 0.86   0.84 0.86 0.85    5000
avg/total  0.81 0.82 0.81   0.81 0.81 0.81   0.81 0.81 0.81   84093

Reducing feature vectors.
train (416372, 12570) -> (416372, 12536), test (84093, 12570) -> (84093, 12536)

STARTING UNMASKING ROUNDS 73-75 (spa cc cxg2); features used per round: 12536 / 12502 / 12471

           round 73          round 74          round 75
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.80 0.81 0.80   0.79 0.81 0.80   0.79 0.81 0.80    5000
cl         0.91 0.90 0.91   0.91 0.90 0.91   0.91 0.90 0.90    5000
co         0.76 0.70 0.73   0.76 0.70 0.73   0.75 0.70 0.72    5000
cr         1.00 1.00 1.00   1.00 1.00 1.00   1.00 1.00 1.00    5000
cu         0.87 0.91 0.89   0.87 0.90 0.89   0.87 0.90 0.88    5000
ec         0.79 0.80 0.79   0.79 0.80 0.79   0.78 0.79 0.79    5000
es         0.85 0.87 0.86   0.85 0.87 0.86   0.84 0.87 0.86    5000
gt         0.82 0.85 0.83   0.81 0.85 0.83   0.81 0.85 0.83    5000
hn         0.75 0.76 0.76   0.75 0.76 0.76   0.75 0.76 0.76    5000
mx         0.79 0.74 0.77   0.79 0.74 0.77   0.79 0.74 0.76    5000
ni         0.72 0.57 0.64   0.72 0.57 0.63   0.72 0.57 0.64    4093
pa         0.82 0.83 0.82   0.81 0.83 0.82   0.81 0.83 0.82    5000
pe         0.75 0.76 0.76   0.75 0.76 0.76   0.75 0.77 0.76    5000
py         0.79 0.82 0.80   0.79 0.82 0.80   0.78 0.82 0.80    5000
sv         0.78 0.82 0.80   0.78 0.82 0.80   0.78 0.81 0.79    5000
uy         0.74 0.75 0.74   0.74 0.75 0.75   0.74 0.75 0.74    5000
ve         0.84 0.86 0.85   0.84 0.86 0.85   0.84 0.86 0.85    5000
avg/total  0.81 0.81 0.81   0.81 0.81 0.81   0.81 0.81 0.81   84093

Reducing feature vectors.
train (416372, 12471) -> (416372, 12438), test (84093, 12471) -> (84093, 12438)

STARTING UNMASKING ROUNDS 76-78 (spa cc cxg2); features used per round: 12438 / 12404 / 12374

           round 76          round 77          round 78
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.79 0.81 0.80   0.79 0.81 0.80   0.79 0.81 0.80    5000
cl         0.91 0.90 0.90   0.91 0.89 0.90   0.91 0.89 0.90    5000
co         0.75 0.70 0.72   0.75 0.70 0.72   0.75 0.70 0.72    5000
cr         1.00 1.00 1.00   1.00 1.00 1.00   1.00 1.00 1.00    5000
cu         0.86 0.90 0.88   0.86 0.90 0.88   0.86 0.90 0.88    5000
ec         0.78 0.79 0.79   0.78 0.79 0.79   0.78 0.79 0.79    5000
es         0.85 0.87 0.86   0.85 0.87 0.86   0.85 0.87 0.86    5000
gt         0.81 0.85 0.83   0.81 0.85 0.83   0.81 0.85 0.83    5000
hn         0.75 0.76 0.76   0.75 0.76 0.76   0.75 0.75 0.75    5000
mx         0.79 0.74 0.76   0.79 0.74 0.76   0.79 0.74 0.76    5000
ni         0.73 0.57 0.64   0.73 0.56 0.64   0.72 0.56 0.63    4093
pa         0.81 0.83 0.82   0.81 0.83 0.82   0.81 0.83 0.82    5000
pe         0.75 0.76 0.76   0.75 0.76 0.76   0.75 0.75 0.75    5000
py         0.79 0.82 0.80   0.78 0.82 0.80   0.78 0.81 0.80    5000
sv         0.78 0.81 0.80   0.78 0.81 0.80   0.77 0.81 0.79    5000
uy         0.74 0.75 0.74   0.74 0.75 0.74   0.74 0.75 0.74    5000
ve         0.84 0.86 0.85   0.84 0.86 0.85   0.83 0.86 0.84    5000
avg/total  0.81 0.81 0.81   0.81 0.81 0.81   0.81 0.81 0.81   84093

Reducing feature vectors.
train (416372, 12374) -> (416372, 12340), test (84093, 12374) -> (84093, 12340)

STARTING UNMASKING ROUNDS 79-81 (spa cc cxg2); features used per round: 12340 / 12309 / 12275

           round 79          round 80          round 81
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.79 0.80 0.80   0.78 0.80 0.79   0.78 0.80 0.79    5000
cl         0.90 0.89 0.89   0.90 0.88 0.89   0.90 0.88 0.89    5000
co         0.75 0.69 0.72   0.74 0.68 0.71   0.74 0.68 0.71    5000
cr         1.00 1.00 1.00   0.99 1.00 1.00   0.99 1.00 1.00    5000
cu         0.86 0.90 0.88   0.86 0.90 0.88   0.86 0.90 0.88    5000
ec         0.78 0.79 0.79   0.78 0.79 0.79   0.78 0.79 0.79    5000
es         0.85 0.87 0.86   0.84 0.87 0.85   0.84 0.86 0.85    5000
gt         0.81 0.84 0.83   0.81 0.84 0.83   0.81 0.84 0.82    5000
hn         0.75 0.75 0.75   0.74 0.75 0.74   0.74 0.75 0.74    5000
mx         0.77 0.72 0.74   0.76 0.72 0.74   0.76 0.72 0.74    5000
ni         0.72 0.56 0.63   0.71 0.55 0.62   0.72 0.55 0.62    4093
pa         0.79 0.82 0.80   0.79 0.81 0.80   0.79 0.82 0.80    5000
pe         0.74 0.75 0.75   0.74 0.75 0.75   0.74 0.75 0.75    5000
py         0.78 0.81 0.79   0.77 0.81 0.79   0.77 0.81 0.79    5000
sv         0.77 0.81 0.79   0.77 0.80 0.79   0.77 0.80 0.79    5000
uy         0.73 0.75 0.74   0.73 0.75 0.74   0.73 0.75 0.74    5000
ve         0.82 0.85 0.84   0.82 0.85 0.83   0.82 0.85 0.83    5000
avg/total  0.80 0.80 0.80   0.80 0.80 0.80   0.80 0.80 0.80   84093

Reducing feature vectors.
train (416372, 12275) -> (416372, 12241), test (84093, 12275) -> (84093, 12241)

STARTING UNMASKING ROUNDS 82-84 (spa cc cxg2); features used per round: 12241 / 12208 / 12174

           round 82          round 83          round 84
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.78 0.80 0.79   0.78 0.80 0.79   0.78 0.80 0.79    5000
cl         0.90 0.88 0.89   0.90 0.88 0.89   0.90 0.88 0.89    5000
co         0.74 0.69 0.71   0.74 0.68 0.71   0.73 0.68 0.71    5000
cr         0.99 1.00 1.00   0.99 1.00 1.00   0.99 1.00 1.00    5000
cu         0.86 0.90 0.88   0.86 0.90 0.88   0.86 0.90 0.88    5000
ec         0.78 0.79 0.78   0.78 0.79 0.78   0.77 0.79 0.78    5000
es         0.84 0.86 0.85   0.84 0.86 0.85   0.84 0.86 0.85    5000
gt         0.81 0.84 0.82   0.81 0.84 0.82   0.81 0.84 0.82    5000
hn         0.73 0.75 0.74   0.73 0.75 0.74   0.73 0.75 0.74    5000
mx         0.76 0.72 0.74   0.76 0.72 0.74   0.76 0.72 0.74    5000
ni         0.71 0.55 0.62   0.71 0.54 0.62   0.71 0.54 0.62    4093
pa         0.78 0.81 0.80   0.79 0.81 0.80   0.79 0.81 0.80    5000
pe         0.75 0.75 0.75   0.74 0.75 0.75   0.74 0.74 0.74    5000
py         0.78 0.81 0.79   0.77 0.81 0.79   0.77 0.81 0.79    5000
sv         0.77 0.80 0.78   0.77 0.80 0.78   0.77 0.80 0.78    5000
uy         0.73 0.75 0.74   0.73 0.74 0.74   0.73 0.74 0.73    5000
ve         0.82 0.85 0.83   0.82 0.85 0.83   0.81 0.85 0.83    5000
avg/total  0.80 0.80 0.80   0.80 0.80 0.80   0.80 0.80 0.79   84093

Reducing feature vectors.
train (416372, 12174) -> (416372, 12140), test (84093, 12174) -> (84093, 12140)

STARTING UNMASKING ROUNDS 85-87 (spa cc cxg2); features used per round: 12140 / 12107 / 12073

           round 85          round 86          round 87
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.78 0.79 0.79   0.78 0.80 0.79   0.78 0.80 0.79    5000
cl         0.90 0.88 0.89   0.90 0.88 0.89   0.90 0.88 0.89    5000
co         0.73 0.68 0.70   0.73 0.68 0.70   0.73 0.67 0.70    5000
cr         0.99 1.00 1.00   0.99 1.00 1.00   0.99 1.00 1.00    5000
cu         0.86 0.90 0.88   0.86 0.89 0.88   0.86 0.90 0.88    5000
ec         0.77 0.78 0.78   0.77 0.78 0.78   0.77 0.78 0.77    5000
es         0.84 0.86 0.85   0.84 0.86 0.85   0.84 0.86 0.85    5000
gt         0.81 0.84 0.82   0.81 0.84 0.82   0.81 0.84 0.82    5000
hn         0.73 0.75 0.74   0.73 0.75 0.74   0.73 0.74 0.74    5000
mx         0.76 0.72 0.74   0.76 0.71 0.74   0.76 0.71 0.73    5000
ni         0.71 0.54 0.62   0.71 0.54 0.61   0.70 0.53 0.61    4093
pa         0.79 0.81 0.80   0.79 0.81 0.80   0.78 0.80 0.79    5000
pe         0.75 0.74 0.74   0.74 0.74 0.74   0.74 0.74 0.74    5000
py         0.77 0.80 0.79   0.77 0.80 0.79   0.77 0.80 0.79    5000
sv         0.76 0.80 0.78   0.76 0.80 0.78   0.76 0.80 0.78    5000
uy         0.73 0.74 0.73   0.72 0.74 0.73   0.73 0.74 0.73    5000
ve         0.81 0.85 0.83   0.81 0.85 0.83   0.81 0.84 0.83    5000
avg/total  0.79 0.80 0.79   0.79 0.79 0.79   0.79 0.79 0.79   84093

Reducing feature vectors.
train (416372, 12073) -> (416372, 12040), test (84093, 12073) -> (84093, 12040)

STARTING UNMASKING ROUNDS 88-90 (spa cc cxg2); features used per round: 12040 / 12006 / 11972

           round 88          round 89          round 90
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.78 0.80 0.79   0.77 0.80 0.78   0.77 0.79 0.78    5000
cl         0.90 0.87 0.89   0.89 0.88 0.88   0.89 0.88 0.89    5000
co         0.73 0.68 0.70   0.73 0.67 0.70   0.73 0.67 0.70    5000
cr         0.99 1.00 1.00   0.99 1.00 1.00   0.99 1.00 0.99    5000
cu         0.86 0.89 0.88   0.86 0.89 0.87   0.85 0.89 0.87    5000
ec         0.77 0.78 0.77   0.77 0.78 0.77   0.77 0.78 0.77    5000
es         0.83 0.86 0.85   0.83 0.86 0.85   0.83 0.86 0.84    5000
gt         0.80 0.84 0.82   0.80 0.84 0.82   0.80 0.84 0.82    5000
hn         0.73 0.74 0.74   0.73 0.74 0.74   0.73 0.74 0.73    5000
mx         0.76 0.71 0.73   0.76 0.71 0.73   0.76 0.71 0.73    5000
ni         0.71 0.53 0.60   0.70 0.53 0.60   0.70 0.53 0.60    4093
pa         0.78 0.80 0.79   0.78 0.80 0.79   0.78 0.80 0.79    5000
pe         0.74 0.74 0.74   0.74 0.73 0.74   0.73 0.73 0.73    5000
py         0.76 0.80 0.78   0.76 0.80 0.78   0.76 0.80 0.78    5000
sv         0.76 0.80 0.78   0.76 0.79 0.78   0.76 0.79 0.78    5000
uy         0.72 0.74 0.73   0.72 0.74 0.73   0.72 0.74 0.73    5000
ve         0.81 0.84 0.83   0.81 0.84 0.83   0.81 0.84 0.83    5000
avg/total  0.79 0.79 0.79   0.79 0.79 0.79   0.79 0.79 0.79   84093

Reducing feature vectors.
train (416372, 11972) -> (416372, 11940), test (84093, 11972) -> (84093, 11940)

STARTING UNMASKING ROUNDS 91-93 (spa cc cxg2); features used per round: 11940 / 11906 / 11873

           round 91          round 92          round 93
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.77 0.79 0.78   0.77 0.79 0.78   0.77 0.79 0.78    5000
cl         0.89 0.87 0.88   0.89 0.87 0.88   0.89 0.87 0.88    5000
co         0.72 0.67 0.69   0.72 0.67 0.69   0.72 0.67 0.69    5000
cr         0.99 1.00 0.99   0.99 1.00 1.00   0.99 1.00 0.99    5000
cu         0.85 0.89 0.87   0.85 0.89 0.87   0.85 0.89 0.87    5000
ec         0.77 0.78 0.77   0.77 0.77 0.77   0.77 0.77 0.77    5000
es         0.83 0.85 0.84   0.83 0.86 0.84   0.83 0.85 0.84    5000
gt         0.80 0.84 0.82   0.80 0.84 0.82   0.80 0.84 0.82    5000
hn         0.73 0.74 0.74   0.72 0.74 0.73   0.72 0.74 0.73    5000
mx         0.75 0.71 0.73   0.75 0.70 0.73   0.75 0.70 0.73    5000
ni         0.70 0.52 0.60   0.70 0.52 0.60   0.70 0.52 0.60    4093
pa         0.78 0.80 0.79   0.78 0.80 0.79   0.78 0.80 0.79    5000
pe         0.73 0.73 0.73   0.73 0.73 0.73   0.73 0.73 0.73    5000
py         0.76 0.80 0.78   0.76 0.80 0.78   0.76 0.80 0.78    5000
sv         0.76 0.79 0.77   0.76 0.79 0.77   0.76 0.79 0.77    5000
uy         0.72 0.73 0.73   0.72 0.73 0.73   0.72 0.73 0.72    5000
ve         0.81 0.84 0.83   0.81 0.84 0.83   0.81 0.84 0.82    5000
avg/total  0.79 0.79 0.79   0.79 0.79 0.79   0.79 0.79 0.79   84093

Reducing feature vectors.
train (416372, 11873) -> (416372, 11839), test (84093, 11873) -> (84093, 11839)

STARTING UNMASKING ROUNDS 94-96 (spa cc cxg2); features used per round: 11839 / 11806 / 11772

           round 94          round 95          round 96
class      prec rec  f1     prec rec  f1     prec rec  f1     support
ar         0.77 0.79 0.78   0.76 0.79 0.78   0.76 0.79 0.78    5000
cl         0.89 0.87 0.88   0.89 0.87 0.88   0.89 0.87 0.88    5000
co         0.72 0.67 0.69   0.72 0.66 0.69   0.72 0.66 0.69    5000
cr         0.99 1.00 0.99   0.99 1.00 0.99   0.99 1.00 0.99    5000
cu         0.85 0.89 0.87   0.85 0.89 0.87   0.85 0.89 0.87    5000
ec         0.76 0.77 0.77   0.76 0.77 0.76   0.76 0.77 0.76    5000
es         0.83 0.85 0.84   0.83 0.85 0.84   0.83 0.85 0.84    5000
gt         0.80 0.84 0.82   0.80 0.83 0.82   0.79 0.83 0.81    5000
hn         0.72 0.74 0.73   0.72 0.74 0.73   0.72 0.73 0.72    5000
mx         0.75 0.70 0.73   0.75 0.70 0.72   0.75 0.70 0.72    5000
ni         0.69 0.52 0.60   0.69 0.52 0.59   0.69 0.52 0.59    4093
pa         0.78 0.79 0.79   0.78 0.79 0.79   0.78 0.79 0.79    5000
pe         0.73 0.73 0.73   0.73 0.73 0.73   0.73 0.73 0.73    5000
py         0.76 0.80 0.78   0.76 0.80 0.78   0.76 0.80 0.78    5000
sv         0.76 0.79 0.77   0.75 0.78 0.76   0.75 0.78 0.76    5000
uy         0.72 0.73 0.72   0.71 0.73 0.72   0.71 0.73 0.72    5000
ve         0.81 0.84 0.82   0.80 0.84 0.82   0.80 0.83 0.82    5000
avg/total  0.78 0.79 0.78   0.78 0.78 0.78   0.78 0.78 0.78   84093
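The loop behind this log can be sketched as follows. This is a minimal reconstruction, not the original script: it assumes scikit-learn's LinearSVC with the parameters printed at the top of the log (C=0.01, squared_hinge loss), and it assumes the "unmasking" step removes the most strongly weighted features each round; the exact removal criterion and the per-round removal count (roughly 30 features) are not shown in the log.

```python
# Sketch of one unmasking round: train a linear SVM, print the per-class
# report, then mask the most discriminative features and shrink both
# feature matrices (the shape prints mirror the log above).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import classification_report


def unmasking_round(X_train, y_train, X_test, y_test, n_remove=30):
    # Hyperparameters taken from the log header; everything else is assumed.
    clf = LinearSVC(C=0.01, loss="squared_hinge")
    clf.fit(X_train, y_train)
    print(classification_report(y_test, clf.predict(X_test)))

    # Rank features by their largest absolute weight across all classes,
    # then drop the n_remove strongest (the "masking" step, as assumed here).
    importance = np.abs(clf.coef_).max(axis=0)
    keep = np.sort(np.argsort(importance)[:-n_remove])

    print("Reducing feature vectors.")
    print(X_train.shape, X_test.shape)
    X_train, X_test = X_train[:, keep], X_test[:, keep]
    print(X_train.shape, X_test.shape)
    return X_train, X_test
```

Run repeatedly, this reproduces the pattern in the log: per-round scores decay slowly as the most dialect-indicative features are stripped away, while the row counts (416372 train, 84093 test) stay fixed and the feature dimension shrinks each round.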