Starting cxg2 and cc (16000, 21170) 16000 {'C': 0.01, 'loss': 'squared_hinge'}

STARTING UNMASKING ROUNDS 1-100 (deu cc cxg2)
Training matrix: 156693 samples; test matrix: 35240 samples. Per-class test support is identical in every round: at 5000, ch 5000, de 5000, lu 3768, pl 3262, pw 5000, so 3845, tl 4365 (total 35240).
Each cell below gives precision/recall/f1-score for one round; "Features" gives the feature count before -> after that round's "Reducing feature vectors." step.

Round  Features      at              ch              de              lu              pl              pw              so              tl              avg/total
    1  21170->21157  0.95/0.95/0.95  0.94/0.94/0.94  0.92/0.91/0.92  0.97/0.96/0.96  0.98/0.98/0.98  1.00/1.00/1.00  1.00/1.00/1.00  0.96/0.97/0.96  0.96/0.96/0.96
    2  21157->21144  0.95/0.94/0.94  0.94/0.94/0.94  0.91/0.90/0.91  0.96/0.95/0.96  0.98/0.98/0.98  1.00/1.00/1.00  1.00/1.00/1.00  0.95/0.96/0.96  0.96/0.96/0.96
    3  21144->21129  0.94/0.94/0.94  0.93/0.94/0.94  0.91/0.90/0.91  0.96/0.95/0.96  0.98/0.98/0.98  1.00/1.00/1.00  1.00/1.00/1.00  0.95/0.96/0.96  0.96/0.96/0.96
    4  21129->21116  0.94/0.93/0.94  0.93/0.93/0.93  0.91/0.90/0.90  0.96/0.95/0.95  0.98/0.99/0.98  1.00/1.00/1.00  1.00/1.00/1.00  0.95/0.96/0.96  0.95/0.96/0.96
    5  21116->21103  0.94/0.93/0.93  0.93/0.94/0.93  0.91/0.89/0.90  0.96/0.95/0.95  0.98/0.98/0.98  0.99/1.00/1.00  1.00/1.00/1.00  0.95/0.96/0.95  0.95/0.95/0.95
    6  21103->21091  0.93/0.93/0.93  0.92/0.93/0.93  0.90/0.89/0.90  0.95/0.95/0.95  0.98/0.98/0.98  0.99/1.00/1.00  1.00/1.00/1.00  0.94/0.96/0.95  0.95/0.95/0.95
    7  21091->21075  0.94/0.93/0.93  0.92/0.93/0.92  0.91/0.89/0.90  0.95/0.95/0.95  0.98/0.98/0.98  0.99/1.00/1.00  1.00/1.00/1.00  0.94/0.96/0.95  0.95/0.95/0.95
    8  21075->21061  0.93/0.93/0.93  0.92/0.93/0.92  0.91/0.89/0.90  0.95/0.95/0.95  0.98/0.98/0.98  0.99/1.00/1.00  1.00/1.00/1.00  0.94/0.96/0.95  0.95/0.95/0.95
    9  21061->21046  0.93/0.92/0.93  0.92/0.92/0.92  0.90/0.89/0.90  0.95/0.94/0.95  0.98/0.98/0.98  0.99/1.00/1.00  1.00/1.00/1.00  0.94/0.96/0.95  0.95/0.95/0.95
   10  21046->21033  0.93/0.92/0.92  0.91/0.92/0.92  0.90/0.89/0.89  0.95/0.94/0.95  0.98/0.98/0.98  0.99/1.00/1.00  1.00/1.00/1.00  0.94/0.96/0.95  0.95/0.95/0.95
   11  21033->21023  0.92/0.92/0.92  0.91/0.92/0.92  0.90/0.88/0.89  0.95/0.94/0.94  0.98/0.98/0.98  0.99/1.00/1.00  1.00/1.00/1.00  0.94/0.96/0.95  0.95/0.95/0.95
   12  21023->21007  0.92/0.92/0.92  0.91/0.92/0.91  0.90/0.88/0.89  0.95/0.94/0.94  0.98/0.98/0.98  0.99/1.00/1.00  1.00/1.00/1.00  0.94/0.96/0.95  0.95/0.95/0.95
   13  21007->20992  0.92/0.91/0.92  0.91/0.92/0.91  0.90/0.88/0.89  0.94/0.94/0.94  0.98/0.98/0.98  0.99/1.00/1.00  1.00/1.00/1.00  0.94/0.96/0.95  0.94/0.94/0.94
   14  20992->20976  0.92/0.91/0.91  0.91/0.91/0.91  0.90/0.88/0.89  0.94/0.93/0.94  0.97/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.94/0.96/0.95  0.94/0.94/0.94
   15  20976->20964  0.92/0.91/0.91  0.90/0.91/0.91  0.90/0.88/0.89  0.94/0.93/0.94  0.97/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.94/0.96/0.95  0.94/0.94/0.94
   16  20964->20948  0.91/0.91/0.91  0.90/0.91/0.91  0.90/0.88/0.89  0.94/0.93/0.94  0.97/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.94/0.95/0.94  0.94/0.94/0.94
   17  20948->20933  0.91/0.91/0.91  0.90/0.91/0.90  0.90/0.88/0.89  0.94/0.93/0.94  0.97/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.94/0.94/0.94
   18  20933->20917  0.91/0.91/0.91  0.90/0.90/0.90  0.90/0.87/0.88  0.94/0.93/0.94  0.97/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.94/0.94/0.94
   19  20917->20902  0.91/0.91/0.91  0.90/0.90/0.90  0.89/0.87/0.88  0.94/0.93/0.94  0.97/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.94/0.94/0.94
   20  20902->20888  0.91/0.91/0.91  0.90/0.90/0.90  0.89/0.87/0.88  0.94/0.93/0.94  0.97/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.94/0.94/0.94
   21  20888->20872  0.91/0.90/0.91  0.90/0.90/0.90  0.89/0.87/0.88  0.94/0.93/0.93  0.97/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.94/0.94/0.94
   22  20872->20856  0.91/0.90/0.91  0.89/0.90/0.90  0.89/0.87/0.88  0.94/0.93/0.93  0.97/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.94/0.94/0.94
   23  20856->20840  0.90/0.90/0.90  0.89/0.90/0.90  0.89/0.87/0.88  0.94/0.92/0.93  0.97/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.94/0.94/0.94
   24  20840->20824  0.91/0.90/0.90  0.90/0.90/0.90  0.88/0.87/0.87  0.94/0.93/0.93  0.98/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.94/0.94/0.94
   25  20824->20808  0.90/0.90/0.90  0.89/0.90/0.90  0.89/0.86/0.87  0.93/0.93/0.93  0.98/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.94/0.94/0.94
   26  20808->20793  0.90/0.90/0.90  0.89/0.89/0.89  0.88/0.86/0.87  0.93/0.92/0.93  0.97/0.98/0.98  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   27  20793->20778  0.90/0.90/0.90  0.89/0.89/0.89  0.88/0.86/0.87  0.93/0.92/0.93  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   28  20778->20766  0.90/0.90/0.90  0.89/0.89/0.89  0.88/0.86/0.87  0.93/0.92/0.93  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   29  20766->20750  0.90/0.90/0.90  0.89/0.89/0.89  0.88/0.86/0.87  0.93/0.92/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   30  20750->20734  0.89/0.90/0.89  0.89/0.89/0.89  0.88/0.86/0.87  0.93/0.92/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   31  20734->20718  0.89/0.90/0.89  0.89/0.89/0.89  0.88/0.86/0.87  0.93/0.92/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   32  20718->20702  0.89/0.90/0.89  0.89/0.89/0.89  0.88/0.86/0.87  0.93/0.92/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   33  20702->20686  0.89/0.90/0.89  0.89/0.88/0.88  0.88/0.86/0.87  0.93/0.92/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   34  20686->20670  0.89/0.89/0.89  0.88/0.88/0.88  0.88/0.86/0.87  0.93/0.92/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   35  20670->20654  0.89/0.90/0.89  0.88/0.88/0.88  0.88/0.86/0.87  0.93/0.91/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   36  20654->20639  0.88/0.89/0.89  0.88/0.88/0.88  0.88/0.85/0.87  0.93/0.91/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   37  20639->20623  0.89/0.89/0.89  0.88/0.88/0.88  0.88/0.85/0.86  0.93/0.91/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.95/0.94  0.93/0.93/0.93
   38  20623->20607  0.89/0.89/0.89  0.88/0.88/0.88  0.88/0.85/0.86  0.93/0.91/0.92  0.97/0.97/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.95/0.94  0.93/0.93/0.93
   39  20607->20592  0.88/0.89/0.89  0.88/0.88/0.88  0.88/0.85/0.86  0.92/0.91/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.95/0.94  0.93/0.93/0.93
   40  20592->20577  0.88/0.89/0.89  0.88/0.87/0.88  0.88/0.85/0.86  0.92/0.91/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.95/0.94  0.93/0.93/0.93
   41  20577->20561  0.88/0.89/0.89  0.88/0.87/0.87  0.88/0.85/0.86  0.92/0.91/0.91  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.95/0.93  0.93/0.93/0.93
   42  20561->20546  0.88/0.89/0.88  0.88/0.87/0.88  0.87/0.85/0.86  0.92/0.91/0.92  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.93/0.93/0.93
   43  20546->20534  0.88/0.89/0.88  0.88/0.87/0.87  0.87/0.85/0.86  0.92/0.91/0.91  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.95/0.94  0.92/0.93/0.93
   44  20534->20518  0.88/0.89/0.88  0.88/0.87/0.87  0.87/0.85/0.86  0.92/0.91/0.91  0.97/0.97/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.93/0.94/0.93  0.92/0.92/0.92
   45  20518->20502  0.88/0.89/0.88  0.88/0.87/0.87  0.87/0.85/0.86  0.92/0.91/0.91  0.97/0.98/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   46  20502->20487  0.88/0.88/0.88  0.88/0.87/0.87  0.87/0.84/0.85  0.92/0.91/0.91  0.97/0.97/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   47  20487->20472  0.88/0.88/0.88  0.88/0.87/0.87  0.86/0.84/0.85  0.92/0.90/0.91  0.97/0.97/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   48  20472->20458  0.87/0.88/0.88  0.88/0.87/0.87  0.86/0.84/0.85  0.92/0.91/0.91  0.97/0.97/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   49  20458->20443  0.87/0.88/0.88  0.88/0.87/0.87  0.86/0.84/0.85  0.92/0.90/0.91  0.97/0.97/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   50  20443->20427  0.87/0.88/0.87  0.87/0.86/0.87  0.86/0.84/0.85  0.92/0.90/0.91  0.96/0.97/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   51  20427->20411  0.87/0.88/0.87  0.87/0.86/0.87  0.86/0.84/0.85  0.92/0.90/0.91  0.96/0.97/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   52  20411->20395  0.87/0.88/0.87  0.87/0.86/0.87  0.86/0.84/0.85  0.92/0.90/0.91  0.96/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   53  20395->20379  0.86/0.88/0.87  0.87/0.86/0.87  0.86/0.83/0.84  0.92/0.90/0.91  0.96/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   54  20379->20364  0.86/0.88/0.87  0.87/0.86/0.87  0.85/0.83/0.84  0.92/0.90/0.91  0.96/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   55  20364->20349  0.86/0.87/0.87  0.87/0.86/0.87  0.85/0.83/0.84  0.92/0.90/0.91  0.97/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   56  20349->20333  0.86/0.88/0.87  0.87/0.86/0.87  0.85/0.83/0.84  0.92/0.90/0.91  0.97/0.97/0.97  0.99/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   57  20333->20318  0.86/0.87/0.87  0.87/0.86/0.86  0.85/0.83/0.84  0.92/0.89/0.91  0.97/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   58  20318->20304  0.86/0.87/0.86  0.87/0.86/0.87  0.85/0.83/0.84  0.92/0.89/0.90  0.97/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.92/0.92/0.92
   59  20304->20292  0.86/0.87/0.86  0.87/0.86/0.86  0.85/0.83/0.84  0.91/0.89/0.90  0.97/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.91/0.92/0.91
   60  20292->20277  0.85/0.87/0.86  0.87/0.86/0.86  0.85/0.83/0.84  0.92/0.89/0.90  0.97/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.91/0.91/0.91
   61  20277->20262  0.85/0.87/0.86  0.87/0.85/0.86  0.85/0.83/0.84  0.92/0.89/0.90  0.97/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.91/0.91/0.91
   62  20262->20246  0.85/0.87/0.86  0.87/0.85/0.86  0.85/0.83/0.84  0.91/0.89/0.90  0.97/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.93  0.91/0.91/0.91
   63  20246->20231  0.85/0.87/0.86  0.87/0.85/0.86  0.84/0.83/0.84  0.92/0.89/0.90  0.97/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.91/0.91/0.91
   64  20231->20215  0.85/0.87/0.86  0.87/0.85/0.86  0.84/0.82/0.83  0.92/0.89/0.90  0.97/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.91/0.91/0.91
   65  20215->20199  0.85/0.86/0.86  0.86/0.85/0.86  0.84/0.83/0.83  0.91/0.89/0.90  0.96/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.91/0.91/0.91
   66  20199->20183  0.85/0.86/0.86  0.86/0.85/0.86  0.84/0.82/0.83  0.91/0.88/0.90  0.96/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.93  0.91/0.91/0.91
   67  20183->20167  0.85/0.86/0.85  0.86/0.85/0.86  0.84/0.82/0.83  0.91/0.88/0.90  0.96/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.91/0.91/0.91
   68  20167->20151  0.85/0.86/0.85  0.86/0.85/0.86  0.84/0.82/0.83  0.91/0.88/0.89  0.96/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.91/0.91/0.91
   69  20151->20135  0.84/0.86/0.85  0.86/0.85/0.85  0.84/0.82/0.83  0.90/0.88/0.89  0.97/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.92/0.94/0.93  0.91/0.91/0.91
   70  20135->20119  0.84/0.86/0.85  0.86/0.84/0.85  0.84/0.82/0.83  0.90/0.87/0.88  0.96/0.97/0.97  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.92  0.91/0.91/0.91
   71  20119->20103  0.84/0.86/0.85  0.86/0.84/0.85  0.84/0.82/0.83  0.90/0.87/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.92  0.91/0.91/0.91
   72  20103->20087  0.84/0.86/0.85  0.86/0.85/0.85  0.84/0.82/0.83  0.90/0.87/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.92  0.91/0.91/0.91
   73  20087->20071  0.84/0.86/0.85  0.86/0.84/0.85  0.84/0.82/0.83  0.90/0.87/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.92  0.91/0.91/0.91
   74  20071->20055  0.84/0.86/0.85  0.86/0.84/0.85  0.83/0.82/0.83  0.90/0.87/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.92  0.91/0.91/0.91
   75  20055->20039  0.84/0.86/0.85  0.86/0.84/0.85  0.83/0.82/0.83  0.90/0.86/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.92  0.91/0.91/0.91
   76  20039->20023  0.84/0.86/0.85  0.86/0.84/0.85  0.83/0.82/0.83  0.90/0.87/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.92  0.91/0.91/0.91
   77  20023->20008  0.84/0.86/0.85  0.86/0.84/0.85  0.83/0.82/0.82  0.90/0.86/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.91/0.90
   78  20008->19992  0.84/0.86/0.85  0.86/0.85/0.85  0.83/0.82/0.82  0.90/0.86/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.91/0.90
   79  19992->19976  0.84/0.86/0.85  0.85/0.85/0.85  0.83/0.82/0.82  0.90/0.86/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   80  19976->19960  0.84/0.86/0.85  0.85/0.85/0.85  0.83/0.82/0.82  0.90/0.86/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.91/0.90
   81  19960->19944  0.84/0.85/0.84  0.86/0.85/0.85  0.83/0.81/0.82  0.90/0.86/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   82  19944->19929  0.84/0.85/0.85  0.85/0.85/0.85  0.83/0.81/0.82  0.90/0.86/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.92  0.90/0.90/0.90
   83  19929->19914  0.83/0.85/0.84  0.86/0.85/0.85  0.83/0.81/0.82  0.90/0.86/0.88  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.92  0.90/0.90/0.90
   84  19914->19901  0.83/0.85/0.84  0.85/0.85/0.85  0.83/0.81/0.82  0.90/0.86/0.88  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   85  19901->19887  0.83/0.85/0.84  0.85/0.84/0.85  0.83/0.81/0.82  0.90/0.86/0.88  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.92  0.90/0.90/0.90
   86  19887->19871  0.83/0.85/0.84  0.85/0.84/0.85  0.83/0.81/0.82  0.90/0.86/0.88  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   87  19871->19855  0.83/0.85/0.84  0.86/0.84/0.85  0.83/0.81/0.82  0.90/0.86/0.88  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   88  19855->19839  0.83/0.85/0.84  0.86/0.84/0.85  0.82/0.81/0.82  0.90/0.86/0.88  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.94/0.92  0.90/0.90/0.90
   89  19839->19823  0.83/0.85/0.84  0.86/0.84/0.85  0.82/0.81/0.82  0.90/0.86/0.88  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   90  19823->19807  0.83/0.84/0.84  0.85/0.84/0.85  0.83/0.81/0.82  0.89/0.86/0.88  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   91  19807->19791  0.83/0.85/0.84  0.85/0.84/0.85  0.82/0.81/0.82  0.89/0.86/0.88  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   92  19791->19775  0.83/0.84/0.84  0.85/0.84/0.85  0.83/0.81/0.82  0.89/0.86/0.87  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   93  19775->19759  0.83/0.84/0.84  0.85/0.84/0.85  0.83/0.81/0.82  0.89/0.86/0.87  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   94  19759->19743  0.83/0.84/0.84  0.85/0.84/0.85  0.82/0.81/0.82  0.89/0.86/0.87  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   95  19743->19727  0.83/0.84/0.83  0.85/0.84/0.85  0.82/0.81/0.81  0.89/0.85/0.87  0.96/0.97/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   96  19727->19711  0.83/0.84/0.83  0.85/0.84/0.85  0.82/0.81/0.81  0.89/0.85/0.87  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   97  19711->19695  0.83/0.84/0.83  0.85/0.84/0.84  0.82/0.80/0.81  0.89/0.85/0.87  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   98  19695->19682  0.82/0.84/0.83  0.85/0.84/0.84  0.82/0.80/0.81  0.89/0.85/0.87  0.96/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.91/0.93/0.92  0.90/0.90/0.90
   99  19682->19666  0.82/0.84/0.83  0.85/0.83/0.84  0.82/0.80/0.81  0.88/0.84/0.86  0.95/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.90/0.93/0.92  0.90/0.90/0.90
  100  19666->19650  0.82/0.84/0.83  0.85/0.83/0.84  0.82/0.80/0.81  0.88/0.84/0.86  0.95/0.96/0.96  0.98/1.00/0.99  1.00/1.00/1.00  0.90/0.93/0.92  0.89/0.90/0.89
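The log is consistent with a standard unmasking loop: train a linear classifier, score the held-out data, delete the most strongly weighted features, and retrain on what remains. The hyperparameters {'C': 0.01, 'loss': 'squared_hinge'} at the top of the run match scikit-learn's LinearSVC. Below is a minimal sketch under those assumptions; the function unmask(), the per_class=2 removal budget (inferred only from the per-round feature drops of 10-16, i.e. at most 2 x 8 classes), and the synthetic smoke test are illustrative, not the original code.

import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import classification_report

def unmask(X_train, y_train, X_test, y_test, rounds=100, per_class=2):
    """Iteratively retrain, report, and delete the top-weighted features."""
    for r in range(1, rounds + 1):
        print("STARTING UNMASKING ROUND", r)

        # Hyperparameters as logged at the top of the run.
        clf = LinearSVC(C=0.01, loss="squared_hinge")
        clf.fit(X_train, y_train)
        print(classification_report(y_test, clf.predict(X_test)))

        print("Reducing feature vectors.")
        # Indices of the per_class highest-|weight| features per class;
        # np.unique merges duplicates across classes, which would explain
        # why the logged per-round drop varies between 10 and 16 rather
        # than always being per_class * n_classes.
        top = np.unique(np.argsort(np.abs(clf.coef_), axis=1)[:, -per_class:])
        keep = np.setdiff1d(np.arange(X_train.shape[1]), top)

        # Reproduces the shape bookkeeping in the log:
        # (train before) (test before) (train after) (test after)
        print(X_train.shape, X_test.shape, end=" ")
        X_train, X_test = X_train[:, keep], X_test[:, keep]
        print(X_train.shape, X_test.shape)

if __name__ == "__main__":
    # Tiny smoke test on random data (the real run used 156693 train /
    # 35240 test documents over 21170 CxG features).
    rng = np.random.default_rng(0)
    X = rng.random((400, 50))
    y = rng.integers(0, 4, size=400)
    unmask(X[:300], y[:300], X[300:], y[300:], rounds=3)

Under this reading, the slow decline of the averaged scores (0.96 in round 1 to 0.89-0.90 in round 100, with roughly 1500 of 21170 features removed) indicates that the class signal is spread across many constructions rather than carried by a handful of top features.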