Starting cxg2 and cc (21000, 21747) 21000 {'C': 0.01, 'loss': 'squared_hinge'} STARTING UNMASKING ROUND 1 fra cc cxg2 1 precision recall f1-score support be 0.94 0.86 0.90 1956 bf 0.98 0.98 0.98 2240 ch 0.92 0.93 0.93 5000 cm 1.00 1.00 1.00 1996 dz 0.99 0.99 0.99 2929 fr 0.92 0.95 0.93 5000 gd 0.94 0.92 0.93 1893 lu 0.97 0.96 0.96 5000 nc 0.96 0.95 0.95 3046 pf 0.97 0.97 0.97 5000 re 0.94 0.95 0.95 5000 sn 0.98 0.98 0.98 5000 tn 0.98 0.97 0.98 2502 avg / total 0.96 0.96 0.96 46562 Reducing feature vectors. (211402, 21747) (46562, 21747) (211402, 21730) (46562, 21730) STARTING UNMASKING ROUND 2 fra cc cxg2 2 precision recall f1-score support be 0.93 0.85 0.89 1956 bf 0.98 0.98 0.98 2240 ch 0.92 0.93 0.92 5000 cm 1.00 1.00 1.00 1996 dz 0.99 0.99 0.99 2929 fr 0.91 0.95 0.93 5000 gd 0.94 0.91 0.92 1893 lu 0.96 0.96 0.96 5000 nc 0.96 0.94 0.95 3046 pf 0.97 0.97 0.97 5000 re 0.93 0.95 0.94 5000 sn 0.98 0.98 0.98 5000 tn 0.98 0.97 0.97 2502 avg / total 0.95 0.95 0.95 46562 Reducing feature vectors. (211402, 21730) (46562, 21730) (211402, 21711) (46562, 21711) STARTING UNMASKING ROUND 3 fra cc cxg2 3 precision recall f1-score support be 0.93 0.83 0.88 1956 bf 0.97 0.97 0.97 2240 ch 0.91 0.93 0.92 5000 cm 1.00 1.00 1.00 1996 dz 0.98 0.98 0.98 2929 fr 0.91 0.94 0.93 5000 gd 0.93 0.91 0.92 1893 lu 0.96 0.95 0.96 5000 nc 0.95 0.94 0.95 3046 pf 0.96 0.96 0.96 5000 re 0.93 0.94 0.94 5000 sn 0.98 0.97 0.97 5000 tn 0.97 0.96 0.96 2502 avg / total 0.95 0.95 0.95 46562 Reducing feature vectors. (211402, 21711) (46562, 21711) (211402, 21690) (46562, 21690) STARTING UNMASKING ROUND 4 fra cc cxg2 4 precision recall f1-score support be 0.93 0.83 0.87 1956 bf 0.97 0.97 0.97 2240 ch 0.91 0.92 0.92 5000 cm 1.00 1.00 1.00 1996 dz 0.98 0.98 0.98 2929 fr 0.91 0.94 0.93 5000 gd 0.93 0.90 0.92 1893 lu 0.96 0.95 0.96 5000 nc 0.95 0.94 0.95 3046 pf 0.96 0.96 0.96 5000 re 0.93 0.94 0.93 5000 sn 0.97 0.97 0.97 5000 tn 0.96 0.95 0.96 2502 avg / total 0.95 0.95 0.95 46562 Reducing feature vectors. 
(211402, 21690) (46562, 21690) (211402, 21668) (46562, 21668) STARTING UNMASKING ROUND 5 fra cc cxg2 5 precision recall f1-score support be 0.92 0.82 0.87 1956 bf 0.97 0.97 0.97 2240 ch 0.90 0.92 0.91 5000 cm 1.00 1.00 1.00 1996 dz 0.98 0.98 0.98 2929 fr 0.91 0.94 0.92 5000 gd 0.93 0.90 0.92 1893 lu 0.96 0.95 0.96 5000 nc 0.95 0.93 0.94 3046 pf 0.95 0.96 0.96 5000 re 0.92 0.93 0.93 5000 sn 0.96 0.97 0.97 5000 tn 0.97 0.95 0.96 2502 avg / total 0.94 0.94 0.94 46562 Reducing feature vectors. (211402, 21668) (46562, 21668) (211402, 21643) (46562, 21643) STARTING UNMASKING ROUND 6 fra cc cxg2 6 precision recall f1-score support be 0.92 0.82 0.86 1956 bf 0.97 0.97 0.97 2240 ch 0.90 0.92 0.91 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.90 0.94 0.92 5000 gd 0.93 0.90 0.92 1893 lu 0.96 0.95 0.95 5000 nc 0.95 0.93 0.94 3046 pf 0.95 0.96 0.95 5000 re 0.92 0.93 0.92 5000 sn 0.96 0.96 0.96 5000 tn 0.96 0.95 0.95 2502 avg / total 0.94 0.94 0.94 46562 Reducing feature vectors. (211402, 21643) (46562, 21643) (211402, 21619) (46562, 21619) STARTING UNMASKING ROUND 7 fra cc cxg2 7 precision recall f1-score support be 0.91 0.81 0.86 1956 bf 0.97 0.96 0.97 2240 ch 0.89 0.92 0.90 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.90 0.94 0.92 5000 gd 0.94 0.90 0.92 1893 lu 0.95 0.95 0.95 5000 nc 0.95 0.93 0.94 3046 pf 0.95 0.96 0.95 5000 re 0.92 0.93 0.92 5000 sn 0.96 0.96 0.96 5000 tn 0.96 0.95 0.95 2502 avg / total 0.94 0.94 0.94 46562 Reducing feature vectors. (211402, 21619) (46562, 21619) (211402, 21598) (46562, 21598) STARTING UNMASKING ROUND 8 fra cc cxg2 8 precision recall f1-score support be 0.91 0.81 0.85 1956 bf 0.97 0.97 0.97 2240 ch 0.89 0.91 0.90 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.90 0.93 0.92 5000 gd 0.94 0.90 0.92 1893 lu 0.95 0.95 0.95 5000 nc 0.94 0.92 0.93 3046 pf 0.95 0.96 0.95 5000 re 0.91 0.93 0.92 5000 sn 0.96 0.96 0.96 5000 tn 0.96 0.95 0.96 2502 avg / total 0.94 0.94 0.94 46562 Reducing feature vectors. 
(211402, 21598) (46562, 21598) (211402, 21577) (46562, 21577) STARTING UNMASKING ROUND 9 fra cc cxg2 9 precision recall f1-score support be 0.90 0.80 0.85 1956 bf 0.97 0.97 0.97 2240 ch 0.88 0.91 0.90 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.90 0.93 0.92 5000 gd 0.93 0.89 0.91 1893 lu 0.95 0.95 0.95 5000 nc 0.94 0.92 0.93 3046 pf 0.95 0.95 0.95 5000 re 0.91 0.93 0.92 5000 sn 0.96 0.96 0.96 5000 tn 0.96 0.95 0.95 2502 avg / total 0.94 0.94 0.94 46562 Reducing feature vectors. (211402, 21577) (46562, 21577) (211402, 21552) (46562, 21552) STARTING UNMASKING ROUND 10 fra cc cxg2 10 precision recall f1-score support be 0.91 0.79 0.85 1956 bf 0.97 0.97 0.97 2240 ch 0.88 0.91 0.89 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.90 0.93 0.91 5000 gd 0.92 0.89 0.91 1893 lu 0.95 0.95 0.95 5000 nc 0.94 0.92 0.93 3046 pf 0.95 0.95 0.95 5000 re 0.90 0.92 0.91 5000 sn 0.96 0.96 0.96 5000 tn 0.96 0.95 0.95 2502 avg / total 0.93 0.93 0.93 46562 Reducing feature vectors. (211402, 21552) (46562, 21552) (211402, 21526) (46562, 21526) STARTING UNMASKING ROUND 11 fra cc cxg2 11 precision recall f1-score support be 0.90 0.79 0.84 1956 bf 0.96 0.96 0.96 2240 ch 0.88 0.91 0.89 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.90 0.93 0.91 5000 gd 0.93 0.89 0.91 1893 lu 0.95 0.95 0.95 5000 nc 0.94 0.92 0.93 3046 pf 0.95 0.95 0.95 5000 re 0.90 0.92 0.91 5000 sn 0.96 0.96 0.96 5000 tn 0.96 0.95 0.95 2502 avg / total 0.93 0.93 0.93 46562 Reducing feature vectors. (211402, 21526) (46562, 21526) (211402, 21502) (46562, 21502) STARTING UNMASKING ROUND 12 fra cc cxg2 12 precision recall f1-score support be 0.90 0.79 0.84 1956 bf 0.97 0.96 0.96 2240 ch 0.87 0.91 0.89 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.90 0.93 0.91 5000 gd 0.93 0.89 0.91 1893 lu 0.95 0.95 0.95 5000 nc 0.94 0.92 0.93 3046 pf 0.95 0.95 0.95 5000 re 0.90 0.92 0.91 5000 sn 0.96 0.96 0.96 5000 tn 0.96 0.95 0.95 2502 avg / total 0.93 0.93 0.93 46562 Reducing feature vectors. 
(211402, 21502) (46562, 21502) (211402, 21477) (46562, 21477) STARTING UNMASKING ROUND 13 fra cc cxg2 13 precision recall f1-score support be 0.90 0.78 0.84 1956 bf 0.97 0.96 0.96 2240 ch 0.87 0.90 0.89 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.90 0.93 0.91 5000 gd 0.92 0.89 0.90 1893 lu 0.95 0.94 0.95 5000 nc 0.94 0.92 0.93 3046 pf 0.95 0.95 0.95 5000 re 0.90 0.92 0.91 5000 sn 0.96 0.96 0.96 5000 tn 0.96 0.94 0.95 2502 avg / total 0.93 0.93 0.93 46562 Reducing feature vectors. (211402, 21477) (46562, 21477) (211402, 21452) (46562, 21452) STARTING UNMASKING ROUND 14 fra cc cxg2 14 precision recall f1-score support be 0.89 0.78 0.83 1956 bf 0.97 0.96 0.96 2240 ch 0.87 0.90 0.88 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.90 0.93 0.91 5000 gd 0.92 0.88 0.90 1893 lu 0.95 0.94 0.95 5000 nc 0.94 0.91 0.92 3046 pf 0.94 0.95 0.95 5000 re 0.90 0.91 0.90 5000 sn 0.96 0.96 0.96 5000 tn 0.96 0.94 0.95 2502 avg / total 0.93 0.93 0.93 46562 Reducing feature vectors. (211402, 21452) (46562, 21452) (211402, 21428) (46562, 21428) STARTING UNMASKING ROUND 15 fra cc cxg2 15 precision recall f1-score support be 0.89 0.77 0.83 1956 bf 0.96 0.96 0.96 2240 ch 0.86 0.90 0.88 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.89 0.93 0.91 5000 gd 0.92 0.88 0.90 1893 lu 0.94 0.94 0.94 5000 nc 0.94 0.91 0.92 3046 pf 0.94 0.95 0.94 5000 re 0.90 0.91 0.90 5000 sn 0.95 0.96 0.96 5000 tn 0.96 0.93 0.95 2502 avg / total 0.93 0.93 0.93 46562 Reducing feature vectors. (211402, 21428) (46562, 21428) (211402, 21403) (46562, 21403) STARTING UNMASKING ROUND 16 fra cc cxg2 16 precision recall f1-score support be 0.88 0.76 0.82 1956 bf 0.96 0.96 0.96 2240 ch 0.86 0.90 0.88 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.89 0.93 0.91 5000 gd 0.92 0.88 0.90 1893 lu 0.95 0.94 0.94 5000 nc 0.94 0.91 0.92 3046 pf 0.94 0.94 0.94 5000 re 0.89 0.91 0.90 5000 sn 0.95 0.96 0.96 5000 tn 0.96 0.94 0.95 2502 avg / total 0.93 0.93 0.93 46562 Reducing feature vectors. 
(211402, 21403) (46562, 21403) (211402, 21377) (46562, 21377) STARTING UNMASKING ROUND 17 fra cc cxg2 17 precision recall f1-score support be 0.88 0.76 0.82 1956 bf 0.96 0.96 0.96 2240 ch 0.86 0.90 0.88 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.89 0.92 0.91 5000 gd 0.92 0.87 0.90 1893 lu 0.94 0.94 0.94 5000 nc 0.94 0.91 0.92 3046 pf 0.94 0.94 0.94 5000 re 0.89 0.91 0.90 5000 sn 0.95 0.96 0.96 5000 tn 0.96 0.94 0.95 2502 avg / total 0.93 0.93 0.92 46562 Reducing feature vectors. (211402, 21377) (46562, 21377) (211402, 21354) (46562, 21354) STARTING UNMASKING ROUND 18 fra cc cxg2 18 precision recall f1-score support be 0.88 0.76 0.82 1956 bf 0.96 0.96 0.96 2240 ch 0.86 0.90 0.88 5000 cm 1.00 1.00 1.00 1996 dz 0.96 0.97 0.97 2929 fr 0.89 0.93 0.91 5000 gd 0.91 0.87 0.89 1893 lu 0.94 0.94 0.94 5000 nc 0.94 0.91 0.92 3046 pf 0.94 0.94 0.94 5000 re 0.89 0.90 0.90 5000 sn 0.95 0.95 0.95 5000 tn 0.96 0.93 0.94 2502 avg / total 0.92 0.92 0.92 46562 Reducing feature vectors. (211402, 21354) (46562, 21354) (211402, 21330) (46562, 21330) STARTING UNMASKING ROUND 19 fra cc cxg2 19 precision recall f1-score support be 0.88 0.75 0.81 1956 bf 0.96 0.96 0.96 2240 ch 0.86 0.90 0.88 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.89 0.92 0.90 5000 gd 0.92 0.87 0.89 1893 lu 0.94 0.94 0.94 5000 nc 0.93 0.91 0.92 3046 pf 0.94 0.94 0.94 5000 re 0.89 0.90 0.90 5000 sn 0.95 0.96 0.95 5000 tn 0.96 0.93 0.94 2502 avg / total 0.92 0.92 0.92 46562 Reducing feature vectors. (211402, 21330) (46562, 21330) (211402, 21307) (46562, 21307) STARTING UNMASKING ROUND 20 fra cc cxg2 20 precision recall f1-score support be 0.88 0.75 0.81 1956 bf 0.96 0.96 0.96 2240 ch 0.86 0.89 0.87 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.96 0.97 2929 fr 0.88 0.93 0.90 5000 gd 0.92 0.87 0.89 1893 lu 0.94 0.94 0.94 5000 nc 0.94 0.91 0.92 3046 pf 0.93 0.93 0.93 5000 re 0.89 0.90 0.89 5000 sn 0.95 0.95 0.95 5000 tn 0.96 0.93 0.94 2502 avg / total 0.92 0.92 0.92 46562 Reducing feature vectors. 
(211402, 21307) (46562, 21307) (211402, 21281) (46562, 21281) STARTING UNMASKING ROUND 21 fra cc cxg2 21 precision recall f1-score support be 0.88 0.74 0.80 1956 bf 0.96 0.96 0.96 2240 ch 0.86 0.89 0.87 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.96 0.97 2929 fr 0.88 0.92 0.90 5000 gd 0.91 0.86 0.89 1893 lu 0.94 0.94 0.94 5000 nc 0.93 0.91 0.92 3046 pf 0.93 0.94 0.93 5000 re 0.88 0.89 0.89 5000 sn 0.95 0.95 0.95 5000 tn 0.96 0.93 0.94 2502 avg / total 0.92 0.92 0.92 46562 Reducing feature vectors. (211402, 21281) (46562, 21281) (211402, 21255) (46562, 21255) STARTING UNMASKING ROUND 22 fra cc cxg2 22 precision recall f1-score support be 0.87 0.74 0.80 1956 bf 0.96 0.95 0.96 2240 ch 0.85 0.89 0.87 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.96 0.96 2929 fr 0.88 0.92 0.90 5000 gd 0.91 0.86 0.88 1893 lu 0.94 0.94 0.94 5000 nc 0.93 0.90 0.92 3046 pf 0.93 0.93 0.93 5000 re 0.88 0.89 0.89 5000 sn 0.94 0.95 0.95 5000 tn 0.96 0.93 0.94 2502 avg / total 0.92 0.92 0.92 46562 Reducing feature vectors. (211402, 21255) (46562, 21255) (211402, 21229) (46562, 21229) STARTING UNMASKING ROUND 23 fra cc cxg2 23 precision recall f1-score support be 0.87 0.74 0.80 1956 bf 0.96 0.95 0.96 2240 ch 0.85 0.89 0.87 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.97 0.97 2929 fr 0.88 0.92 0.90 5000 gd 0.91 0.86 0.89 1893 lu 0.94 0.94 0.94 5000 nc 0.93 0.90 0.92 3046 pf 0.93 0.93 0.93 5000 re 0.88 0.89 0.89 5000 sn 0.94 0.95 0.95 5000 tn 0.95 0.92 0.94 2502 avg / total 0.92 0.92 0.92 46562 Reducing feature vectors. (211402, 21229) (46562, 21229) (211402, 21204) (46562, 21204) STARTING UNMASKING ROUND 24 fra cc cxg2 24 precision recall f1-score support be 0.87 0.74 0.80 1956 bf 0.96 0.95 0.96 2240 ch 0.85 0.89 0.87 5000 cm 1.00 1.00 1.00 1996 dz 0.97 0.96 0.96 2929 fr 0.88 0.92 0.90 5000 gd 0.92 0.86 0.89 1893 lu 0.94 0.94 0.94 5000 nc 0.93 0.90 0.91 3046 pf 0.93 0.93 0.93 5000 re 0.88 0.89 0.88 5000 sn 0.94 0.95 0.95 5000 tn 0.95 0.92 0.94 2502 avg / total 0.92 0.92 0.92 46562 Reducing feature vectors. 
(211402, 21204) (46562, 21204) (211402, 21179) (46562, 21179) STARTING UNMASKING ROUND 25 fra cc cxg2 25 precision recall f1-score support be 0.86 0.73 0.79 1956 bf 0.96 0.95 0.96 2240 ch 0.84 0.88 0.86 5000 cm 0.99 1.00 1.00 1996 dz 0.97 0.96 0.97 2929 fr 0.88 0.92 0.90 5000 gd 0.91 0.86 0.89 1893 lu 0.94 0.94 0.94 5000 nc 0.93 0.90 0.91 3046 pf 0.92 0.93 0.93 5000 re 0.88 0.89 0.88 5000 sn 0.94 0.95 0.95 5000 tn 0.95 0.92 0.94 2502 avg / total 0.92 0.91 0.91 46562 Reducing feature vectors. (211402, 21179) (46562, 21179) (211402, 21155) (46562, 21155) STARTING UNMASKING ROUND 26 fra cc cxg2 26 precision recall f1-score support be 0.87 0.73 0.79 1956 bf 0.96 0.95 0.96 2240 ch 0.84 0.88 0.86 5000 cm 0.99 1.00 1.00 1996 dz 0.97 0.96 0.96 2929 fr 0.88 0.92 0.90 5000 gd 0.91 0.86 0.88 1893 lu 0.94 0.94 0.94 5000 nc 0.93 0.90 0.91 3046 pf 0.92 0.93 0.93 5000 re 0.88 0.89 0.88 5000 sn 0.94 0.95 0.95 5000 tn 0.95 0.92 0.94 2502 avg / total 0.91 0.91 0.91 46562 Reducing feature vectors. (211402, 21155) (46562, 21155) (211402, 21129) (46562, 21129) STARTING UNMASKING ROUND 27 fra cc cxg2 27 precision recall f1-score support be 0.87 0.73 0.79 1956 bf 0.96 0.95 0.96 2240 ch 0.84 0.88 0.86 5000 cm 0.99 1.00 1.00 1996 dz 0.97 0.96 0.96 2929 fr 0.88 0.92 0.90 5000 gd 0.91 0.86 0.88 1893 lu 0.94 0.94 0.94 5000 nc 0.93 0.90 0.91 3046 pf 0.92 0.93 0.92 5000 re 0.88 0.89 0.88 5000 sn 0.94 0.95 0.95 5000 tn 0.95 0.92 0.94 2502 avg / total 0.91 0.91 0.91 46562 Reducing feature vectors. (211402, 21129) (46562, 21129) (211402, 21103) (46562, 21103) STARTING UNMASKING ROUND 28 fra cc cxg2 28 precision recall f1-score support be 0.86 0.72 0.79 1956 bf 0.96 0.95 0.95 2240 ch 0.84 0.88 0.86 5000 cm 1.00 1.00 1.00 1996 dz 0.96 0.96 0.96 2929 fr 0.88 0.92 0.90 5000 gd 0.91 0.85 0.88 1893 lu 0.94 0.94 0.94 5000 nc 0.92 0.90 0.91 3046 pf 0.92 0.93 0.92 5000 re 0.87 0.89 0.88 5000 sn 0.94 0.95 0.95 5000 tn 0.95 0.92 0.93 2502 avg / total 0.91 0.91 0.91 46562 Reducing feature vectors. 
(211402, 21103) (46562, 21103) (211402, 21080) (46562, 21080) STARTING UNMASKING ROUND 29 fra cc cxg2 29 precision recall f1-score support be 0.85 0.72 0.78 1956 bf 0.95 0.95 0.95 2240 ch 0.84 0.88 0.86 5000 cm 0.99 1.00 1.00 1996 dz 0.96 0.95 0.96 2929 fr 0.87 0.92 0.89 5000 gd 0.90 0.85 0.88 1893 lu 0.93 0.94 0.94 5000 nc 0.92 0.89 0.91 3046 pf 0.92 0.93 0.92 5000 re 0.87 0.88 0.88 5000 sn 0.94 0.95 0.94 5000 tn 0.95 0.92 0.93 2502 avg / total 0.91 0.91 0.91 46562 Reducing feature vectors. (211402, 21080) (46562, 21080) (211402, 21054) (46562, 21054) STARTING UNMASKING ROUND 30 fra cc cxg2 30 precision recall f1-score support be 0.85 0.72 0.78 1956 bf 0.95 0.95 0.95 2240 ch 0.84 0.88 0.86 5000 cm 0.99 1.00 1.00 1996 dz 0.96 0.95 0.96 2929 fr 0.87 0.91 0.89 5000 gd 0.90 0.85 0.88 1893 lu 0.93 0.94 0.94 5000 nc 0.92 0.89 0.91 3046 pf 0.92 0.92 0.92 5000 re 0.87 0.88 0.88 5000 sn 0.93 0.95 0.94 5000 tn 0.95 0.92 0.93 2502 avg / total 0.91 0.91 0.91 46562 Reducing feature vectors. (211402, 21054) (46562, 21054) (211402, 21029) (46562, 21029) STARTING UNMASKING ROUND 31 fra cc cxg2 31 precision recall f1-score support be 0.86 0.72 0.78 1956 bf 0.95 0.95 0.95 2240 ch 0.84 0.88 0.86 5000 cm 1.00 1.00 1.00 1996 dz 0.96 0.95 0.96 2929 fr 0.87 0.91 0.89 5000 gd 0.90 0.85 0.88 1893 lu 0.93 0.94 0.94 5000 nc 0.92 0.89 0.91 3046 pf 0.92 0.92 0.92 5000 re 0.87 0.88 0.87 5000 sn 0.93 0.95 0.94 5000 tn 0.95 0.91 0.93 2502 avg / total 0.91 0.91 0.91 46562 Reducing feature vectors. (211402, 21029) (46562, 21029) (211402, 21003) (46562, 21003) STARTING UNMASKING ROUND 32 fra cc cxg2 32 precision recall f1-score support be 0.85 0.72 0.78 1956 bf 0.95 0.95 0.95 2240 ch 0.84 0.88 0.86 5000 cm 1.00 1.00 1.00 1996 dz 0.96 0.95 0.96 2929 fr 0.88 0.91 0.89 5000 gd 0.90 0.85 0.88 1893 lu 0.93 0.94 0.93 5000 nc 0.92 0.89 0.91 3046 pf 0.91 0.92 0.92 5000 re 0.87 0.88 0.88 5000 sn 0.93 0.95 0.94 5000 tn 0.95 0.91 0.93 2502 avg / total 0.91 0.91 0.91 46562 Reducing feature vectors. 
(211402, 21003) (46562, 21003) (211402, 20981) (46562, 20981) STARTING UNMASKING ROUND 33 fra cc cxg2 33 precision recall f1-score support be 0.85 0.72 0.78 1956 bf 0.95 0.95 0.95 2240 ch 0.83 0.87 0.85 5000 cm 1.00 1.00 1.00 1996 dz 0.96 0.95 0.96 2929 fr 0.88 0.91 0.89 5000 gd 0.91 0.85 0.88 1893 lu 0.93 0.93 0.93 5000 nc 0.92 0.89 0.91 3046 pf 0.92 0.92 0.92 5000 re 0.87 0.88 0.88 5000 sn 0.93 0.95 0.94 5000 tn 0.95 0.91 0.93 2502 avg / total 0.91 0.91 0.91 46562 Reducing feature vectors. (211402, 20981) (46562, 20981) (211402, 20955) (46562, 20955) STARTING UNMASKING ROUND 34 fra cc cxg2 34 precision recall f1-score support be 0.84 0.71 0.77 1956 bf 0.95 0.95 0.95 2240 ch 0.83 0.87 0.85 5000 cm 0.99 1.00 1.00 1996 dz 0.96 0.95 0.96 2929 fr 0.87 0.91 0.89 5000 gd 0.90 0.85 0.87 1893 lu 0.93 0.93 0.93 5000 nc 0.92 0.89 0.91 3046 pf 0.91 0.92 0.92 5000 re 0.87 0.88 0.87 5000 sn 0.93 0.95 0.94 5000 tn 0.95 0.91 0.93 2502 avg / total 0.91 0.91 0.91 46562 Reducing feature vectors. (211402, 20955) (46562, 20955) (211402, 20930) (46562, 20930) STARTING UNMASKING ROUND 35 fra cc cxg2 35 precision recall f1-score support be 0.84 0.71 0.77 1956 bf 0.95 0.94 0.95 2240 ch 0.83 0.87 0.85 5000 cm 0.99 1.00 1.00 1996 dz 0.96 0.95 0.96 2929 fr 0.87 0.91 0.89 5000 gd 0.90 0.85 0.88 1893 lu 0.93 0.93 0.93 5000 nc 0.92 0.89 0.91 3046 pf 0.91 0.92 0.92 5000 re 0.87 0.88 0.87 5000 sn 0.93 0.94 0.94 5000 tn 0.95 0.91 0.93 2502 avg / total 0.91 0.91 0.91 46562 Reducing feature vectors. (211402, 20930) (46562, 20930) (211402, 20906) (46562, 20906) STARTING UNMASKING ROUND 36 fra cc cxg2 36 precision recall f1-score support be 0.84 0.71 0.77 1956 bf 0.95 0.94 0.95 2240 ch 0.83 0.87 0.85 5000 cm 1.00 1.00 1.00 1996 dz 0.96 0.95 0.95 2929 fr 0.87 0.91 0.89 5000 gd 0.91 0.85 0.88 1893 lu 0.92 0.93 0.93 5000 nc 0.92 0.89 0.91 3046 pf 0.91 0.92 0.92 5000 re 0.87 0.88 0.87 5000 sn 0.93 0.94 0.94 5000 tn 0.95 0.91 0.93 2502 avg / total 0.91 0.90 0.90 46562 Reducing feature vectors. 
(211402, 20906) (46562, 20906) (211402, 20880) (46562, 20880) STARTING UNMASKING ROUND 37 fra cc cxg2 37 precision recall f1-score support be 0.84 0.70 0.76 1956 bf 0.95 0.94 0.95 2240 ch 0.83 0.87 0.85 5000 cm 0.99 1.00 1.00 1996 dz 0.96 0.95 0.95 2929 fr 0.87 0.91 0.89 5000 gd 0.91 0.85 0.88 1893 lu 0.92 0.93 0.93 5000 nc 0.92 0.89 0.91 3046 pf 0.91 0.92 0.92 5000 re 0.87 0.88 0.87 5000 sn 0.93 0.95 0.94 5000 tn 0.95 0.91 0.93 2502 avg / total 0.90 0.90 0.90 46562 Reducing feature vectors. (211402, 20880) (46562, 20880) (211402, 20855) (46562, 20855) STARTING UNMASKING ROUND 38 fra cc cxg2 38 precision recall f1-score support be 0.85 0.70 0.76 1956 bf 0.95 0.94 0.94 2240 ch 0.83 0.87 0.85 5000 cm 1.00 1.00 1.00 1996 dz 0.96 0.95 0.95 2929 fr 0.87 0.91 0.89 5000 gd 0.91 0.84 0.87 1893 lu 0.92 0.93 0.92 5000 nc 0.92 0.89 0.91 3046 pf 0.91 0.92 0.92 5000 re 0.87 0.87 0.87 5000 sn 0.93 0.94 0.94 5000 tn 0.95 0.91 0.93 2502 avg / total 0.90 0.90 0.90 46562 Reducing feature vectors. (211402, 20855) (46562, 20855) (211402, 20830) (46562, 20830) STARTING UNMASKING ROUND 39 fra cc cxg2 39 precision recall f1-score support be 0.84 0.69 0.76 1956 bf 0.95 0.94 0.94 2240 ch 0.82 0.86 0.84 5000 cm 0.99 1.00 1.00 1996 dz 0.96 0.95 0.95 2929 fr 0.87 0.91 0.89 5000 gd 0.90 0.84 0.87 1893 lu 0.92 0.93 0.92 5000 nc 0.92 0.89 0.91 3046 pf 0.91 0.92 0.91 5000 re 0.87 0.87 0.87 5000 sn 0.93 0.94 0.93 5000 tn 0.95 0.91 0.93 2502 avg / total 0.90 0.90 0.90 46562 Reducing feature vectors. (211402, 20830) (46562, 20830) (211402, 20805) (46562, 20805) STARTING UNMASKING ROUND 40 fra cc cxg2 40 precision recall f1-score support be 0.84 0.69 0.76 1956 bf 0.95 0.94 0.94 2240 ch 0.82 0.86 0.84 5000 cm 0.99 1.00 1.00 1996 dz 0.96 0.95 0.95 2929 fr 0.87 0.91 0.89 5000 gd 0.90 0.84 0.87 1893 lu 0.92 0.93 0.92 5000 nc 0.92 0.89 0.91 3046 pf 0.91 0.92 0.91 5000 re 0.87 0.87 0.87 5000 sn 0.93 0.94 0.93 5000 tn 0.95 0.91 0.93 2502 avg / total 0.90 0.90 0.90 46562 Reducing feature vectors. 
(211402, 20805) (46562, 20805) (211402, 20780) (46562, 20780) STARTING UNMASKING ROUND 41 fra cc cxg2 41 precision recall f1-score support be 0.84 0.69 0.76 1956 bf 0.95 0.94 0.94 2240 ch 0.83 0.86 0.84 5000 cm 0.99 1.00 1.00 1996 dz 0.96 0.95 0.95 2929 fr 0.87 0.91 0.89 5000 gd 0.90 0.84 0.87 1893 lu 0.92 0.93 0.92 5000 nc 0.92 0.89 0.90 3046 pf 0.91 0.92 0.91 5000 re 0.87 0.87 0.87 5000 sn 0.93 0.94 0.93 5000 tn 0.94 0.91 0.93 2502 avg / total 0.90 0.90 0.90 46562 Reducing feature vectors. (211402, 20780) (46562, 20780) (211402, 20755) (46562, 20755) STARTING UNMASKING ROUND 42 fra cc cxg2 42 precision recall f1-score support be 0.84 0.69 0.76 1956 bf 0.95 0.94 0.94 2240 ch 0.82 0.86 0.84 5000 cm 0.99 1.00 1.00 1996 dz 0.95 0.95 0.95 2929 fr 0.86 0.91 0.89 5000 gd 0.90 0.84 0.87 1893 lu 0.92 0.93 0.92 5000 nc 0.92 0.89 0.90 3046 pf 0.91 0.92 0.91 5000 re 0.86 0.87 0.87 5000 sn 0.93 0.94 0.93 5000 tn 0.95 0.91 0.93 2502 avg / total 0.90 0.90 0.90 46562 Reducing feature vectors. (211402, 20755) (46562, 20755) (211402, 20732) (46562, 20732) STARTING UNMASKING ROUND 43 fra cc cxg2 43 precision recall f1-score support be 0.84 0.69 0.76 1956 bf 0.95 0.94 0.94 2240 ch 0.82 0.86 0.84 5000 cm 0.99 1.00 1.00 1996 dz 0.96 0.95 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.90 0.84 0.87 1893 lu 0.91 0.92 0.92 5000 nc 0.92 0.89 0.90 3046 pf 0.90 0.92 0.91 5000 re 0.86 0.87 0.86 5000 sn 0.92 0.94 0.93 5000 tn 0.95 0.91 0.93 2502 avg / total 0.90 0.90 0.90 46562 Reducing feature vectors. (211402, 20732) (46562, 20732) (211402, 20706) (46562, 20706) STARTING UNMASKING ROUND 44 fra cc cxg2 44 precision recall f1-score support be 0.84 0.69 0.76 1956 bf 0.95 0.94 0.94 2240 ch 0.82 0.86 0.84 5000 cm 0.99 1.00 1.00 1996 dz 0.96 0.95 0.95 2929 fr 0.86 0.91 0.88 5000 gd 0.90 0.84 0.87 1893 lu 0.91 0.92 0.92 5000 nc 0.92 0.89 0.90 3046 pf 0.91 0.92 0.91 5000 re 0.86 0.87 0.86 5000 sn 0.92 0.94 0.93 5000 tn 0.95 0.91 0.93 2502 avg / total 0.90 0.90 0.90 46562 Reducing feature vectors. 
(211402, 20706) (46562, 20706) (211402, 20680) (46562, 20680) STARTING UNMASKING ROUND 45 fra cc cxg2 45 precision recall f1-score support be 0.84 0.69 0.76 1956 bf 0.95 0.94 0.94 2240 ch 0.82 0.86 0.84 5000 cm 0.99 1.00 1.00 1996 dz 0.96 0.95 0.95 2929 fr 0.86 0.91 0.88 5000 gd 0.90 0.84 0.87 1893 lu 0.91 0.92 0.92 5000 nc 0.92 0.89 0.90 3046 pf 0.91 0.92 0.91 5000 re 0.86 0.87 0.86 5000 sn 0.92 0.94 0.93 5000 tn 0.95 0.91 0.93 2502 avg / total 0.90 0.90 0.90 46562 Reducing feature vectors. (211402, 20680) (46562, 20680) (211402, 20657) (46562, 20657) STARTING UNMASKING ROUND 46 fra cc cxg2 46 precision recall f1-score support be 0.83 0.69 0.75 1956 bf 0.95 0.94 0.94 2240 ch 0.82 0.86 0.84 5000 cm 0.99 1.00 1.00 1996 dz 0.95 0.95 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.90 0.84 0.87 1893 lu 0.91 0.92 0.92 5000 nc 0.91 0.89 0.90 3046 pf 0.90 0.91 0.91 5000 re 0.86 0.86 0.86 5000 sn 0.93 0.94 0.93 5000 tn 0.95 0.90 0.92 2502 avg / total 0.90 0.90 0.90 46562 Reducing feature vectors. (211402, 20657) (46562, 20657) (211402, 20635) (46562, 20635) STARTING UNMASKING ROUND 47 fra cc cxg2 47 precision recall f1-score support be 0.83 0.68 0.75 1956 bf 0.95 0.94 0.94 2240 ch 0.82 0.86 0.84 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.95 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.90 0.84 0.87 1893 lu 0.91 0.92 0.92 5000 nc 0.91 0.88 0.90 3046 pf 0.90 0.91 0.90 5000 re 0.85 0.86 0.85 5000 sn 0.92 0.94 0.93 5000 tn 0.94 0.90 0.92 2502 avg / total 0.89 0.89 0.89 46562 Reducing feature vectors. (211402, 20635) (46562, 20635) (211402, 20609) (46562, 20609) STARTING UNMASKING ROUND 48 fra cc cxg2 48 precision recall f1-score support be 0.84 0.68 0.75 1956 bf 0.95 0.94 0.94 2240 ch 0.82 0.85 0.84 5000 cm 0.99 1.00 1.00 1996 dz 0.95 0.95 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.90 0.84 0.87 1893 lu 0.91 0.92 0.92 5000 nc 0.91 0.88 0.90 3046 pf 0.89 0.90 0.90 5000 re 0.85 0.86 0.85 5000 sn 0.92 0.94 0.93 5000 tn 0.94 0.90 0.92 2502 avg / total 0.89 0.89 0.89 46562 Reducing feature vectors. 
(211402, 20609) (46562, 20609) (211402, 20583) (46562, 20583) STARTING UNMASKING ROUND 49 fra cc cxg2 49 precision recall f1-score support be 0.84 0.68 0.75 1956 bf 0.95 0.93 0.94 2240 ch 0.82 0.86 0.84 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.95 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.89 0.84 0.87 1893 lu 0.91 0.92 0.92 5000 nc 0.91 0.88 0.90 3046 pf 0.89 0.90 0.90 5000 re 0.85 0.86 0.85 5000 sn 0.92 0.94 0.93 5000 tn 0.94 0.90 0.92 2502 avg / total 0.89 0.89 0.89 46562 Reducing feature vectors. (211402, 20583) (46562, 20583) (211402, 20558) (46562, 20558) STARTING UNMASKING ROUND 50 fra cc cxg2 50 precision recall f1-score support be 0.84 0.68 0.75 1956 bf 0.95 0.93 0.94 2240 ch 0.82 0.86 0.84 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.94 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.89 0.84 0.87 1893 lu 0.91 0.92 0.92 5000 nc 0.91 0.88 0.89 3046 pf 0.89 0.90 0.90 5000 re 0.84 0.86 0.85 5000 sn 0.92 0.94 0.93 5000 tn 0.94 0.90 0.92 2502 avg / total 0.89 0.89 0.89 46562 Reducing feature vectors. (211402, 20558) (46562, 20558) (211402, 20533) (46562, 20533) STARTING UNMASKING ROUND 51 fra cc cxg2 51 precision recall f1-score support be 0.83 0.67 0.74 1956 bf 0.95 0.93 0.94 2240 ch 0.82 0.85 0.83 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.94 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.89 0.84 0.86 1893 lu 0.91 0.92 0.91 5000 nc 0.91 0.88 0.89 3046 pf 0.89 0.90 0.90 5000 re 0.84 0.85 0.85 5000 sn 0.92 0.94 0.93 5000 tn 0.94 0.90 0.92 2502 avg / total 0.89 0.89 0.89 46562 Reducing feature vectors. (211402, 20533) (46562, 20533) (211402, 20510) (46562, 20510) STARTING UNMASKING ROUND 52 fra cc cxg2 52 precision recall f1-score support be 0.83 0.67 0.74 1956 bf 0.95 0.93 0.94 2240 ch 0.81 0.85 0.83 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.94 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.89 0.84 0.86 1893 lu 0.90 0.92 0.91 5000 nc 0.91 0.88 0.89 3046 pf 0.89 0.90 0.90 5000 re 0.84 0.85 0.85 5000 sn 0.92 0.93 0.93 5000 tn 0.94 0.90 0.92 2502 avg / total 0.89 0.89 0.89 46562 Reducing feature vectors. 
(211402, 20510) (46562, 20510) (211402, 20485) (46562, 20485) STARTING UNMASKING ROUND 53 fra cc cxg2 53 precision recall f1-score support be 0.83 0.67 0.74 1956 bf 0.94 0.93 0.94 2240 ch 0.81 0.85 0.83 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.94 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.89 0.84 0.86 1893 lu 0.90 0.91 0.91 5000 nc 0.91 0.88 0.89 3046 pf 0.89 0.90 0.90 5000 re 0.84 0.85 0.85 5000 sn 0.92 0.93 0.93 5000 tn 0.94 0.90 0.92 2502 avg / total 0.89 0.89 0.89 46562 Reducing feature vectors. (211402, 20485) (46562, 20485) (211402, 20459) (46562, 20459) STARTING UNMASKING ROUND 54 fra cc cxg2 54 precision recall f1-score support be 0.83 0.67 0.74 1956 bf 0.95 0.93 0.94 2240 ch 0.81 0.85 0.83 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.94 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.89 0.84 0.86 1893 lu 0.90 0.91 0.91 5000 nc 0.91 0.88 0.89 3046 pf 0.89 0.90 0.90 5000 re 0.84 0.85 0.85 5000 sn 0.92 0.93 0.93 5000 tn 0.94 0.90 0.92 2502 avg / total 0.89 0.89 0.89 46562 Reducing feature vectors. (211402, 20459) (46562, 20459) (211402, 20436) (46562, 20436) STARTING UNMASKING ROUND 55 fra cc cxg2 55 precision recall f1-score support be 0.82 0.67 0.74 1956 bf 0.95 0.93 0.94 2240 ch 0.81 0.85 0.83 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.94 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.89 0.84 0.86 1893 lu 0.90 0.91 0.91 5000 nc 0.91 0.87 0.89 3046 pf 0.89 0.90 0.89 5000 re 0.84 0.85 0.85 5000 sn 0.92 0.93 0.93 5000 tn 0.94 0.90 0.92 2502 avg / total 0.89 0.89 0.89 46562 Reducing feature vectors. (211402, 20436) (46562, 20436) (211402, 20411) (46562, 20411) STARTING UNMASKING ROUND 56 fra cc cxg2 56 precision recall f1-score support be 0.82 0.67 0.74 1956 bf 0.95 0.93 0.94 2240 ch 0.81 0.85 0.83 5000 cm 0.99 0.99 0.99 1996 dz 0.95 0.94 0.95 2929 fr 0.86 0.90 0.88 5000 gd 0.88 0.83 0.86 1893 lu 0.90 0.91 0.90 5000 nc 0.91 0.87 0.89 3046 pf 0.89 0.90 0.89 5000 re 0.84 0.85 0.84 5000 sn 0.92 0.93 0.92 5000 tn 0.94 0.89 0.91 2502 avg / total 0.89 0.89 0.89 46562 Reducing feature vectors. 
(211402, 20411) (46562, 20411) (211402, 20387) (46562, 20387) STARTING UNMASKING ROUND 57 fra cc cxg2 57 precision recall f1-score support be 0.82 0.66 0.73 1956 bf 0.95 0.93 0.94 2240 ch 0.81 0.84 0.83 5000 cm 0.99 0.99 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.86 0.90 0.88 5000 gd 0.89 0.83 0.86 1893 lu 0.89 0.91 0.90 5000 nc 0.91 0.87 0.89 3046 pf 0.89 0.90 0.89 5000 re 0.84 0.85 0.84 5000 sn 0.92 0.93 0.92 5000 tn 0.93 0.89 0.91 2502 avg / total 0.89 0.89 0.89 46562 Reducing feature vectors. (211402, 20387) (46562, 20387) (211402, 20361) (46562, 20361) STARTING UNMASKING ROUND 58 fra cc cxg2 58 precision recall f1-score support be 0.82 0.66 0.73 1956 bf 0.94 0.93 0.94 2240 ch 0.81 0.84 0.83 5000 cm 0.99 0.99 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.85 0.90 0.87 5000 gd 0.88 0.83 0.86 1893 lu 0.89 0.91 0.90 5000 nc 0.91 0.87 0.89 3046 pf 0.89 0.89 0.89 5000 re 0.84 0.85 0.84 5000 sn 0.92 0.93 0.92 5000 tn 0.93 0.89 0.91 2502 avg / total 0.89 0.88 0.88 46562 Reducing feature vectors. (211402, 20361) (46562, 20361) (211402, 20337) (46562, 20337) STARTING UNMASKING ROUND 59 fra cc cxg2 59 precision recall f1-score support be 0.82 0.66 0.73 1956 bf 0.94 0.92 0.93 2240 ch 0.81 0.84 0.82 5000 cm 0.99 0.99 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.85 0.90 0.87 5000 gd 0.88 0.83 0.86 1893 lu 0.89 0.91 0.90 5000 nc 0.91 0.87 0.89 3046 pf 0.89 0.90 0.89 5000 re 0.83 0.85 0.84 5000 sn 0.92 0.93 0.92 5000 tn 0.93 0.89 0.91 2502 avg / total 0.89 0.88 0.88 46562 Reducing feature vectors. (211402, 20337) (46562, 20337) (211402, 20311) (46562, 20311) STARTING UNMASKING ROUND 60 fra cc cxg2 60 precision recall f1-score support be 0.82 0.65 0.73 1956 bf 0.94 0.92 0.93 2240 ch 0.80 0.84 0.82 5000 cm 0.99 0.99 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.85 0.89 0.87 5000 gd 0.88 0.83 0.86 1893 lu 0.89 0.91 0.90 5000 nc 0.91 0.87 0.89 3046 pf 0.89 0.89 0.89 5000 re 0.83 0.85 0.84 5000 sn 0.91 0.93 0.92 5000 tn 0.94 0.89 0.91 2502 avg / total 0.88 0.88 0.88 46562 Reducing feature vectors. 
(211402, 20311) (46562, 20311) (211402, 20285) (46562, 20285) STARTING UNMASKING ROUND 61 fra cc cxg2 61 precision recall f1-score support be 0.83 0.65 0.73 1956 bf 0.94 0.92 0.93 2240 ch 0.80 0.84 0.82 5000 cm 0.99 0.99 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.85 0.89 0.87 5000 gd 0.88 0.83 0.86 1893 lu 0.89 0.91 0.90 5000 nc 0.91 0.87 0.89 3046 pf 0.89 0.89 0.89 5000 re 0.83 0.85 0.84 5000 sn 0.91 0.93 0.92 5000 tn 0.93 0.89 0.91 2502 avg / total 0.88 0.88 0.88 46562 Reducing feature vectors. (211402, 20285) (46562, 20285) (211402, 20261) (46562, 20261) STARTING UNMASKING ROUND 62 fra cc cxg2 62 precision recall f1-score support be 0.82 0.65 0.72 1956 bf 0.94 0.92 0.93 2240 ch 0.80 0.84 0.82 5000 cm 0.99 0.99 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.85 0.89 0.87 5000 gd 0.88 0.83 0.86 1893 lu 0.89 0.91 0.90 5000 nc 0.90 0.87 0.89 3046 pf 0.89 0.89 0.89 5000 re 0.83 0.85 0.84 5000 sn 0.91 0.93 0.92 5000 tn 0.93 0.89 0.91 2502 avg / total 0.88 0.88 0.88 46562 Reducing feature vectors. (211402, 20261) (46562, 20261) (211402, 20237) (46562, 20237) STARTING UNMASKING ROUND 63 fra cc cxg2 63 precision recall f1-score support be 0.82 0.65 0.72 1956 bf 0.94 0.92 0.93 2240 ch 0.80 0.84 0.82 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.85 0.89 0.87 5000 gd 0.88 0.83 0.86 1893 lu 0.89 0.91 0.90 5000 nc 0.91 0.87 0.89 3046 pf 0.89 0.89 0.89 5000 re 0.83 0.85 0.84 5000 sn 0.91 0.93 0.92 5000 tn 0.93 0.89 0.91 2502 avg / total 0.88 0.88 0.88 46562 Reducing feature vectors. (211402, 20237) (46562, 20237) (211402, 20211) (46562, 20211) STARTING UNMASKING ROUND 64 fra cc cxg2 64 precision recall f1-score support be 0.82 0.65 0.73 1956 bf 0.94 0.92 0.93 2240 ch 0.80 0.84 0.82 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.85 0.89 0.87 5000 gd 0.88 0.83 0.86 1893 lu 0.89 0.91 0.90 5000 nc 0.90 0.86 0.88 3046 pf 0.89 0.89 0.89 5000 re 0.83 0.85 0.84 5000 sn 0.91 0.93 0.92 5000 tn 0.93 0.89 0.91 2502 avg / total 0.88 0.88 0.88 46562 Reducing feature vectors. 
(211402, 20211) (46562, 20211) (211402, 20185) (46562, 20185) STARTING UNMASKING ROUND 65 fra cc cxg2 65 precision recall f1-score support be 0.82 0.65 0.72 1956 bf 0.94 0.92 0.93 2240 ch 0.80 0.84 0.82 5000 cm 0.99 0.99 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.85 0.89 0.87 5000 gd 0.88 0.83 0.86 1893 lu 0.89 0.91 0.90 5000 nc 0.90 0.86 0.88 3046 pf 0.89 0.89 0.89 5000 re 0.83 0.85 0.84 5000 sn 0.91 0.93 0.92 5000 tn 0.93 0.89 0.91 2502 avg / total 0.88 0.88 0.88 46562 Reducing feature vectors. (211402, 20185) (46562, 20185) (211402, 20161) (46562, 20161) STARTING UNMASKING ROUND 66 fra cc cxg2 66 precision recall f1-score support be 0.82 0.64 0.72 1956 bf 0.94 0.92 0.93 2240 ch 0.80 0.83 0.82 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.85 0.89 0.87 5000 gd 0.88 0.83 0.85 1893 lu 0.89 0.91 0.90 5000 nc 0.90 0.86 0.88 3046 pf 0.89 0.89 0.89 5000 re 0.83 0.84 0.84 5000 sn 0.91 0.93 0.92 5000 tn 0.93 0.88 0.91 2502 avg / total 0.88 0.88 0.88 46562 Reducing feature vectors. (211402, 20161) (46562, 20161) (211402, 20136) (46562, 20136) STARTING UNMASKING ROUND 67 fra cc cxg2 67 precision recall f1-score support be 0.82 0.64 0.72 1956 bf 0.94 0.92 0.93 2240 ch 0.80 0.83 0.81 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.85 0.89 0.87 5000 gd 0.89 0.83 0.86 1893 lu 0.88 0.91 0.90 5000 nc 0.90 0.86 0.88 3046 pf 0.89 0.89 0.89 5000 re 0.83 0.84 0.83 5000 sn 0.91 0.93 0.92 5000 tn 0.93 0.88 0.91 2502 avg / total 0.88 0.88 0.88 46562 Reducing feature vectors. (211402, 20136) (46562, 20136) (211402, 20111) (46562, 20111) STARTING UNMASKING ROUND 68 fra cc cxg2 68 precision recall f1-score support be 0.81 0.64 0.72 1956 bf 0.94 0.92 0.93 2240 ch 0.80 0.83 0.81 5000 cm 0.99 1.00 0.99 1996 dz 0.95 0.94 0.94 2929 fr 0.85 0.89 0.87 5000 gd 0.89 0.83 0.85 1893 lu 0.88 0.91 0.90 5000 nc 0.90 0.86 0.88 3046 pf 0.89 0.89 0.89 5000 re 0.82 0.84 0.83 5000 sn 0.91 0.93 0.92 5000 tn 0.93 0.88 0.91 2502 avg / total 0.88 0.88 0.88 46562 Reducing feature vectors. 
(211402, 20111) (46562, 20111)
(211402, 20085) (46562, 20085)

STARTING UNMASKING ROUND 69
fra cc cxg2 69

              precision    recall    f1-score    support
be                 0.82      0.65        0.72       1956
bf                 0.94      0.92        0.93       2240
ch                 0.80      0.83        0.81       5000
cm                 0.99      1.00        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.85      0.89        0.87       5000
gd                 0.89      0.83        0.86       1893
lu                 0.88      0.91        0.89       5000
nc                 0.90      0.86        0.88       3046
pf                 0.89      0.89        0.89       5000
re                 0.82      0.84        0.83       5000
sn                 0.91      0.93        0.92       5000
tn                 0.93      0.88        0.90       2502
avg / total        0.88      0.88        0.88      46562

Reducing feature vectors.
(211402, 20085) (46562, 20085)
(211402, 20059) (46562, 20059)

STARTING UNMASKING ROUND 70
fra cc cxg2 70

              precision    recall    f1-score    support
be                 0.81      0.65        0.72       1956
bf                 0.94      0.92        0.93       2240
ch                 0.80      0.83        0.81       5000
cm                 0.99      1.00        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.85      0.89        0.87       5000
gd                 0.89      0.83        0.86       1893
lu                 0.88      0.91        0.89       5000
nc                 0.90      0.86        0.88       3046
pf                 0.88      0.88        0.88       5000
re                 0.82      0.84        0.83       5000
sn                 0.91      0.92        0.92       5000
tn                 0.93      0.88        0.90       2502
avg / total        0.88      0.88        0.88      46562

Reducing feature vectors.
(211402, 20059) (46562, 20059)
(211402, 20034) (46562, 20034)

STARTING UNMASKING ROUND 71
fra cc cxg2 71

              precision    recall    f1-score    support
be                 0.82      0.64        0.72       1956
bf                 0.94      0.92        0.93       2240
ch                 0.79      0.83        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.85      0.89        0.87       5000
gd                 0.89      0.83        0.86       1893
lu                 0.88      0.91        0.89       5000
nc                 0.90      0.86        0.88       3046
pf                 0.88      0.88        0.88       5000
re                 0.82      0.84        0.83       5000
sn                 0.91      0.92        0.92       5000
tn                 0.93      0.88        0.90       2502
avg / total        0.88      0.88        0.88      46562

Reducing feature vectors.
(211402, 20034) (46562, 20034)
(211402, 20008) (46562, 20008)

STARTING UNMASKING ROUND 72
fra cc cxg2 72

              precision    recall    f1-score    support
be                 0.82      0.64        0.72       1956
bf                 0.94      0.92        0.93       2240
ch                 0.79      0.83        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.85      0.89        0.87       5000
gd                 0.89      0.83        0.86       1893
lu                 0.88      0.91        0.89       5000
nc                 0.90      0.86        0.88       3046
pf                 0.88      0.88        0.88       5000
re                 0.82      0.84        0.83       5000
sn                 0.91      0.92        0.92       5000
tn                 0.93      0.88        0.90       2502
avg / total        0.88      0.88        0.88      46562

Reducing feature vectors.
(211402, 20008) (46562, 20008)
(211402, 19982) (46562, 19982)

STARTING UNMASKING ROUND 73
fra cc cxg2 73

              precision    recall    f1-score    support
be                 0.81      0.64        0.72       1956
bf                 0.94      0.92        0.93       2240
ch                 0.79      0.83        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.85      0.89        0.87       5000
gd                 0.88      0.82        0.85       1893
lu                 0.88      0.91        0.89       5000
nc                 0.90      0.86        0.88       3046
pf                 0.88      0.88        0.88       5000
re                 0.82      0.84        0.83       5000
sn                 0.91      0.92        0.92       5000
tn                 0.93      0.88        0.90       2502
avg / total        0.88      0.88        0.88      46562

Reducing feature vectors.
(211402, 19982) (46562, 19982)
(211402, 19957) (46562, 19957)

STARTING UNMASKING ROUND 74
fra cc cxg2 74

              precision    recall    f1-score    support
be                 0.82      0.64        0.72       1956
bf                 0.94      0.92        0.93       2240
ch                 0.79      0.83        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.85      0.89        0.87       5000
gd                 0.88      0.82        0.85       1893
lu                 0.88      0.91        0.89       5000
nc                 0.89      0.85        0.87       3046
pf                 0.88      0.88        0.88       5000
re                 0.82      0.84        0.83       5000
sn                 0.91      0.92        0.92       5000
tn                 0.93      0.88        0.90       2502
avg / total        0.88      0.88        0.88      46562

Reducing feature vectors.
(211402, 19957) (46562, 19957)
(211402, 19932) (46562, 19932)

STARTING UNMASKING ROUND 75
fra cc cxg2 75

              precision    recall    f1-score    support
be                 0.82      0.64        0.72       1956
bf                 0.94      0.92        0.93       2240
ch                 0.79      0.83        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.84      0.89        0.87       5000
gd                 0.88      0.82        0.85       1893
lu                 0.88      0.91        0.89       5000
nc                 0.89      0.86        0.87       3046
pf                 0.88      0.88        0.88       5000
re                 0.82      0.84        0.83       5000
sn                 0.91      0.92        0.91       5000
tn                 0.92      0.88        0.90       2502
avg / total        0.88      0.87        0.87      46562

Reducing feature vectors.
(211402, 19932) (46562, 19932)
(211402, 19906) (46562, 19906)

STARTING UNMASKING ROUND 76
fra cc cxg2 76

              precision    recall    f1-score    support
be                 0.82      0.64        0.72       1956
bf                 0.93      0.92        0.93       2240
ch                 0.79      0.83        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.84      0.89        0.87       5000
gd                 0.89      0.82        0.85       1893
lu                 0.88      0.91        0.89       5000
nc                 0.89      0.86        0.87       3046
pf                 0.88      0.88        0.88       5000
re                 0.82      0.84        0.83       5000
sn                 0.90      0.92        0.91       5000
tn                 0.93      0.88        0.90       2502
avg / total        0.88      0.88        0.87      46562

Reducing feature vectors.
(211402, 19906) (46562, 19906)
(211402, 19882) (46562, 19882)

STARTING UNMASKING ROUND 77
fra cc cxg2 77

              precision    recall    f1-score    support
be                 0.82      0.64        0.72       1956
bf                 0.93      0.92        0.93       2240
ch                 0.79      0.82        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.85      0.89        0.87       5000
gd                 0.89      0.82        0.85       1893
lu                 0.88      0.90        0.89       5000
nc                 0.89      0.85        0.87       3046
pf                 0.89      0.88        0.88       5000
re                 0.82      0.84        0.83       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.88        0.90       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19882) (46562, 19882)
(211402, 19856) (46562, 19856)

STARTING UNMASKING ROUND 78
fra cc cxg2 78

              precision    recall    f1-score    support
be                 0.82      0.64        0.72       1956
bf                 0.93      0.92        0.93       2240
ch                 0.79      0.82        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.84      0.89        0.87       5000
gd                 0.89      0.82        0.85       1893
lu                 0.88      0.90        0.89       5000
nc                 0.89      0.85        0.87       3046
pf                 0.89      0.88        0.88       5000
re                 0.81      0.84        0.83       5000
sn                 0.90      0.92        0.91       5000
tn                 0.93      0.88        0.90       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19856) (46562, 19856)
(211402, 19830) (46562, 19830)

STARTING UNMASKING ROUND 79
fra cc cxg2 79

              precision    recall    f1-score    support
be                 0.82      0.64        0.72       1956
bf                 0.94      0.92        0.93       2240
ch                 0.79      0.82        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.94       2929
fr                 0.84      0.89        0.87       5000
gd                 0.88      0.82        0.85       1893
lu                 0.88      0.90        0.89       5000
nc                 0.89      0.85        0.87       3046
pf                 0.88      0.88        0.88       5000
re                 0.82      0.84        0.83       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.88        0.90       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19830) (46562, 19830)
(211402, 19804) (46562, 19804)

STARTING UNMASKING ROUND 80
fra cc cxg2 80

              precision    recall    f1-score    support
be                 0.82      0.64        0.72       1956
bf                 0.93      0.92        0.92       2240
ch                 0.79      0.82        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.84      0.89        0.87       5000
gd                 0.89      0.82        0.85       1893
lu                 0.88      0.90        0.89       5000
nc                 0.89      0.85        0.87       3046
pf                 0.88      0.88        0.88       5000
re                 0.82      0.84        0.83       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.87        0.90       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19804) (46562, 19804)
(211402, 19779) (46562, 19779)

STARTING UNMASKING ROUND 81
fra cc cxg2 81

              precision    recall    f1-score    support
be                 0.81      0.64        0.72       1956
bf                 0.94      0.92        0.93       2240
ch                 0.79      0.82        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.94       2929
fr                 0.84      0.89        0.86       5000
gd                 0.89      0.81        0.85       1893
lu                 0.88      0.90        0.89       5000
nc                 0.89      0.85        0.87       3046
pf                 0.88      0.88        0.88       5000
re                 0.82      0.83        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.87        0.90       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19779) (46562, 19779)
(211402, 19754) (46562, 19754)

STARTING UNMASKING ROUND 82
fra cc cxg2 82

              precision    recall    f1-score    support
be                 0.81      0.64        0.72       1956
bf                 0.93      0.92        0.93       2240
ch                 0.79      0.82        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.94       2929
fr                 0.84      0.89        0.86       5000
gd                 0.89      0.81        0.85       1893
lu                 0.88      0.90        0.89       5000
nc                 0.89      0.85        0.87       3046
pf                 0.88      0.88        0.88       5000
re                 0.81      0.83        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.88        0.90       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19754) (46562, 19754)
(211402, 19728) (46562, 19728)

STARTING UNMASKING ROUND 83
fra cc cxg2 83

              precision    recall    f1-score    support
be                 0.81      0.63        0.71       1956
bf                 0.93      0.92        0.92       2240
ch                 0.79      0.82        0.81       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.94       2929
fr                 0.84      0.89        0.86       5000
gd                 0.88      0.81        0.85       1893
lu                 0.88      0.90        0.89       5000
nc                 0.89      0.85        0.87       3046
pf                 0.88      0.88        0.88       5000
re                 0.81      0.83        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.87        0.90       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19728) (46562, 19728)
(211402, 19702) (46562, 19702)

STARTING UNMASKING ROUND 84
fra cc cxg2 84

              precision    recall    f1-score    support
be                 0.81      0.63        0.71       1956
bf                 0.93      0.92        0.92       2240
ch                 0.79      0.82        0.80       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.94       2929
fr                 0.84      0.89        0.86       5000
gd                 0.88      0.81        0.84       1893
lu                 0.88      0.90        0.89       5000
nc                 0.89      0.85        0.87       3046
pf                 0.88      0.88        0.88       5000
re                 0.82      0.83        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.87        0.90       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19702) (46562, 19702)
(211402, 19677) (46562, 19677)

STARTING UNMASKING ROUND 85
fra cc cxg2 85

              precision    recall    f1-score    support
be                 0.81      0.63        0.71       1956
bf                 0.93      0.91        0.92       2240
ch                 0.79      0.82        0.80       5000
cm                 0.99      1.00        0.99       1996
dz                 0.94      0.94        0.94       2929
fr                 0.84      0.89        0.86       5000
gd                 0.88      0.81        0.84       1893
lu                 0.88      0.90        0.89       5000
nc                 0.89      0.85        0.87       3046
pf                 0.88      0.88        0.88       5000
re                 0.81      0.83        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.87        0.90       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19677) (46562, 19677)
(211402, 19652) (46562, 19652)

STARTING UNMASKING ROUND 86
fra cc cxg2 86

              precision    recall    f1-score    support
be                 0.81      0.64        0.71       1956
bf                 0.93      0.91        0.92       2240
ch                 0.79      0.82        0.80       5000
cm                 0.99      1.00        0.99       1996
dz                 0.94      0.93        0.94       2929
fr                 0.84      0.89        0.86       5000
gd                 0.88      0.81        0.84       1893
lu                 0.88      0.90        0.89       5000
nc                 0.88      0.85        0.87       3046
pf                 0.88      0.88        0.88       5000
re                 0.81      0.83        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.87        0.90       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19652) (46562, 19652)
(211402, 19627) (46562, 19627)

STARTING UNMASKING ROUND 87
fra cc cxg2 87

              precision    recall    f1-score    support
be                 0.80      0.63        0.70       1956
bf                 0.93      0.92        0.92       2240
ch                 0.78      0.82        0.80       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.93       2929
fr                 0.84      0.89        0.86       5000
gd                 0.88      0.81        0.84       1893
lu                 0.86      0.89        0.88       5000
nc                 0.88      0.85        0.86       3046
pf                 0.88      0.88        0.88       5000
re                 0.81      0.83        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.87        0.89       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19627) (46562, 19627)
(211402, 19601) (46562, 19601)

STARTING UNMASKING ROUND 88
fra cc cxg2 88

              precision    recall    f1-score    support
be                 0.80      0.63        0.70       1956
bf                 0.93      0.91        0.92       2240
ch                 0.79      0.82        0.80       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.94       2929
fr                 0.84      0.89        0.86       5000
gd                 0.88      0.81        0.84       1893
lu                 0.87      0.89        0.88       5000
nc                 0.88      0.84        0.86       3046
pf                 0.88      0.88        0.88       5000
re                 0.81      0.83        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.87        0.89       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19601) (46562, 19601)
(211402, 19577) (46562, 19577)

STARTING UNMASKING ROUND 89
fra cc cxg2 89

              precision    recall    f1-score    support
be                 0.80      0.62        0.70       1956
bf                 0.93      0.91        0.92       2240
ch                 0.79      0.82        0.80       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.94       2929
fr                 0.84      0.88        0.86       5000
gd                 0.88      0.81        0.84       1893
lu                 0.86      0.89        0.88       5000
nc                 0.88      0.84        0.86       3046
pf                 0.87      0.88        0.88       5000
re                 0.81      0.83        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.86        0.89       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19577) (46562, 19577)
(211402, 19553) (46562, 19553)

STARTING UNMASKING ROUND 90
fra cc cxg2 90

              precision    recall    f1-score    support
be                 0.80      0.62        0.70       1956
bf                 0.93      0.91        0.92       2240
ch                 0.79      0.82        0.80       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.94       2929
fr                 0.84      0.88        0.86       5000
gd                 0.88      0.81        0.84       1893
lu                 0.86      0.89        0.88       5000
nc                 0.88      0.84        0.86       3046
pf                 0.87      0.88        0.87       5000
re                 0.81      0.83        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.86        0.89       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19553) (46562, 19553)
(211402, 19527) (46562, 19527)

STARTING UNMASKING ROUND 91
fra cc cxg2 91

              precision    recall    f1-score    support
be                 0.80      0.62        0.70       1956
bf                 0.94      0.91        0.92       2240
ch                 0.78      0.81        0.80       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.93       2929
fr                 0.84      0.89        0.86       5000
gd                 0.88      0.81        0.84       1893
lu                 0.86      0.89        0.88       5000
nc                 0.88      0.84        0.86       3046
pf                 0.87      0.88        0.87       5000
re                 0.81      0.83        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.86        0.89       2502
avg / total        0.87      0.87        0.87      46562

Reducing feature vectors.
(211402, 19527) (46562, 19527)
(211402, 19501) (46562, 19501)

STARTING UNMASKING ROUND 92
fra cc cxg2 92

              precision    recall    f1-score    support
be                 0.80      0.62        0.70       1956
bf                 0.93      0.91        0.92       2240
ch                 0.78      0.81        0.80       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.94       2929
fr                 0.84      0.88        0.86       5000
gd                 0.87      0.81        0.84       1893
lu                 0.86      0.89        0.88       5000
nc                 0.88      0.84        0.86       3046
pf                 0.87      0.87        0.87       5000
re                 0.81      0.82        0.82       5000
sn                 0.90      0.92        0.91       5000
tn                 0.92      0.86        0.89       2502
avg / total        0.86      0.86        0.86      46562

Reducing feature vectors.
(211402, 19501) (46562, 19501)
(211402, 19475) (46562, 19475)

STARTING UNMASKING ROUND 93
fra cc cxg2 93

              precision    recall    f1-score    support
be                 0.80      0.62        0.70       1956
bf                 0.94      0.91        0.92       2240
ch                 0.78      0.81        0.80       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.93       2929
fr                 0.83      0.88        0.86       5000
gd                 0.87      0.81        0.84       1893
lu                 0.86      0.89        0.88       5000
nc                 0.88      0.84        0.86       3046
pf                 0.87      0.87        0.87       5000
re                 0.81      0.82        0.82       5000
sn                 0.89      0.92        0.91       5000
tn                 0.92      0.87        0.89       2502
avg / total        0.86      0.86        0.86      46562

Reducing feature vectors.
(211402, 19475) (46562, 19475)
(211402, 19450) (46562, 19450)

STARTING UNMASKING ROUND 94
fra cc cxg2 94

              precision    recall    f1-score    support
be                 0.80      0.62        0.70       1956
bf                 0.93      0.91        0.92       2240
ch                 0.78      0.81        0.80       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.93       2929
fr                 0.83      0.88        0.86       5000
gd                 0.87      0.81        0.84       1893
lu                 0.86      0.89        0.88       5000
nc                 0.87      0.84        0.86       3046
pf                 0.87      0.88        0.87       5000
re                 0.81      0.82        0.82       5000
sn                 0.89      0.92        0.91       5000
tn                 0.91      0.86        0.89       2502
avg / total        0.86      0.86        0.86      46562

Reducing feature vectors.
(211402, 19450) (46562, 19450)
(211402, 19425) (46562, 19425)

STARTING UNMASKING ROUND 95
fra cc cxg2 95

              precision    recall    f1-score    support
be                 0.80      0.62        0.70       1956
bf                 0.93      0.91        0.92       2240
ch                 0.78      0.81        0.80       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.93       2929
fr                 0.83      0.88        0.86       5000
gd                 0.87      0.81        0.84       1893
lu                 0.86      0.89        0.88       5000
nc                 0.87      0.84        0.86       3046
pf                 0.87      0.87        0.87       5000
re                 0.81      0.82        0.81       5000
sn                 0.90      0.92        0.91       5000
tn                 0.91      0.86        0.89       2502
avg / total        0.86      0.86        0.86      46562

Reducing feature vectors.
(211402, 19425) (46562, 19425)
(211402, 19400) (46562, 19400)

STARTING UNMASKING ROUND 96
fra cc cxg2 96

              precision    recall    f1-score    support
be                 0.79      0.61        0.69       1956
bf                 0.93      0.91        0.92       2240
ch                 0.78      0.81        0.79       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.93       2929
fr                 0.83      0.88        0.86       5000
gd                 0.87      0.81        0.84       1893
lu                 0.86      0.89        0.88       5000
nc                 0.87      0.84        0.86       3046
pf                 0.87      0.87        0.87       5000
re                 0.81      0.82        0.81       5000
sn                 0.90      0.92        0.91       5000
tn                 0.91      0.86        0.89       2502
avg / total        0.86      0.86        0.86      46562

Reducing feature vectors.
(211402, 19400) (46562, 19400)
(211402, 19374) (46562, 19374)

STARTING UNMASKING ROUND 97
fra cc cxg2 97

              precision    recall    f1-score    support
be                 0.80      0.62        0.69       1956
bf                 0.93      0.91        0.92       2240
ch                 0.78      0.81        0.79       5000
cm                 0.99      0.99        0.99       1996
dz                 0.93      0.93        0.93       2929
fr                 0.83      0.88        0.86       5000
gd                 0.88      0.81        0.84       1893
lu                 0.86      0.89        0.87       5000
nc                 0.87      0.84        0.85       3046
pf                 0.87      0.87        0.87       5000
re                 0.81      0.82        0.81       5000
sn                 0.89      0.92        0.91       5000
tn                 0.91      0.85        0.88       2502
avg / total        0.86      0.86        0.86      46562

Reducing feature vectors.
(211402, 19374) (46562, 19374)
(211402, 19348) (46562, 19348)

STARTING UNMASKING ROUND 98
fra cc cxg2 98

              precision    recall    f1-score    support
be                 0.80      0.62        0.69       1956
bf                 0.93      0.91        0.92       2240
ch                 0.78      0.81        0.79       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.93       2929
fr                 0.83      0.88        0.86       5000
gd                 0.88      0.81        0.84       1893
lu                 0.86      0.89        0.87       5000
nc                 0.87      0.84        0.86       3046
pf                 0.87      0.87        0.87       5000
re                 0.80      0.82        0.81       5000
sn                 0.89      0.92        0.91       5000
tn                 0.91      0.85        0.88       2502
avg / total        0.86      0.86        0.86      46562

Reducing feature vectors.
(211402, 19348) (46562, 19348)
(211402, 19323) (46562, 19323)

STARTING UNMASKING ROUND 99
fra cc cxg2 99

              precision    recall    f1-score    support
be                 0.80      0.61        0.69       1956
bf                 0.94      0.91        0.92       2240
ch                 0.78      0.81        0.79       5000
cm                 0.99      0.99        0.99       1996
dz                 0.94      0.93        0.93       2929
fr                 0.83      0.88        0.86       5000
gd                 0.88      0.81        0.84       1893
lu                 0.86      0.89        0.87       5000
nc                 0.87      0.84        0.86       3046
pf                 0.87      0.87        0.87       5000
re                 0.80      0.82        0.81       5000
sn                 0.90      0.92        0.91       5000
tn                 0.91      0.85        0.88       2502
avg / total        0.86      0.86        0.86      46562

Reducing feature vectors.
(211402, 19323) (46562, 19323)
(211402, 19299) (46562, 19299)

STARTING UNMASKING ROUND 100
fra cc cxg2 100

              precision    recall    f1-score    support
be                 0.80      0.62        0.69       1956
bf                 0.94      0.91        0.92       2240
ch                 0.77      0.81        0.79       5000
cm                 0.99      0.99        0.99       1996
dz                 0.93      0.93        0.93       2929
fr                 0.83      0.88        0.86       5000
gd                 0.87      0.81        0.84       1893
lu                 0.86      0.89        0.87       5000
nc                 0.87      0.84        0.86       3046
pf                 0.87      0.87        0.87       5000
re                 0.80      0.81        0.81       5000
sn                 0.89      0.92        0.90       5000
tn                 0.91      0.85        0.88       2502
avg / total        0.86      0.86        0.86      46562

Reducing feature vectors.
(211402, 19299) (46562, 19299) (211402, 19274) (46562, 19274)
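For reference, the loop that would produce a log of this shape can be sketched as below. This is a minimal reconstruction, not the original script: it assumes the classifier is scikit-learn's `LinearSVC` (matching the logged hyperparameters `{'C': 0.01, 'loss': 'squared_hinge'}`) and assumes the pruning rule removes, per class, the most positive and most negative feature each round. That rule is inferred from the shapes in the log, where roughly 24-26 of the ~21k features disappear per round with 13 classes (duplicate indices collapse); the function name `unmask` and all variable names are illustrative.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import classification_report


def unmask(train_x, train_y, test_x, test_y, rounds=100, C=0.01):
    """Repeatedly train, report, and prune the strongest features.

    Assumed reconstruction of the unmasking loop behind the log above.
    """
    for r in range(1, rounds + 1):
        clf = LinearSVC(C=C, loss="squared_hinge")
        clf.fit(train_x, train_y)
        print(f"STARTING UNMASKING ROUND {r}")
        print(classification_report(test_y, clf.predict(test_x)))

        # Assumed pruning rule: for each class, drop the feature with the
        # largest positive weight and the one with the largest negative
        # weight. Duplicates collapse, so <= 2 * n_classes features go.
        drop = set()
        for row in np.atleast_2d(clf.coef_):
            drop.add(int(np.argmax(row)))
            drop.add(int(np.argmin(row)))
        keep = [i for i in range(train_x.shape[1]) if i not in drop]

        print("Reducing feature vectors.")
        print(train_x.shape, test_x.shape)
        train_x, test_x = train_x[:, keep], test_x[:, keep]
        print(train_x.shape, test_x.shape)
    return train_x, test_x
```

The quantity of interest in unmasking is the *rate* at which accuracy degrades as the strongest features are stripped away, which is why the log records a full classification report per round rather than only the final one.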