E:\!CORPORA\CxG-Background-Corpus\!Frontiers>python classify_unmask.py
Starting cxg2 and cc (28000, 22628) 28000

STARTING UNMASKING ROUND 1 (eng cc cxg2 1)

              precision   recall   f1-score   support
au               0.97      0.96      0.97       5000
ca               0.94      0.94      0.94       5000
ch               0.97      0.94      0.96       2688
gb               0.95      0.95      0.95       5000
ie               0.97      0.97      0.97       5000
in               0.97      0.98      0.97       5000
my               0.96      0.96      0.96       5000
ng               0.98      0.98      0.98       5000
nz               0.91      0.92      0.91       5000
ph               0.98      0.97      0.98       5000
pk               1.00      0.99      0.99       5000
pt               0.99      0.98      0.98       3788
us               0.93      0.95      0.94       5000
za               0.94      0.96      0.95       5000
avg / total      0.96      0.96      0.96      66476

Reducing feature vectors: (322587, 22628) (66476, 22628) -> (322587, 22609) (66476, 22609)

Rounds 2 through 112 repeat the same train / report / reduce cycle; the sketch below shows the general shape of one round, and the table that follows summarizes every round.
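Each round trains a classifier on the surviving features, prints a classification report over the 14 regional varieties, and then removes the most discriminative features before the next round (the dimensionality drops by roughly 20-30 features per round in the log). The following is a minimal sketch of such an unmasking loop, not the original classify_unmask.py: it assumes a LinearSVC over sparse feature matrices and removes the top two weighted features per class each round, and all names are illustrative.

```python
# Hedged sketch of an unmasking loop (illustrative, not the original script).
import numpy as np
from sklearn.svm import LinearSVC                 # assumed classifier
from sklearn.metrics import classification_report

def unmask(X_train, y_train, X_test, y_test, rounds=112, per_class=2):
    keep = np.arange(X_train.shape[1])            # surviving feature indices
    for r in range(1, rounds + 1):
        print("STARTING UNMASKING ROUND", r)
        clf = LinearSVC().fit(X_train[:, keep], y_train)
        print(classification_report(y_test, clf.predict(X_test[:, keep])))
        print("Reducing feature vectors.")
        # Drop the most heavily weighted features for each class, deduplicated,
        # which is consistent with the ~20-30 features removed per round above.
        top = np.unique(np.argsort(np.abs(clf.coef_), axis=1)[:, -per_class:])
        keep = np.delete(keep, top)
        print(X_train[:, keep].shape, X_test[:, keep].shape)
    return keep
```

In the actual run, each per-class report is followed by the shrinking shapes of the training matrix (322,587 documents) and test matrix (66,476 documents), as summarized in the table below.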
Per-round summary (weighted averages over the 66,476 test documents; "features" is the dimensionality used in that round):

round   features   precision   recall   f1-score
  1      22628       0.96       0.96      0.96
  2      22609       0.96       0.96      0.96
  3      22591       0.95       0.95      0.95
  4      22568       0.95       0.95      0.95
  5      22544       0.95       0.95      0.95
  6      22519       0.94       0.94      0.94
  7      22494       0.94       0.94      0.94
  8      22469       0.94       0.93      0.94
  9      22449       0.93       0.93      0.93
 10      22422       0.93       0.93      0.93
 11      22396       0.93       0.93      0.93
 12      22369       0.93       0.93      0.93
 13      22345       0.93       0.93      0.93
 14      22317       0.93       0.93      0.93
 15      22289       0.93       0.93      0.93
 16      22266       0.92       0.92      0.92
 17      22238       0.92       0.92      0.92
 18      22212       0.92       0.92      0.92
 19      22185       0.92       0.92      0.92
 20      22161       0.92       0.92      0.92
 21      22133       0.92       0.92      0.92
 22      22107       0.92       0.92      0.92
 23      22083       0.92       0.92      0.92
 24      22059       0.92       0.92      0.92
 25      22032       0.91       0.91      0.91
 26      22006       0.91       0.91      0.91
 27      21979       0.91       0.91      0.91
 28      21952       0.91       0.91      0.91
 29      21928       0.91       0.91      0.91
 30      21903       0.91       0.91      0.91
 31      21877       0.91       0.91      0.91
 32      21850       0.91       0.91      0.91
 33      21824       0.91       0.91      0.91
 34      21797       0.91       0.91      0.91
 35      21769       0.91       0.91      0.91
 36      21744       0.91       0.91      0.91
 37      21717       0.91       0.91      0.91
 38      21690       0.91       0.91      0.91
 39      21662       0.90       0.90      0.90
 40      21638       0.90       0.90      0.90
 41      21611       0.90       0.90      0.90
 42      21585       0.90       0.90      0.90
 43      21560       0.90       0.90      0.90
 44      21536       0.90       0.90      0.90
 45      21508       0.90       0.90      0.90
 46      21484       0.90       0.90      0.90
 47      21456       0.90       0.90      0.90
 48      21428       0.90       0.90      0.90
 49      21401       0.90       0.90      0.90
 50      21376       0.89       0.89      0.89
 51      21351       0.89       0.89      0.89
 52      21324       0.89       0.89      0.89
 53      21297       0.89       0.89      0.89
 54      21270       0.89       0.89      0.89
 55      21243       0.89       0.89      0.89
 56      21221       0.89       0.89      0.89
 57      21197       0.89       0.89      0.89
 58      21175       0.89       0.89      0.89
 59      21148       0.89       0.89      0.89
 60      21120       0.89       0.89      0.89
 61      21093       0.88       0.88      0.88
 62      21065       0.88       0.88      0.88
 63      21038       0.88       0.88      0.88
 64      21010       0.88       0.88      0.88
 65      20982       0.88       0.88      0.88
 66      20955       0.88       0.88      0.88
 67      20928       0.88       0.88      0.88
 68      20901       0.88       0.88      0.88
 69      20873       0.88       0.88      0.88
 70      20845       0.88       0.88      0.88
 71      20817       0.88       0.88      0.88
 72      20790       0.88       0.88      0.88
 73      20763       0.88       0.88      0.88
 74      20735       0.87       0.87      0.87
 75      20707       0.87       0.87      0.87
 76      20680       0.87       0.87      0.87
 77      20653       0.87       0.87      0.87
 78      20625       0.87       0.87      0.87
 79      20599       0.87       0.87      0.87
 80      20571       0.87       0.87      0.87
 81      20544       0.87       0.87      0.87
 82      20518       0.87       0.87      0.87
 83      20491       0.87       0.87      0.87
 84      20466       0.87       0.87      0.87
 85      20440       0.87       0.87      0.87
 86      20412       0.87       0.87      0.87
 87      20384       0.87       0.87      0.87
 88      20358       0.87       0.87      0.87
 89      20331       0.86       0.86      0.86
 90      20303       0.86       0.86      0.86
 91      20277       0.86       0.86      0.86
 92      20249       0.86       0.86      0.86
 93      20221       0.86       0.86      0.86
 94      20195       0.86       0.86      0.86
 95      20167       0.86       0.86      0.86
 96      20140       0.86       0.86      0.86
 97      20114       0.86       0.86      0.86
 98      20089       0.86       0.86      0.86
 99      20063       0.86       0.86      0.86
100      20035       0.86       0.86      0.86
101      20007       0.86       0.86      0.86
102      19979       0.86       0.86      0.86
103      19951       0.86       0.86      0.86
104      19923       0.86       0.86      0.86
105      19895       0.86       0.86      0.86
106      19868       0.86       0.86      0.86
107      19844       0.86       0.86      0.86
108      19818       0.86       0.86      0.86
109      19791       0.86       0.86      0.85
110      19765       0.85       0.85      0.85
111      19737       0.85       0.85      0.85
112      19709       0.85       0.85      0.85

Full per-class report for the final round, STARTING UNMASKING ROUND 112 (eng cc cxg2 112):

              precision   recall   f1-score   support
au               0.90      0.89      0.89       5000
ca               0.85      0.85      0.85       5000
ch               0.89      0.84      0.86       2688
gb               0.81      0.80      0.80       5000
ie               0.85      0.86      0.86       5000
in               0.84      0.84      0.84       5000
my               0.83      0.83      0.83       5000
ng               0.91      0.93      0.92       5000
nz               0.74      0.71      0.72       5000
ph               0.87      0.86      0.87       5000
pk               0.96      0.97      0.97       5000
pt               0.91      0.87      0.89       3788
us               0.83      0.86      0.85       5000
za               0.78      0.82      0.80       5000
avg / total      0.85      0.85      0.85      66476

Reducing feature vectors: (322587, 19709) (66476, 19709) -> (322587, 19681) (66476, 19681)
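The weighted averages in the table trace the unmasking curve: how quickly accuracy degrades as the strongest features are stripped away, here from 0.96 to 0.85 F1 over 112 rounds and nearly 3,000 removed features. A minimal plotting sketch is given below; rounds, avg_f1, and n_feats are assumed to have been read from the table above, and only the first three rows are shown as placeholders.

```python
# Hedged sketch: plot the unmasking curve from the per-round summary above.
# The three values per list are placeholders from the first table rows;
# a real run would load all 112 rows.
import matplotlib.pyplot as plt

rounds = [1, 2, 3]                 # ... through 112
avg_f1 = [0.96, 0.96, 0.95]        # weighted-average f1-score per round
n_feats = [22628, 22609, 22591]    # features available in that round

fig, ax1 = plt.subplots()
ax1.plot(rounds, avg_f1, marker="o")
ax1.set_xlabel("unmasking round")
ax1.set_ylabel("weighted avg F1")
ax2 = ax1.twinx()                  # second axis for the shrinking feature count
ax2.plot(rounds, n_feats, linestyle="--", color="gray")
ax2.set_ylabel("surviving features")
fig.suptitle("Unmasking curve: classification accuracy vs. removed features")
fig.tight_layout()
plt.show()
```

The slow decay visible in the table (about 0.11 F1 lost over 112 rounds) suggests that the regional signal is spread across many features rather than carried by a small set of dominant ones.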