Theoretical Computer Science (IF 0.747), Pub Date: 2020-08-27, DOI: 10.1016/j.tcs.2020.08.025. Arnab Bhattacharyya; Ameet Gadekar; Ninad Rajgopal
We consider the problem of learning k-parities in the online mistake-bound model: given a hidden vector x ∈ {0,1}^n whose Hamming weight is k, and a sequence of "questions" a_1, a_2, … ∈ {0,1}^n, where the algorithm must reply to each question a_i with ⟨a_i, x⟩ (mod 2), what is the best trade-off between the number of mistakes made by the algorithm and its time complexity? We improve the previous best result of Buhrman et al. by an exp(k) factor in the time complexity.
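To illustrate the mistake/time trade-off at its naive extreme, here is a sketch (in Python, with names and parameter values of our own choosing; this is a textbook illustration of the model, not the paper's algorithm) of the classical halving algorithm: maintain all C(n, k) candidate parities, answer each question by majority vote, and discard candidates inconsistent with the revealed answer. It makes at most log2 C(n, k) = O(k log n) mistakes but spends roughly C(n, k) time per question.

```python
import itertools
import random

def parity(subset, a):
    # ⟨a, x⟩ mod 2, where x is the 0/1 indicator vector of `subset`
    return sum(a[i] for i in subset) % 2

def halving_learner(n, k, questions, oracle):
    """Halving algorithm over all C(n, k) candidate k-parities.
    Mistake bound: each mistake at least halves the candidate set,
    so at most log2 C(n, k) = O(k log n) mistakes are made; the cost
    is ~C(n, k) work per question (the naive end of the trade-off)."""
    candidates = set(itertools.combinations(range(n), k))
    mistakes = 0
    for a in questions:
        votes = sum(parity(s, a) for s in candidates)
        guess = 1 if 2 * votes > len(candidates) else 0  # majority vote
        truth = oracle(a)
        if guess != truth:
            mistakes += 1
        # keep only candidates consistent with the revealed answer
        candidates = {s for s in candidates if parity(s, a) == truth}
    return mistakes, candidates

# Small illustrative run (noiseless oracle for a hidden 2-parity)
random.seed(0)
n, k = 8, 2
hidden = (1, 5)
questions = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(30)]
mistakes, cands = halving_learner(n, k, questions, lambda a: parity(hidden, a))
```

Since C(8, 2) = 28, the run above makes at most floor(log2 28) = 4 mistakes, and the hidden parity always survives the filtering.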
Next, we consider the problem of learning k-parities in the PAC model in the presence of random classification noise of rate η ∈ (0, 1/2). Here, we observe that even in the presence of classification noise of non-trivial rate, it is possible to learn k-parities in time better than n^{k/2}, whereas the current best algorithm for learning noisy k-parities, due to Grigorescu et al., inherently requires time n^{k/2} even when the noise rate is polynomially small.
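For intuition about where such exponents come from, the following sketch (our own illustrative baseline, not the algorithm of Grigorescu et al. nor of this paper; the sample sizes and noise rate are assumptions) shows the simplest noise-tolerant approach: exhaustively pick the k-subset with minimal empirical error on labeled samples. This handles noise but costs C(n, k) · m ≈ n^{Θ(k)} time, the kind of exponent the results above improve on.

```python
import itertools
import random

def parity(subset, a):
    # ⟨a, x⟩ mod 2, where x is the 0/1 indicator vector of `subset`
    return sum(a[i] for i in subset) % 2

def learn_noisy_parity(samples, n, k):
    """Exhaustive search: return the k-subset minimizing empirical error
    on (a, b) samples.  With noise rate η < 1/2 and enough samples, the
    true parity disagrees with labels at rate ~η while every other
    k-parity disagrees at rate ~1/2, so the minimizer is correct w.h.p.
    Cost: C(n, k) * len(samples), i.e. n^{Θ(k)} time."""
    best, best_err = None, float("inf")
    for s in itertools.combinations(range(n), k):
        err = sum(parity(s, a) != b for a, b in samples)
        if err < best_err:
            best, best_err = s, err
    return best

# Illustrative run: recover a hidden 2-parity under 10% classification noise
random.seed(1)
n, k, eta = 10, 2, 0.1
hidden = (2, 7)
samples = []
for _ in range(400):
    a = tuple(random.randint(0, 1) for _ in range(n))
    b = (parity(hidden, a) + (random.random() < eta)) % 2  # noisy label
    samples.append((a, b))
recovered = learn_noisy_parity(samples, n, k)
```

With 400 samples the hidden parity disagrees with the labels about 40 times, while any other 2-subset disagrees about 200 times, so the exhaustive minimizer recovers it.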