Talk:Probably approximately correct learning


>One concept is the set of all patterns of bits in X = {0,1}^n that encode a picture of the letter "P".

Shouldn't this be a concept class per the definitions in the article? 2001:6B0:1:1DF0:64E7:CCAC:65E3:5DEC (talk) 13:09, 28 October 2013 (UTC)

The description in the article is correct. While a concept is a subset of X, a concept class is a subset of the power set 2^X, that is, a family of sets over X. A concept class may contain more than one concept, but a concept itself also may consist of more than one instance.
-- 62.117.14.26 (talk) 21:30, 14 February 2014 (UTC)
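
A minimal toy sketch of the distinction, using an illustrative instance space of 2-bit strings (none of the names below are taken from the article):

```python
# Toy example: the instance space X consists of all 2-bit strings.
X = {"00", "01", "10", "11"}

# A concept is a single subset of X; this one contains two instances.
c = {"01", "11"}

# A concept class is a family of subsets of X, i.e. a subset of the power set of X.
C = [{"01", "11"}, {"00"}, set()]

print(c <= X)                               # True: c is a subset of X
print(all(concept <= X for concept in C))   # True: each member of C is a concept
```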

Improvement opportunities

  1. The examples stop after "concept" for the interval problem, and after "concept class" for the character recognition problem.
  2. It would be interesting to discuss the history of PAC learning, its extensions, and its applications, e.g. when the framework was used to explain empirical observations and when it was used to analyze and/or derive novel learning techniques.

--zeno (talk) 15:59, 29 March 2019 (UTC)

It would also be really informative to see a demonstration of how the lower bound on sample size varies as a function of error, confidence, and hypothesis complexity. Valiant's original paper has some elementary and fairly concrete examples using simple Boolean conjunctions.[1] Benjamin Schulz (talk) 17:36, 12 April 2020 (UTC)
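
As a minimal sketch of such a demonstration: assuming the standard sufficient-sample-size bound for a consistent learner over a finite hypothesis class H, namely m >= (ln|H| + ln(1/delta)) / eps, and using |H| = 3^n, the usual count for Boolean conjunctions over n variables, the dependence on error, confidence, and hypothesis complexity can be tabulated in a few lines of Python (all function names below are illustrative):

```python
import math

def sufficient_sample_size(eps: float, delta: float, hypothesis_count: int) -> int:
    """Examples sufficient for a consistent learner over a finite class H to be
    eps-accurate with probability at least 1 - delta:
    m >= (ln|H| + ln(1/delta)) / eps."""
    return math.ceil((math.log(hypothesis_count) + math.log(1.0 / delta)) / eps)

def conjunction_class_size(n: int) -> int:
    """Number of Boolean conjunctions over n variables: each variable appears
    positively, negated, or not at all, giving 3**n hypotheses."""
    return 3 ** n

if __name__ == "__main__":
    n = 20
    for eps in (0.1, 0.05, 0.01):
        for delta in (0.1, 0.01):
            m = sufficient_sample_size(eps, delta, conjunction_class_size(n))
            print(f"eps={eps:<5} delta={delta:<5} n={n}  ->  m >= {m}")
```

Plotting m against eps for a few values of delta and n would make the roughly 1/eps dependence on error and the logarithmic dependence on 1/delta and |H| visible at a glance.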

References

  1. ^ L. G. Valiant. "A theory of the learnable". Communications of the ACM, 27(11):1134–1142, 1984.

Question: What is a PAC setting?


A training set is ε-representative if, for every hypothesis, the absolute difference between its empirical loss on the training set and its true loss over the underlying distribution is at most ε. How can the term "PAC setting" be defined using the previous information? Inlin (talk) 18:09, 22 July 2023 (UTC)
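
For reference, a minimal formalization of the notion described above, following the standard uniform-convergence treatment (L_S and L_D denote the empirical loss on the training set S and the true loss under the distribution D; this notation is not from the article):

```latex
% S is \varepsilon-representative (w.r.t. hypothesis class \mathcal{H}, the loss, and \mathcal{D}):
\[ \forall h \in \mathcal{H}: \quad \bigl| L_S(h) - L_{\mathcal{D}}(h) \bigr| \le \varepsilon . \]
% Standard consequence connecting this to the (agnostic) PAC guarantee:
% if S is (\varepsilon/2)-representative, then any empirical risk minimizer h_S satisfies
\[ L_{\mathcal{D}}(h_S) \le \min_{h \in \mathcal{H}} L_{\mathcal{D}}(h) + \varepsilon . \]
```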